The voice as a padlock for personal data

MONTREAL — Though we have heard more lately about its pitfalls and potential dangers, artificial intelligence can also help protect individuals’ data.

Some financial institutions use speaker recognition, which means they authenticate customers to phone services based on the unique characteristics of their voice.

Desjardins implemented this technology in June 2021. Since then, 1.5 million members who previously consented to the process have been identified by voice in six million calls.

“The traditional authentication process was described as irritating by our members when they called,” says Annie-Claude Jutras, Senior Director, Transformation of Customer Relations Centres at Desjardins. “At the beginning of each call, it took several tens of seconds to complete all the steps. With voice authentication, it is faster, more natural and more pleasant.”

This method is also more secure, says Ms. Jutras, because Desjardins uses a kind of passive authentication, where the computer system recognizes the voice during a simple conversation, based on about a hundred parameters. “We model the voice based on these parameters, which give us a unique key in the form of an alphanumeric code for each member,” she explains. “It’s like a fingerprint.”
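To make the idea concrete, here is a minimal sketch in Python of how passive speaker verification can work in principle: the caller’s voice is reduced to a vector of numerical parameters and compared with the voiceprint recorded at enrolment. The feature vectors, the verify_speaker helper and the 0.85 threshold are illustrative assumptions, not Desjardins’ actual parameters or code.

# Illustrative sketch only: compares a caller's voice features to a stored
# voiceprint. Names, values and threshold are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_speaker(call_features: np.ndarray,
                   enrolled_voiceprint: np.ndarray,
                   threshold: float = 0.85) -> bool:
    """Accept the caller if their features are close enough to the enrolled voiceprint."""
    return cosine_similarity(call_features, enrolled_voiceprint) >= threshold

# Toy example: 100-dimensional vectors stand in for the roughly one hundred
# voice parameters mentioned in the article (real systems derive them from audio).
rng = np.random.default_rng(0)
enrolled = rng.normal(size=100)                            # stored at enrolment, with consent
same_caller = enrolled + rng.normal(scale=0.1, size=100)   # small natural variation in the voice
impostor = rng.normal(size=100)                            # a different voice entirely

print(verify_speaker(same_caller, enrolled))  # True: accepted
print(verify_speaker(impostor, enrolled))     # False: rejected

Because the comparison tolerates small variations rather than requiring an exact match, the same member is recognized from call to call even though no two utterances are identical, which is what allows authentication to happen passively during ordinary conversation.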

This way, the customer does not have to say a passphrase to be identified, a type of authentication that scammers have already exploited by recording someone’s voice without their knowledge and then impersonating them to gain access to personal information.

“This type of authentication is done in an automated system,” Ms. Jutras clarifies. “Training a synthetic voice or using an artificial intelligence engine to calibrate the voice perfectly is much more difficult in the context of a fluid discussion where there is an exchange of questions and answers between the member and the Desjardins employee.”

Laurent Charlin, core academic member at Mila, associate professor at HEC Montréal and holder of a Canada CIFAR AI Chair, agrees.

“We humans can clearly recognize the voice of a conversation partner, but the system can be trained to distinguish very specific intonations,” he says.

“If some financial institutions have adopted this system, they must have ensured that it is better than the technology they were previously using,” he adds.

Faking a voice isn’t as easy as you think

According to Patrick Cardinal, professor and director of the software engineering department at the École de technologie supérieure (ETS), technology does not yet make it possible to use a simple speech sample to simulate an impromptu conversation in someone else’s voice.

“The fact that authentication occurs during a conversation introduces uncertainty about the questions and the upcoming exchange, making it difficult to predict the answers. Also, getting something reliable would require a large amount of data to train the artificial intelligence, and most people don’t have that much data online,” he comments, referring to voice excerpts available on the Internet.

The situation is different for television, radio or cinema professionals, whose speech excerpts abound on the web.

“We’ve already seen videos of actors or politicians saying things they never said, thanks to deepfakes,” notes Laurent Charlin. “There are also applications that, from a small snippet, manage to generate your voice and make you say whatever they want.”

When in doubt, it is always possible to fall back on traditional methods, such as identification questions, Ms. Jutras points out.

For its part, the Canadian Anti-Fraud Centre has received no reports of artificial intelligence-related fraud involving voice cloning used specifically to access a victim’s bank accounts. “Nonetheless, the CAFC will continue to monitor this closely,” Jeff Horncastle, Acting Client Outreach and Communications Officer, said via email.

———

This report was produced with financial support from the Meta Exchange and The Canadian Press for News.
