Voice Password Failure

AI-powered voice cloning technology has advanced to the point where scammers can now easily create convincing fake audio of someone's voice. In a proof-of-concept video, Joseph Cox from Vice demonstrates how he was able to access his bank account using an AI-generated clone of his own voice.

On the heels of yet another way people can be scammed by AI comes this incredible proof-of-concept video from Joseph Cox over at Vice. He demonstrates, on camera, how he was able to access his bank account with a synthetic copy of his voice. What once sounded like a far-off security threat may be closer than we all think.

Rather than speak out loud, I clicked a file on my nearby laptop to play a sound clip: “check my balance,” my voice said. But this wasn't actually my voice. It was a synthetic clone I had made using readily available artificial intelligence technology.

“Okay,” the bank replied. It then asked me to enter or say my date of birth as the first piece of authentication. After typing that in, the bank said “please say, ‘my voice is my password.’” 

Again, I played a sound file from my computer. “My voice is my password,” the voice said. The bank's security system spent a few seconds authenticating the voice. 

“Thank you,” the bank said. I was in.

Vice

We are in a weird place with "AI" right now. In many ways, it is simply a next-word-prediction system. It isn't an all-knowing oracle or anything of the sort. It's uncited bits of the internet regurgitated back at us in a way that sounds human. But... the things these AI-powered systems can create do have their uses.
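That "next-word prediction" framing isn't just a figure of speech. As a rough illustration (not something from the Vice piece), here's a minimal sketch of what these models actually do under the hood, assuming the Hugging Face `transformers` and `torch` packages and the small open GPT-2 model; the prompt is purely illustrative:

```python
# Minimal sketch: a language model is, at its core, a next-token probability machine.
# Assumes `transformers` and `torch` are installed; GPT-2 and the prompt are illustrative choices.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "My voice is my"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: [batch, sequence_length, vocab_size]

# The model's entire output for this step: a probability distribution over the next token.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(token_id)):>12}  {prob.item():.3f}")
```

Run it and you get a ranked list of likely continuations. Chain that prediction step over and over and you get paragraphs that sound human, which is both the whole trick and the whole problem.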

Personally, I've used ChatGPT to help build some scripts I wanted to run on my computer. I've used it to help generate ideas for various trips I want to take. Many of the images I've used in the weekly newsletter have been generated by DALL-E. So there is a place for these systems. What Google and Microsoft are doing, though, feels reckless given where the technology currently stands. And it's the AI-powered scams we should be on the lookout for, because if people and automated systems can be fooled by a simple voice clone, telling fact from fiction is sure to get a hell of a lot more difficult.