Artificial intelligence has been in the news a lot this year, much like cryptocurrency was a year ago. And just as some crypto companies did, criminals are now using A.I. to scam people out of their money. Many victims find that out when they receive a phone call they think is from a loved one. Guess what? It’s not.

Here’s how the scam goes down. The phone rings. On the other end of the line is a close relative in dire need of help. They ask you to send money fast because the situation is serious. The voice sounds just like theirs, and in many cases, victims send money to an unfamiliar Venmo or bank account that actually belongs to a criminal.

All it takes to clone a voice is about 30 seconds of audio. If you have a video on Facebook or TikTok with half a minute of your voice in it, boom, it can be cloned and used by crooks.
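To see just how low the bar is, here is a minimal sketch of what voice cloning looks like with an off-the-shelf tool. This example assumes the open-source Coqui TTS library and its XTTS v2 model; the reference clip and the spoken text are hypothetical placeholders, not anything from a real case.

# Minimal voice-cloning sketch using the open-source Coqui TTS library.
# Assumptions: the library is installed (pip install TTS) and
# reference_clip.wav is a hypothetical ~30-second recording of the target voice.
from TTS.api import TTS

# Load a multilingual voice-cloning model (XTTS v2).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Synthesize arbitrary speech in the cloned voice and save it to a file.
tts.tts_to_file(
    text="Hey, it's me. Can you call me back right away?",
    speaker_wav="reference_clip.wav",  # short sample of the voice to clone
    language="en",
    file_path="cloned_output.wav",
)

That’s the whole thing: a few lines of code and a half-minute clip scraped from social media.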

The Washington Post ran a story about a Canadian family scammed by A.I. voice cloning. The elderly parents received a call from a “lawyer” claiming their son had killed an American diplomat in a car accident, was in jail, and needed cash for attorneys. Here’s how the crooks nailed their victims: they made it seem like they handed the phone over to the “son,” who told his parents he loved them, appreciated them, and needed money. The parents fell for it and sent over $15,000 through a Bitcoin terminal.

Two months ago, Microsoft showed off a text-to-speech A.I. model called VALL-E that needs just a three-second audio sample to clone someone’s voice. The technology is only getting better, and more dangerous.

Things are moving fast in this space. Just two years ago, cloning a voice required several minutes of clean, uninterrupted audio.
