Sound of Deception: AI-Driven Voice Cloning Unleashes New Wave of Scams in the US

by Steven Alber

As the old saying goes, "seeing is believing," but in the age of artificial intelligence, that wisdom may be due for an update. "Hearing is believing" now seems more apt, as AI brings a new threat into reality: voice cloning scams. It's the latest trick in the cybercriminal's playbook, producing a terrifying blend of reality and fiction that is shaking the United States.

Consider the tale of Jennifer DeStefano, an Arizona mother who received a frantic phone call from her 'daughter.' The voice was eerily convincing, sending chills down her spine as it cried out, "Help me, mom, please help me." But this voice wasn't her child's; it was an artificially cloned voice mimicking her daughter's distinct speech patterns and tone. The sobbing plea was followed by a scammer's demand for a hefty $1 million ransom.

The boundaries of reality began to blur for DeStefano, who was utterly convinced that her daughter was in imminent danger. Thankfully, the scam unraveled once she was able to reach her daughter directly, but this horrifying incident marks the beginning of a new era of scams driven by AI technology.

AI voice cloning, thanks to its increasingly flawless imitation of human speech, has become a powerful tool for unscrupulous individuals. The internet offers a plethora of apps, most of them free, that can create an AI voice from just a few seconds of a real person's speech, audio that can easily be lifted from content posted online.

These deceptive digital clones aren't just used for voicemails and voice texts; they can be employed as live voice changers during phone calls, shifting into different accents and genders, or even imitating the speech patterns of your loved ones. According to a global survey, 25% of respondents have experienced, or know someone who has fallen victim to, such AI voice cloning scams.

The 'grandparent scam' is increasingly common. In this case, scammers impersonate a grandchild in a distressing situation, often claiming to be in jail due to a car accident, desperately needing money. Elderly individuals, more vulnerable to such scams, often fall prey and are swindled out of their savings. The rise in these scams, made terrifyingly convincing by voice cloning, has led to heightened warnings from the US Federal Trade Commission.

Even high-profile figures aren't safe from this technology's misuse. A deepfake audio clip of actor Emma Watson reading Adolf Hitler's manifesto "Mein Kampf" was widely shared online, leading to criticism of AI startup ElevenLabs, the creator of the voice cloning tool used in the clip.

The advancements in AI are bringing us to a tipping point. As Gal Tal-Hochberg, Group Chief Technology Officer at Team8, puts it, "We’re fast approaching the point where you can’t trust the things that you see on the internet." It's clear that we need new technologies to help verify our digital interactions. Until then, a critical ear and an awareness of this rapidly evolving threat are our best defenses against the sound of deception.
