Wave of AI Voice Scams Washes Over the US

Imagine getting a phone call: your child is on the line, crying for help, while a voice demands a ransom. It is a frightening scenario, but in a new breed of scams rattling the US, the child’s voice is generated by AI and the abduction is fake.

In a world where AI is gradually blurring the line between reality and fiction, cybercriminals have a cheap and effective technology to misuse: AI voice cloning tools are widely available online and are being used to steal from victims by impersonating their family members.

According to Tech Xplore, in a recent case a mother from Arizona received a scam distress call and heard an uncanny imitation of her daughter’s voice in an apparent hostage situation. The ruse fell apart quickly once the mother reached her daughter, but the case showed authorities how dangerous the phenomenon can be.

Wasim Khaled, chief executive of Blackbird.AI, told AFP: “AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively.”

A wide array of apps, many of them free, can create an AI voice clone from a very short sample, which is easily lifted from content posted online.

In a global survey of 7,000 people across nine countries conducted by McAfee Labs, one in four respondents said they had experienced an AI voice cloning scam or knew someone who had, and seventy percent said they were not confident they could tell a cloned voice from the real one.

“Because it is now easy to generate highly realistic voice clones… nearly anyone with any online presence is vulnerable to an attack,” Hany Farid, a professor at the UC Berkeley School of Information, told AFP. “These scams are gaining traction and spreading.”

Information provided by Tech Xplore.