AI Has Made Voice Cloning Dead Easy, Ushering in a New Wave of Scams
We examined reports of voice cloning scams, their effectiveness in defrauding victims, and the challenges they pose for cybersecurity.
A mother's worst nightmare began with a phone call. Jennifer DeStefano thought she was hearing her daughter's voice begging for help. But the desperate plea was fake, created by scammers using AI voice cloning technology to duplicate a person's voice from just minutes of audio samples. DeStefano told the U.S. Senate her harrowing story in April 2023.
DeStefano's case was not an isolated incident. Federal regulators are sounding the alarm about artificial intelligence cloning human voices with startling accuracy and enabling a new wave of phone scams that have already targeted millions of people. Drata examined news reports and other sources to explore real cases of voice cloning used in scams, why these scams work so effectively, and the challenges they present for cybersecurity.
"Until very recently, whenever you heard speech, the only place where it could have possibly come from is another person's mouth. You're always willing to trust it," Visar Berisha, associate dean of research and commercialization at Arizona State University's School of Engineering, said during an August 2024 roundtable discussion about AI voice cloning. "But that's changing now."
The technology is alarmingly accessible. For as little as $5 a month, anyone can create convincing voice replicas from just 20 seconds of audio pulled from social media or YouTube videos. The same technology that allows James Earl Jones' iconic Darth Vader voice to live on in Disney's "Obi-Wan Kenobi" TV series is being exploited by criminals to impersonate loved ones, compromise bank accounts, and orchestrate emergency scams.
While Jones signed off on the use of his iconic, archival voice, the legal landscape around AI voice cloning remains complex. Recent high-profile cases, including disputes between celebrities and tech companies over unauthorized voice replication, have highlighted the need for clearer regulations.
Voice Cloning Is the New Frontier of Cybersecurity Threats
As these technologies become more accessible through everyday interactions with voice-driven tools such as Siri, Alexa, and Google Assistant, the scale of the threat is becoming more apparent.
More than a quarter of adults surveyed in the U.K. reported being targeted by AI voice scams in the past year, according to CNN. More troubling, 46% of those surveyed weren't aware that such scams existed, and 8% said they would send money to someone they believed was a friend or family member even if the call seemed suspicious. According to research company Market.US, voice cloning tools can replicate a voice with up to 99% accuracy, and a separate McAfee survey found that 77% of AI voice scam victims reported losing money.
"Some irresponsible companies let people create anyone's voice within seconds and with such good quality that it would trick people," Alex Serdiuk, CEO of Respeecher, told Marketplace. Respeecher is a Ukrainian AI company specializing in legitimate voice cloning applications—Respeecher worked on cloning Jones' Darth Vader voice.
The Federal Trade Commission launched a challenge in November 2023 to combat AI-enabled voice cloning, warning that while private companies are racing to develop AI capabilities, technology to prevent potential harm isn't keeping pace. The agency noted that the risks posed by voice cloning require more than technological solutions.
Berisha, the associate dean at ASU whose team won the FTC challenge, warned that voice cloning threats go beyond simple scams. Bad actors could use the technology to disrupt elections, for example by creating fake robocalls from officials claiming polling places are closed, or to interfere with air traffic control communications.
Traditional defensive measures are struggling to keep up. While companies are developing AI systems to detect fake voices, Berisha said it's becoming a losing battle. "There's this cat and mouse game where the good guys are now developing detectors, but they're going to be outpaced very quickly because the number of bad guys working on different types of technology is so great."
Traditional voice verification systems rely on biometric markers: unique characteristics in a person's voice pattern that, like a fingerprint, were once considered reliable identifiers. But AI's ability to replicate these patterns has made those safeguards vulnerable.
The Future of Voice Cloning and How to Protect Yourself
The threat has prompted some financial institutions to develop new safety protocols. Starling Bank recommended families establish a "safe phrase" (a simple code word that can verify a caller's identity). St. Louis Bank also suggested other strategies, including not answering unknown numbers and letting them go to voicemail, verifying information by calling back known numbers, and being suspicious of requests for untraceable payment methods like wire transfers, gift cards, or cryptocurrency.
But experts warn those measures may not be enough as the technology rapidly advances. The United States and European Union are racing to regulate the technology, while the United Kingdom has taken a more hands-off approach, according to Marketplace. Action Fraud, Britain's national reporting center for cybercrime, documented multiple cases of voice cloning scams in early 2024, including fraudsters impersonating celebrities to promote fake investment schemes.
"People regularly post content online which has recordings of their voice, without ever imagining it's making them more vulnerable to fraudsters," Lisa Grahame, chief information security officer at Starling Bank, said in a September 2024 press release.
Some companies are developing innovative solutions. Berisha's team created OriginStory, a new type of microphone that verifies a human speaker is producing recorded speech and then watermarks it as authentically human. They're continuing to develop the technology, hoping eventually to bring it to market.
Until protective technologies like this become widespread, experts advise people to be vigilant about sharing voice recordings online. Meanwhile, the technology's growth shows no signs of slowing: Market.US projects the global AI voice cloning market will grow from $2.1 billion in 2023 to $25.6 billion by 2033.