The Rising Threat of AI Voice Hijacking in Cybersecurity
May 22, 2025

As generative AI tools grow more sophisticated, they’re giving rise to a dangerous new breed of scams called AI voice hijacking. According to an article by Help Net Security, with just three seconds of recorded speech, criminals can convincingly clone a person’s voice. These samples are often pulled from public social media posts or videos, making virtually anyone a potential target. What once seemed like a niche threat is now a growing risk not only for individuals but also for businesses, with potentially devastating financial and reputational consequences.
The article cites one example involving a mother who nearly fell victim to a virtual kidnapping scam when a cloned voice mimicked her daughter’s desperate cries for help. While such attacks have traditionally targeted the elderly, scammers are increasingly turning their attention to the corporate world. Business operations are particularly vulnerable when security protocols are lax and employees aren’t adequately trained to detect sophisticated fraud tactics.
The article quotes Visa’s North America Chief Risk Officer, Mike Lemberger, who warns that these scams exploit gaps in authentication and employee awareness, creating urgent, high-pressure scenarios convincing enough to manipulate staff into transferring funds or divulging sensitive information.
The financial sector is already feeling the effects. In one case, scammers impersonating Italy’s defense minister used AI voice deepfakes to defraud business leaders, prompting police to freeze nearly €1 million. Projections by Deloitte suggest that AI-enabled fraud losses in the U.S. could reach $40 billion by 2027, a figure that underscores how fast generative AI is outpacing legacy security measures like voice biometrics.
To counter this growing threat, the article suggests that cybersecurity professionals push beyond traditional voice authentication. Integrating multi-factor authentication, AI-based fraud detection, and behavioral biometrics is now essential. Just as importantly, employee education must become a front-line defense. In an era when cloned voices can imitate real people with alarming realism, vigilance, layered defenses, and continuous training are critical to staying ahead of fraudsters.
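To make the "layered defenses" idea concrete, the minimal sketch below shows, in purely illustrative Python, how a payment workflow might refuse to treat a voice call as sufficient authentication on its own. The class, field names, dollar threshold, and callback step are hypothetical assumptions chosen for illustration; they are not drawn from the article, Visa, or any specific product.

```python
from dataclasses import dataclass

# Hypothetical illustration only: a toy policy check showing how layered
# verification might gate a voice-initiated payment request. Thresholds,
# field names, and the callback step are assumptions, not a real product
# or vendor API.

@dataclass
class PaymentRequest:
    amount_usd: float
    channel: str               # e.g. "voice_call", "email", "portal"
    requester_verified_mfa: bool
    callback_confirmed: bool   # confirmed via a known, out-of-band number on file


def requires_manual_review(req: PaymentRequest, threshold_usd: float = 10_000) -> bool:
    """Return True if the request should be held for human review.

    A voice call alone is never treated as sufficient authentication:
    voice-initiated requests must pass an out-of-band callback, and
    large requests must also clear multi-factor authentication.
    """
    if req.channel == "voice_call" and not req.callback_confirmed:
        return True
    if req.amount_usd >= threshold_usd and not req.requester_verified_mfa:
        return True
    return False


if __name__ == "__main__":
    urgent_call = PaymentRequest(
        amount_usd=250_000,
        channel="voice_call",
        requester_verified_mfa=False,
        callback_confirmed=False,
    )
    print(requires_manual_review(urgent_call))  # True: hold the transfer and verify out of band
```

The design point is simply that no single signal, least of all a familiar-sounding voice, should authorize a transfer by itself; real implementations would layer additional controls such as behavioral biometrics and fraud scoring.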