AI voice scams are on the rise — here’s how to protect yourself

Scammers have always adapted with the times, but AI voice cloning is pushing things to a new and seriously unsettling level. With just a few seconds of audio, criminals can now create convincing replicas of someone's voice, then use it to con people out of money, personal info, or worse.
And the scary part? These scams are getting harder to spot. In fact, my own mother-in-law was a victim. A scammer convinced her that my husband had been kidnapped and that the only way to free him was to wire thousands of dollars to their account.
Of course, he was safely at work, but they played what sounded like his voice: "Mom, just pay them. Please." She stayed on the phone, driving from ATM to ATM taking out cash. Luckily, I was at the gym at the time and my father-in-law, who was much more skeptical, got hold of me. I assured them that my husband (their son) was safe and that this was a scam.
We were all shaken.
How AI voice scams actually work
Thanks to AI-powered voice synthesis, scammers don’t need much to mimic someone’s voice. A short video clip posted on social media, a voicemail or a podcast can be enough to feed a cloning model.
Once they’ve got the voice, scammers use it to make phone calls that sound like a real friend or family member asking for help, usually in some kind of urgent crisis.
In one recent report, researchers from the University of Wisconsin–Madison found that people could be tricked into believing a loved one had been kidnapped based solely on a fake voice.
Other scammers take it further, posing as celebrities or authority figures to pressure victims into clicking malicious links or sending over financial details.
These scams are happening regularly
This isn't just theoretical. The FBI recently warned that AI "vishing" (voice phishing) attacks are being used to target public officials.
In one case, a voice made to sound like Senator Marco Rubio reportedly reached out to foreign ministers and U.S. contacts to dig for sensitive information.
Even secure messaging apps like Signal aren’t immune. That’s part of what makes these scams so dangerous: they’re low-effort, high-impact and tough to detect in the moment.
In my family's case, my mother-in-law was unaware of these scams and would have done anything to make sure her son was safe. I believe we all would for a family member, and that's exactly what these scammers prey on.
What to watch out for
The best defense starts with a healthy dose of skepticism. If you get a call that feels off, even if it sounds like someone you know, take a second to pause. Don’t respond based on fear or urgency.
This is easier said than done, but try to stay calm. Knowing that these scams are out there is half the battle.
The FBI recommends always verifying strange or urgent messages by contacting the person through a trusted method, like a known number or text thread. If the scammer orders you not to hang up, text your loved one or even get to a computer and contact them on social media.
And, listen closely: even advanced AI voices can sound just a little flat or too perfect. Awkward pauses, robotic intonation, or mismatched background noise can be subtle giveaways.
How to protect yourself and your family
You don’t need to be a cybersecurity expert to stay safe. A few simple habits can make a big difference:
Pause and verify. If you get a panicked call asking for money, don’t act immediately. Hang up and reach out to the person directly through a method you’ve used before.
Create a family code word. Having a shared phrase only your family knows can help confirm someone’s identity in an emergency.
Watch for manipulation. These scams are designed to make you feel scared, guilty or rushed. If you feel those emotions flaring up, it’s a red flag.
Limit public audio when possible. Know that anything you post online could potentially be used to clone your voice. If you're a frequent speaker or content creator, it's worth being especially mindful of what you share.
What’s being done
On the research side, new tools are emerging to fight back. One example is ASRJam, a system that disrupts AI call bots using subtle audio interference that humans can’t detect but machines struggle to process. It’s early days, but promising.
Law enforcement is also encouraging victims to report scams to the FBI’s Internet Crime Complaint Center (IC3), which helps track patterns and issue alerts.
AI voice scams are real, and they're becoming creepier and more advanced. But with a little awareness and a few good habits, you can avoid getting duped.