AI voice scams are on the rise — here’s how to protect yourself

Image: A woman looking at her phone with a shocked and scared expression

Scammers have always adapted with the times, but AI voice cloning is pushing things to a new and seriously unsettling level. With just a few seconds of audio, criminals can now create convincing replicas of someone’s voice, then use them to con people out of money, personal info, or worse.

And the scary part? These scams are getting harder to spot. In fact, my own mother-in-law was a victim. A scammer convinced her that my husband had been kidnapped and that the only way to free him was to wire thousands of dollars to their account.

Of course, he was safely at work, but they played what sounded like his voice saying, “Mom, just pay them. Please.” And she stayed on the phone, driving around to various ATMs and taking out cash. Luckily, my father-in-law, who was much more skeptical, got a hold of me while I was at the gym, and I assured them that my husband (their son) was safe and that the call was a scam.
