Vishing, or “voice phishing”, is a type of phishing attack that involves defrauding people over the phone by enticing them to divulge sensitive information. The goal is often to gain access to the victim’s bank account.
Generative AI tools can create image, video and voice replicas of real people saying and doing things they never actually said or did. And these tools are becoming increasingly easy to access and use. This can fuel intimate image abuse (including things like “revenge porn”) and disrupt democratic processes.
This is a common story: Your grandfather receives a call late at night from a person pretending to be you. The caller says that you are in jail or have been kidnapped and that they need money urgently to get you out of trouble. Perhaps they then bring on a fake police officer or kidnapper to heighten the tension. The money, of course, should be wired right away to an unfamiliar account at an unfamiliar bank.
Now, scammers are reportedly experimenting with a way to further heighten that panic by playing a simulated recording of “your” voice.
Here is another report of the same type of story. (‘Mom, these bad men have me’: She believes scammers cloned her daughter’s voice in a fake kidnapping)
Melanie Standiford of Midwest Media by Melanie shared this personal story.
As a child my mom thought of unique ways to protect my sisters and me. I grew up in Wellfleet, Nebraska, which for all intents and purposes then – and now – should be one of the safest (and most boring) places on earth to live. And while that may be true, my mom was prepared. We could not randomly walk around town. After all, Highway 83 runs from Canada to Mexico in one way or another, and she always said if we were kidnapped we could be halfway to Texas before anyone missed us. And that was in the '80s, mind you.
And there was the lake. No swimming at the lake in Wellfleet, where the water is so murky you can’t see your hand in front of your face. We felt overprotected and bored. Now, all these years later, my mom is gone, but I still feel her love.
It was boring, and that was before the internet. Fast-forward to 2024, add in the internet, the high rate of human trafficking, and the scammers – what a mess.
I have a scary story to share with you from right here in Nebraska that happened this week.
One of my followers had to quickly decide how to respond when she got a call in which her own daughter’s voice was used.
Brace yourself – this is not only a sophisticated artificial intelligence scam, it is TERRIFYING!
An AI scam call used the daughter’s voice to tell her mother she had been hurt in an accident.
Then a man came on the phone claiming to be a police officer. He told the mom he was with the daughter, and that the daughter had been hit by a car and was injured. He asked about other medical issues and said they were at “Axes and Aces,” the daughter’s workplace, lending further credibility to the call.
During the call, the mom says she could “hear” her daughter crying and saying, “I just want my mom! Please get my mom!”
Then the man told the mom he was a member of a cartel in Nebraska and that they had her daughter. He demanded money.
The mom went to the first place she could stop and borrowed a phone to call 911 and a sheriff’s deputy came to her.
The mom goes on to explain that the deputy was very helpful, and they found her daughter at home, safely sleeping. But the fact that the caller knew the familial connection, knew where the daughter worked, and had a recording of her daughter’s voice is very concerning.
The caller’s number was traced to Lexington and this has been reported to the FBI.
In another instance, several months ago, a good friend of mine messaged me asking to borrow money (which was out of character). I immediately knew it was a scam – until she FaceTimed me. It was her face talking TO ME asking to help her out. We talked to each other! Yet, it was NOT her. It was AI using her likeness.
Folks, the future is now. The fear of this is real. I’m not trying to incite hysteria, but conversations need to happen. Families need code words. Agree on a secret word with your kids or loved ones, something only you know, so you can tell whether it’s really them. Have conversations that will protect you and those you love. These scams are off-the-charts sophisticated, and it is only going to get worse.
It’s sad we can’t randomly walk around Wellfleet, or trust a stranger on the phone. But we can’t.
Scam Watch
Knowledge is the first defense in these cases.

Deepfake technology is now advanced enough to mimic regional accents and dialects of English. Those targeted are most often upper-middle-class, retired English speakers.

It is always wise to stay alert and make sure you are not hoodwinked by scammers under any circumstances.

The FBI shared some tips on how to avoid getting scammed:
- Don’t post information about upcoming trips on social media – it gives the scammers a window to target your family. “If you’re up in the air, your mom can’t call to make sure you’re fine,” Johnson said.
- If you get such a call, buy yourself extra time to make a plan and alert law enforcement. “Write a note to someone else in the house and let them know what’s going on. Call someone,” Johnson said.
- If you’re in the middle of a virtual kidnapping and there’s someone else in the house, ask them to call 911 and urge the dispatcher to notify the FBI.
- Be wary of providing financial information to strangers over the phone. Virtual kidnappers often demand a ransom via wire transfer service, cryptocurrency or gift cards.
- Most importantly, don’t trust the voice you hear on the call. If you can’t reach a loved one, have a family member, friend or someone else in the room try to contact them for you.
- Create a family password. If someone calls and says they’ve kidnapped your child, you can tell them to ask the child for the password.