AI ‘Family Emergency Scam’

Artificial Intelligence Becoming a Serious Threat to Reality

By Andi Hale

AI voice cloning scams have increased as the technology has advanced. In one elaborate scheme, scammers use artificial intelligence to clone a person's voice, then use the clone to trick loved ones into sending money to cover a supposed emergency. The disturbing trend is adding to mounting fraud losses: according to the Federal Trade Commission, Americans lost nearly $9 billion to fraud last year alone, an increase of over 150% in just two years.

Fraudsters carrying out voice cloning scams will record a person's voice or find an audio clip on social media or elsewhere on the internet. As little as three seconds of audio can be enough, and ten seconds is even better for producing a very realistic clone. The sample is then run through an AI program that replicates the voice, allowing the scammer to make it say whatever they type and even to add laughter, fear, and other emotions, depending on how the scam is scripted.

The scammer's goal is to trigger a fight-or-flight response and create urgency by convincing you that a loved one is in some sort of trouble. To protect against voice cloning scams, experts recommend that families adopt a "code word" system and always call the person back to verify the authenticity of the call. Additionally, they advise setting social media accounts to private, since publicly available information can easily be used against individuals.
