Voice Cloning Scams
A disturbing trend is on the rise in the US: scammers use AI-generated voice recordings to fake kidnappings and demand ransom. Recent advances in artificial intelligence have made it easier than ever for scammers to replicate a person's voice, leaving victims and authorities scrambling.
How It Works
Scammers use AI to create convincing audio recordings of family members, then demand ransom for their supposed safe return. They employ two primary methods:
- Recording a target's voice during calls placed from unknown numbers
- Extracting audio from videos posted publicly on social media
Real-Life Nightmare
Arizona mother Jennifer DeStefano shared her harrowing experience with Congress. Scammers used AI to convince her that her daughter, Briana, had been kidnapped. DeStefano answered a call from an unknown number and heard her daughter sobbing on the line. A threatening man demanded a $1 million ransom while Briana's voice pleaded in the background. In reality, Briana was safe at home, unaware of the scam.
Expert Insights
Cybersecurity expert Beenu Arora warns, “The intent is to get the right data through your voice… and this is becoming a lot more prominent now.” As AI technology evolves, identifying real versus fake becomes increasingly challenging.
Protect Yourself
The National Institutes of Health (NIH) advises:
- Be cautious of calls demanding ransom from unfamiliar numbers
- Verify any claims by contacting the supposed victim directly through a known number
Take Action
If you believe you’ve fallen victim to these scams, contact your local police department immediately.
Stay Vigilant
As AI development accelerates, staying informed and cautious is crucial. When faced with an alarming call or message, stop and think before acting.