
# AI Voice Clones Used to Extort Families in Virtual Kidnapping Scams

*Criminals harvest audio from social media to create deepfake voices of loved ones, demanding ransom while victims hear their relatives screaming*
A New Jersey woman received a terrifying call from her sister's phone number in early 2026. On the line was a man's voice threatening to kill her sister while a woman cried in the background. The caller claimed to have been recently released from jail and demanded money to get home. The real shock came when the woman checked her sister's location via Find My iPhone: the phone pinged from her sister's apartment, suggesting she was safe. Even so, the call was convincing enough that she sent a payment before the scammer abruptly disconnected.
This was no ordinary extortion attempt. The woman's sister was the victim of an AI-assisted "virtual kidnapping" scam, a crime that has exploded across North America since late 2025.
## How the Scam Works
The mechanics are straightforward but deeply unsettling. Criminals harvest audio clips from social media platforms—Instagram, Snapchat, Facebook Reels, and YouTube videos—capturing victims' voices in everyday situations. This audio is then processed through AI tools like FraudGPT to generate realistic voice clones. Scammers script emotional performances: crying, screaming, pleading for help. When they call family members, the cloned voices sound nearly identical to the real thing.
In one documented case, a mother received a call from her college-aged daughter's number. The AI voice claimed the daughter had been in a car accident after rear-ending someone, and in the background the cloned voice cried out that she was being taken. The scammer demanded a ransom, eventually negotiating down to $50,000. The mother verified her daughter was safe before making any payment and filed a police report, adding to the growing record of these schemes.
## Expanding Geographic Reach
King County, Washington, has emerged as a hotspot. Multiple cases were reported within the Highline School District, with scammers deliberately targeting non-English speakers. The criminals demand payment through untraceable methods, primarily cryptocurrency, knowing that such transactions are nearly impossible to reverse.