It's every parent's worst nightmare. You get a call from an unknown number and on the other end of the line hear your child crying out for help. Then their "kidnapper" comes on the line demanding a ransom, or you will never see your son or daughter again. Unfortunately, this is not an imagined scenario from a Hollywood film.
Instead, it's a terrifying example of the lengths that scammers will now go to in order to extort money from their victims, co-opting new technology for nefarious purposes. It also shows that AI voice cloning technology is now convincing enough to trick even close family members. Fortunately, the more people know about these schemes and what to look out for, the less likely phone-based fraudsters are to make any money.
There are several key stages to a typical virtual kidnapping scam. Broadly speaking, they are as follows:
If you fall for the scam, you will most likely be asked to pay in a non-traceable way, such as cryptocurrency.
There are variations on this theme. Most concerning is the potential for ChatGPT and other AI tools to supercharge virtual kidnapping by making it easier for fraudsters to find the ideal victims. Advertisers and marketers have for years been using "propensity modelling" techniques to get the right messages to the right people at the right time.
Generative AI (GenAI) could help scammers to do the same, by searching for those individuals most likely to pay up if exposed to a virtual kidnapping scam. They could also search for people within a specific geographical area, with public social media profiles and of a specific socio-economic background.
A second option would be to use a SIM swapping attack on the "kidnappee" to hijack their phone number ahead of the scam. This would add an unnerving legitimacy to the kidnapping phone call. Whereas DeStefano was eventually able to ascertain that her daughter was safe and well, and therefore hang up on her extortionists, this would be much harder to do if the victim's relative is uncontactable.
Unfortunately, voice cloning technology is already worryingly convincing, as our recent experiment proves. And it is increasingly accessible to scammers. An intelligence report from May warned of legitimate text-to-speech tools that could be abused, and of a growing interest on the cybercrime underground in voice cloning-as-a-service (VCaaS). If the latter takes off, it could democratize the ability to launch such attacks across the cybercrime economy, especially if used in combination with GenAI tools.
In fact, besides disinformation, deepfake technology is also being used for business email compromise and sextortion. We are only at the start of a long journey.
The good news is that a little knowledge can go a long way toward defusing the threat of deepfakes in general and virtual kidnapping in particular. There are things you can do today to minimize the chances of being selected as a victim and of falling for a scam call if one does occur.
Consider these high-level tips:
Virtual kidnapping is just the start. But stay up to date with the latest scams and you stand a good chance of nipping attacks in the bud before they cause serious emotional distress.