It’s a beautiful day outside. The sun is shining, birds are chirping, and you don’t have a care in the world as you laze around at home. Suddenly, the phone rings. Who could be calling at this hour? You pick up, put the phone to your ear, and answer. On the other end, a voice comes through, and you immediately know something is wrong. It’s a lawyer, calling because your son has been in an accident and killed a U.S. diplomat. Now, of course, you know this must be some kind of scam. After all, your son would tell you first thing if something of this caliber happened. But then someone else comes on after the lawyer, and he sounds… just like your son! He sounds panicked, and explains everything. After that, how could this be a scam? Your son was on the other end; the words came in his voice… right? A little while later, the lawyer calls back with the specific amount of money needed to free your son, and of course, you pay it.
And just like that, the parents of Benjamin Perkin were scammed out of $21,000 in “legal fees.” Perkin told The Washington Post that the voice was “close enough for my parents to truly believe they did speak with me.” His parents had sent this stranger $21,000 in bitcoin, and realized they’d been scammed only when Benjamin called to check on them later that day. This family isn’t the only victim of this crime. According to the Royal Newfoundland Constabulary, at least eight senior citizens in Newfoundland, Canada, fell victim to the same scam over a three-day period between Feb. 28 and March 2.
How could this be? Their son’s voice was on the phone; how could scammers have gotten his voice saying those specific things at that exact moment? The answer is simple: AI. We’ve all seen those videos of former presidents and politicians playing Minecraft or rating fast-food restaurants, made possible by the power of AI. Now, cybercriminals are taking that very concept and turning it into scenarios like the one Perkin’s family faced. For something as complicated as replicating someone’s voice, you would think a lot of effort and money would be required. The problem is that cloning a voice is actually cheap and easy to do.
“You can clone someone’s voice, and given the ability to do that, it’s not at all surprising that somebody would do that for nefarious purposes,” says Jonathan Anderson, an associate professor at Memorial University with a focus in cybersecurity. “It’s going to be more effective, especially while people get used to the fact that deepfake voices are a thing, and they are easily obtainable and easily accessible.”
And so, as we enter this new age of genuinely useful public AI technology, we must learn to be careful. According to the FTC, if you answer a phone call from an unknown number, let the caller speak first. Whoever is on the other end could be recording snippets of your voice and later using them to impersonate you in a very convincing manner. All it takes is three seconds of audio fed into voice-cloning software such as VALL-E (which isn’t yet open to the public, though many replicas of it exist) or VoiceLab by ElevenLabs. “If you made a TikTok video with your voice on it, that’s enough,” says Hany Farid, a digital forensics professor at UC Berkeley.
And so, make sure you guys stay safe. If something smells fishy, don’t hesitate to hang up. If money gets involved, especially large sums in unusual forms such as crypto or specific bills, put down the phone and block the number. Senior citizens are especially susceptible to these kinds of scams, so look out for your grandparents too!