I Regret To Inform You That You Might Need To Share a Safe Word With Your Family
AI is no longer some sci-fi, Big Brother-type stuff. Now, scammers can clone “your” voice and use it in phone calls to trick your loved ones into handing over thousands of dollars by convincing them that you’re in danger.
Scammers take audio clips of your voice and feed them into programs that replicate how you sound, letting them make “you” say anything they wish.
I recently saw a video on TikTok of a woman explaining how her 82-year-old grandmother received a call and could hear “her” in the background, “distraught, screaming, as if I had been kidnapped.” Luckily, her grandmother called the police, who told her to ring her granddaughter back; she did, and everything was thankfully fine.
Meanwhile, the scammers are hoping that their victims will panic, not call anyone, and instead hand over thousands to ensure the safety of their family and friends. A report by NBC News shows just how believable and accurate an AI clone of someone’s voice can be.
This is why sharing a safe word with those close to you is a good idea: it lets you confirm whether a call is genuine. These scams come amid a rise in “deepfake” videos, in which people put celebrities’ faces (usually women’s; take, for instance, Emma Watson) onto the bodies of porn performers, or fake footage of Barack Obama and other public figures.
The Federal Trade Commission has issued a warning about the scams: “Don’t trust the voice. Call the person who supposedly contacted you and verify the story. Use a phone number you know is theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friends.”
If a caller asks you to provide cash, cryptocurrency, or card numbers in a situation like this, it’s highly likely that the call is fraudulent.
(featured image: SteveLuker/Getty Images)
Have a tip we should know? [email protected]