
The most powerful people on the planet don’t quite know what to make of AI as it quickly becomes one of the most significant new technologies in history.
But criminals sure do.
In the six months since OpenAI first unleashed ChatGPT on the masses and ignited an artificial intelligence arms race with the potential to reshape history, a new strain of cybercriminals has been among the first to cash in.
These next-gen bandits come armed with sophisticated new tools and techniques to steal hundreds of thousands of dollars from people like you and me.
“I am seeing a highly concerning rise in criminals using advanced technology – AI-generated deepfakes and cloned voices – to perpetrate very devious schemes that are almost impossible to detect,” Haywood Talcove, CEO of the Government Group at LexisNexis Risk Solutions, a multinational information and analytics company based in Atlanta, told me over Zoom.
“If you get a call in the middle of the night and it sounds exactly like your panicked child or grandchild saying, ‘help, I was in a car accident, the police found drugs in the car, and I need money to post bail (or for a retainer for a lawyer),’ it’s a scam,” Talcove explained.
Earlier this year, law enforcement officials in Canada said one man used AI-generated voices, likely cloned from social media profiles, to con at least eight senior citizens out of $200,000 in just three days.
Similar scams preying on parents and grandparents are also popping up in nearly every state in America. This month, several Oregon school districts warned parents about a spate of fake kidnapping calls.
The calls come in from an unknown caller ID (though even cell phone numbers are easy to spoof these days). A voice comes on that sounds exactly like your loved one saying they’re in trouble. Then they get cut off, you hear a scream, and another voice comes on the line demanding ransom, or else.
The FBI, FTC, and even the NIH warn of similar scams targeting parents and grandparents across the United States. In the last few weeks, these calls have been reported in Arizona, Illinois, New York, New Jersey, California, Washington, Florida, Texas, Ohio, Virginia, and many other states.
An FBI special agent in Chicago told CNN that families in America lose an average of $11,000 in each fake-kidnapping scam.
Here’s what to do if you get that call
Talcove recommends having a family password that only you and your closest inner circle share. Don’t make it anything easily discovered online either – no names of pets, favorite bands, etc. Better yet, make it two or three words that you discuss and memorize. If you get a call that sounds like a loved one, ask them for the code word or phrase immediately.
If the caller pretends to be law enforcement, tell them you have a bad connection and will call them back. Ask for the name of the facility they’re calling from (campus security, the local jail, the FBI), then hang up (even though scammers will say just about anything to keep you on the line). If you can’t reach your loved one, look up that facility’s phone number, or call your local law enforcement and tell them what’s going on.
Remember, these criminals use fear, panic, and other proven tactics to get you to share personal information or send money. Usually, the caller wants you to wire money, transfer it directly via Zelle or Venmo, send cryptocurrency, or buy gift cards and give them the card numbers and PINs. These are all giant red flags.
Also, be more careful than ever about what information you put out into the world.
An FTC alert also suggests calling the person who supposedly contacted you to verify the story: “Use a phone number you know is theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friend,” the agency says on its website.
Seeing it all unfold
“A criminal only needs three seconds of audio of your voice to ‘clone’ it,” Talcove warns. “Be very careful with social media. Consider making your accounts private. Don’t reveal the names of your family or even your dog. This is all information that a criminal armed with deepfake technology could use to fool you or your loved ones into a scam.”
Talcove shared a half-dozen “how-to” video clips, which he says he pulled from the dark web, showing these scams in action. He explained that criminals often sell information on how to create these deepfakes to other fraudsters.
“I keep my eyes on criminal networks and emerging tactics. We literally monitor social media and the dark web and infiltrate criminal groups,” he added. “It’s getting scary. For example, filters can be applied over Zoom to change somebody’s voice and appearance. A criminal who grabs just a few seconds of audio from your [social media feeds], for example, can clone your voice and tone.”
Fooling my relatives with a clone of my husband’s voice
I skipped all the organized-crime parts and just Googled “AI voice clone.” I won’t say exactly which tool I used, but it took me less than ten minutes to upload 30 seconds of my husband’s voice from a video saved on my smartphone to a free AI audio generator online. I typed in a few funny lines I wanted “him” to say, saved the result on my laptop, and texted it to our family. The most challenging part was converting the original clip from a .mov to a .wav file (and even that’s easy, as the sketch below shows).
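To show just how low that last hurdle is: the format conversion takes a few lines using the free, widely available ffmpeg tool. This is a minimal sketch rather than the exact steps I took; the file names are hypothetical, and it assumes ffmpeg is already installed.

```python
# A minimal sketch of the .mov-to-.wav step, assuming ffmpeg is installed.
# File names are hypothetical placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "phone_video.mov",  # source clip saved from a smartphone
        "-vn",                    # drop the video track, keep only the audio
        "-acodec", "pcm_s16le",   # standard uncompressed 16-bit WAV audio
        "-ar", "44100",           # 44.1 kHz sample rate
        "voice_sample.wav",       # output audio file
    ],
    check=True,  # raise an error if the conversion fails
)
```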
It fooled his mom, my parents, and our children.
“We’re all vulnerable, but the most vulnerable among us are our parents and grandparents,” Talcove says. “99-in-100 people couldn’t detect a deepfake video or voice clone. But our parents and grandparents, categorically, are less familiar with this technology. They would never suspect that the voice on the phone, which sounds exactly like their child screaming for help during a kidnapping, might be completely artificial.”
News Source: USA TODAY