Jennifer Jolly  |  Special to USA TODAY



The most powerful people on the planet don’t quite know what to make of AI as it quickly becomes one of the most significant new technologies in history. 

But criminals sure do. 

In the six months since OpenAI first unleashed ChatGPT on the masses and ignited an artificial intelligence arms race with the potential to reshape history, a new strain of cybercriminals has been among the first to cash in. 

These next-gen bandits come armed with sophisticated new tools and techniques to steal hundreds of thousands of dollars from people like you and me. 

“I am seeing a highly concerning rise in criminals using advanced technology – AI-generated deepfakes and cloned voices – to perpetrate very devious schemes that are almost impossible to detect,” Haywood Talcove, CEO of the Government Group at LexisNexis Risk Solutions, a multinational information and analytics company based in Atlanta, told me over Zoom. 


“If you get a call in the middle of the night and it sounds exactly like your panicked child or grandchild saying, ‘Help, I was in a car accident, the police found drugs in the car, and I need money to post bail (or for a retainer for a lawyer),’ it’s a scam,” Talcove said. 

Earlier this year, Canadian law enforcement officials said one man used AI-generated voices, likely cloned from social media profiles, to con at least eight senior citizens out of $200,000 in just three days. 


Similar scams preying on parents and grandparents are popping up in nearly every state in America. This month, several Oregon school districts warned parents about a spate of fake kidnapping calls. 

The calls come in from an unknown caller ID (though even cellphone numbers are easy to spoof these days). A voice comes on that sounds exactly like your loved one saying they’re in trouble. Then they get cut off, you hear a scream, and another voice comes on the line demanding ransom, or else. 

The FBI, the Federal Trade Commission and even the National Institutes of Health warn of similar scams targeting parents and grandparents across the United States. In the past few weeks, it has happened in Arizona, Illinois, New York, New Jersey, California, Washington, Florida, Texas, Ohio and Virginia.

An FBI special agent in Chicago told CNN that families in America lose an average of $11,000 in each fake-kidnapping scam. 

Here’s what to do if you get that call

Talcove recommends having a family password that only you and your closest inner circle share. Don’t make it anything easily discovered online either – no names of pets or favorite bands. Better yet, make it two or three words you discuss and memorize. If you get a call that sounds like a loved one, ask them for the code word or phrase immediately. 

If the caller pretends to be law enforcement, tell them you have a bad connection and will call them back. Ask the name of the facility they’re calling from (campus security, local jail, the FBI), and hang up (even though scammers will say just about anything to get you to stay on the line). If you can’t reach your loved one, look up the phone number of that facility or call your local law enforcement and tell them what’s going on. 


Remember, these criminals use fear, panic and other proven tactics to get you to share personal information or send money. Usually, the caller wants you to wire money, transfer it directly via Zelle or Venmo, send cryptocurrency, or buy gift cards and give them the card numbers and PINs. These are all giant red flags. 

Also, be more careful than ever about what information you put out into the world.

An FTC alert also suggests calling the person who supposedly contacted you to verify the story: “Use a phone number you know is theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friend,” it says on its website. 

Seeing it all unfold

“A criminal only needs three seconds of audio of your voice to ‘clone’ it,” Talcove warns. “Be very careful with social media. Consider making your accounts private. Don’t reveal the names of your family or even your dog. This is all information that a criminal armed with deepfake technology could use to fool you or your loved ones into a scam.”

Talcove shared a half-dozen “how-to” video clips he says he pulled from the dark web showing these scams in action. He explained that criminals often sell information on how to create these deepfakes to other fraudsters. 

“I keep my eyes on criminal networks and emerging tactics. We literally monitor social media and the dark web and infiltrate criminal groups,” he said. “It’s getting scary. For example, filters can be applied over Zoom to change somebody’s voice and appearance. A criminal who grabs just a few seconds of audio from your (social media feeds), for example, can clone your voice and tone.” 

Fooling my relatives with a clone of my husband’s voice

I skipped all the organized crime parts and just Googled “AI voice clone.” I won’t say exactly which tool I used, but it took me less than 10 minutes to upload 30 seconds of my husband’s voice from a video saved on my smartphone to a free AI audio generator online. I typed in a few funny lines I wanted “him” to say, saved the result on my laptop, and texted it to our family. The most challenging part was converting the original clip from a .mov to a .wav file (and even that’s easy). 

It fooled his mom, my parents and our children. 

“We’re all vulnerable, but the most vulnerable among us are our parents and grandparents,” Talcove says. “Ninety-nine in 100 people couldn’t detect a deepfake video or voice clone. But our parents and grandparents, categorically, are less familiar with this technology. They would never suspect that the voice on the phone, which sounds exactly like their child screaming for help during a kidnapping, might be completely artificial.”


Jennifer Jolly is an Emmy Award-winning consumer tech columnist. The views and opinions expressed in this column are the author’s and do not necessarily reflect those of USA TODAY.



