The most powerful people on the planet don’t quite know what to do with AI, even as it quickly becomes one of the most important new technologies in history.
But criminals sure do.
In the six months since OpenAI first released ChatGPT to the masses and ignited an AI arms race with the potential to reshape history, a new breed of cybercriminal has been among the first to take advantage.
Next-generation bandits come armed with cutting-edge tools and technology to steal hundreds of thousands of dollars from people like you and me.
“I see a very worrying rise in criminals using advanced technology, AI-generated deepfakes and cloned voices, to perpetrate highly deceptive schemes that are almost impossible to detect,” Haywood Talcove, CEO of the government group of LexisNexis Risk Solutions, an Atlanta-based multinational information and analytics company, said in a Zoom interview.
“If you get a call in the middle of the night and it sounds just like your panicked son or grandson saying, ‘Help, I’ve been in a car accident, the police found drugs in the car, and I need money to post bail (or to hire a lawyer),’ it’s a scam,” Talcove explained.
Earlier this year, law enforcement officials in Canada said a man used AI-generated voices, likely cloned from social media profiles, to scam eight seniors out of at least $200,000 in just three days.
Similar scams have targeted parents and grandparents in almost every state in America. This month, several Oregon school districts warned parents about a string of fake kidnapping calls.
The calls come from an unknown caller ID (though it’s easy to spoof cellphone numbers these days). A voice comes on that sounds exactly like someone in your family saying they’re in trouble. Then they’re cut off, you hear a scream, and another voice comes on the line demanding a ransom.
The FBI, Federal Trade Commission, and even the National Institutes of Health are warning of similar scams targeting parents and grandparents across the United States. In the past few weeks, it has happened in Arizona, Illinois, New York, New Jersey, California, Washington, Florida, Texas, Ohio, Virginia, and many other states.
A Chicago-based FBI special agent told CNN that families in America lose an average of $11,000 per fake kidnapping.
Here’s what to do if you get this call
Talcove recommends having a family password that you and your inner circle share. Don’t make it anything easy to discover online: no pet names, favorite bands, etc. Better yet, make it two or three words that you discuss and memorize. If you get a call that sounds like a family member, immediately ask for the password or phrase.
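If you’d rather take the guesswork out of choosing those words, a computer can pick them for you. Here’s a minimal Python sketch using the standard library’s secrets module; the word-list path and the three-word length are just illustrative assumptions, not part of Talcove’s advice:

```python
import secrets

# Load a word list. The path below is an assumption: many Unix-like systems
# ship a list at /usr/share/dict/words, but any plain-text file with one
# word per line will do.
with open("/usr/share/dict/words") as f:
    words = [w.strip().lower() for w in f if w.strip().isalpha()]

# secrets.choice() draws from a cryptographically secure random source,
# so the phrase can't be guessed from anything you've posted online.
passphrase = " ".join(secrets.choice(words) for _ in range(3))
print(passphrase)  # e.g. "copper lantern mischief" -- discuss and memorize it
```

The point of random words is that they have no connection to your life, so a scammer scraping your social media can’t reconstruct them.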
If the caller claims to be law enforcement, say you have a bad connection and will call right back. Ask the name of the facility they’re calling from (campus security, the local jail, the FBI), then hang up, even though scammers will say almost anything to keep you on the line. If you can’t reach your loved one, look up that facility’s phone number, or call local law enforcement and tell them what’s going on.
Remember, these criminals use fear, panic, and other proven tactics to convince you to share personal information or send money. Typically, the caller wants you to wire or transfer money directly via Zelle or Venmo, send cryptocurrency, or buy gift cards and give them your card numbers and PINs. These are all giant red flags.
Also, be more careful than ever about the information you put out into the world.
The FTC also suggests verifying the story by contacting the person who supposedly called you. “Use a phone number that you know is theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friends,” the agency states on its website.
See it all unfold
“A criminal needs only three seconds of your voice to ‘clone’ it,” warns Talcove. “Be very careful with social media. Consider making your accounts private. Don’t reveal the names of your family members or even your dog. That is all information a criminal armed with deepfake technology could use to pull you or your loved ones into a scam.”
Talcove has shared half a dozen “how-to” videos that he says he pulled from the dark web to show these scams in action. He explained that criminals often sell other scammers instructions on how these deepfakes are created.
“I keep my eyes on emerging criminal networks and tactics. We are literally monitoring social media and the dark web and infiltrating criminal groups,” Talcove said. “It’s scary. For example, filters can be applied to Zoom to change someone’s voice and appearance. A criminal can grab just a few seconds of audio from your social media feeds, for example, and reproduce your voice and tone.”
I fooled my relatives with my husband’s voice
I skipped the organized-crime route and Googled “artificial intelligence voice cloning.” I won’t say exactly which tool I used, but it took me less than ten minutes to upload 30 seconds of my husband’s voice, pulled from a video saved on my smartphone, into a free online AI voice generator. I wrote a few funny lines I wanted “him” to say, saved the clips to my laptop, and sent them to our family. The most challenging part was converting the original clip from a .mov file to a .wav file (and even that was easy).
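For the curious, that conversion step can be done in seconds with the free ffmpeg tool. Here’s a minimal Python sketch, assuming ffmpeg is installed and on your PATH; the file names are placeholders, and this is only the audio extraction, not the voice-cloning tool itself:

```python
import subprocess

# Pull the audio track out of a phone video and save it as a WAV file.
# Assumes the ffmpeg command-line tool is installed; "clip.mov" and
# "clip.wav" are placeholder filenames.
subprocess.run(
    ["ffmpeg",
     "-i", "clip.mov",  # input: the .mov video saved from the smartphone
     "-vn",             # drop the video stream, keep only the audio
     "clip.wav"],       # output: uncompressed WAV audio
    check=True,
)
```

That such a conversion is a one-liner is exactly the point: none of the steps in this experiment required any technical skill.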
Jennifer Jolly’s AI Voice Generator Sample.
I fooled his mother, my father, and our children.
“We are all vulnerable, but the most vulnerable among us are our parents and grandparents,” Talcove says. “Ninety-nine out of 100 people couldn’t spot a deepfaked video or audio reproduction. But our parents and grandparents, emphatically, are not familiar with this technology. They would never suspect that the voice on the phone, which sounds exactly like their child crying for help in a kidnapping, might be completely artificial.”
Jennifer Jolly is an Emmy Award-winning consumer technology columnist. The views and opinions expressed in this column are those of the author and do not necessarily reflect the views and opinions of USA TODAY.