AI voice cloning scams on the rise, expert warns

Scammers are increasingly using artificial intelligence (AI) tools to duplicate the voices of targeted individuals found on social media, place panicked phone calls to their family and friends, and convince unwitting recipients to hand over money or access to sensitive information.

Mike Schumack, chief innovation officer at identity theft prevention and credit score monitoring company IdentityIQ, told FOX Business, “AI has been around for a long time, and software companies have been working for some time to advance the technology. We’ve been using AI, and it started creeping into this kind of cybercrime space slowly, but all of a sudden in the last year or so it’s ramped up very quickly.”

“We’ve seen a lot of advanced phishing and targeted phishing, where AI is being used to generate very specific emails with very specific information about who the target is, using specific language,” he added. “AI voice cloning scams have also been on the rise over the past year, which is a very frightening topic.”

Scammers who run voice cloning scams record people’s voices or find audio clips on social media or elsewhere on the internet. “All you need is three seconds. Ten seconds is even better to get a very realistic voice clone,” Schumack explained. The audio sample is then run through an AI program that replicates the voice, which can be made to say whatever the scammer types. Depending on the scam’s script, the cloned audio can even be given laughter, fear, or other emotions.


IdentityIQ’s Mike Schumack warns that individuals and families should take precautions as AI-powered voice cloning scams are on the rise. (Photo by Annette Riedl/picture Alliance via Getty Images / Getty Images)

To demonstrate how sophisticated AI voice cloning programs are, IdentityIQ took audio samples from an interview the author of this article gave on the “Fox News Summary” podcast this spring. It used that voice sample to create an AI voice clone of him making a panicked phone call to a family member, asking for money to be sent via a cash app after a hypothetical car accident.

“Mom, I need to talk to you. I was scheduled to interview someone today for a novel I’m working on, but I got into a car accident. I’m okay, but I need help right now. I hit the bumper of another car. I need $1,000 to repair the damage, or they’ll call the police and report it to my insurance. I need the money now. I need your help. Can you send me $1,000 on Zelle? I’ll show you how,” the voice clone said.

Schumack pointed out that voice-cloned calls from scammers are typically shorter than this example. They are used to relay requests for money, account access, or other confidential information, and the scammer will often say something like “I can’t talk right now” to cut the conversation short and avoid scrutiny.

“The scammer’s goal is to trigger your fight-or-flight response, to instill a sense of urgency in your mind that your loved one is in some kind of trouble,” he explained. “So the best way to deal with these situations is to hang up and immediately call your loved one to see if it’s really them.”



According to IdentityIQ, criminals only need a few seconds of an audio clip of someone speaking, often extracted from social media, and can clone an eerily similar voice. (FNC / Fox News)

Schumack cited a recent example from an interview IdentityIQ conducted with a man who received what appeared to be a panicked call from his daughter at camp, but was actually an AI-generated voice clone of her. The scammers had found a post on social media about the daughter going camping and used it to make the call more realistic.

Fraudsters running AI voice scams also use AI programs to scour the internet for information about individuals and businesses, including audio and video posted on social media and elsewhere, Schumack noted. Without the victim knowing, they gather details that can be used to make a more convincing call.

“The scary thing is, it’s not your neighbor doing this… This is a sophisticated organization; it’s not one person doing it. They’re doing research on social media, they’re collecting data on people. The people replicating your voice aren’t the same people actually making the phone call. And if the scam is working, someone else will come to the victim’s house and take the money.”



IDIQ’s Mike Schumack recommends that families be careful about what they post online and be wary of responding to panicked phone calls requesting urgently needed funds. (Photo by CHRIS DELMAS/AFP via Getty Images / Getty Images)

As for steps individuals can take to avoid falling victim to AI voice cloning scams, Schumack said to be aware that what you post online is visible to the public, and to think carefully before responding to a panicked call from an unknown number, even one ostensibly from someone you know.

“Be careful about what you post online. That’s the first step,” Schumack said. “The second step is to be cautious in general if you get a call from a number you don’t know and it sounds like someone you love. If there’s a sense of urgency, it’s a red flag.”


Schumack encouraged families to consider adopting some type of password, a phrase family members can use to verify that someone calling about any kind of emergency is truly a member of the family.
