A lawyer has issued a warning about an elaborate AI voice cloning scam that tried to trick his own father into handing over $35,000.
The scammers pretended to be Jay Schuster, 34, and called his father, Frank, 70, convincing him that his son had been in a serious car accident, had been arrested and needed bail.
Frank, himself a former lawyer, said he was horrified, convinced that the “hysterical” caller was his son, and was left deeply shaken by the scam.
Jay is running for Florida's 91st House District and believes a scammer cloned his voice using audio from his 15-second TV campaign ad.
Frank, also from Boca Raton, Florida, who was visiting his daughter in New York at the time, said:
“That was my son, Jay. He was hysterical, but I recognized his voice right away.
“He said he had been in an accident, suffered a broken nose and received 16 stitches, but was in police custody after testing positive for alcohol.
“He blamed it on the cough syrup he had taken earlier.”
On September 28, the imposter, posing as Jay, begged Frank not to tell anyone about the situation.
Shortly afterward, Frank received a phone call from a man identifying himself as “Mike Rivers,” who claimed to be a lawyer and told him he needed to post $35,000 in cash bail to keep Jay from spending several days in jail.
The fraud escalated when “Rivers” instructed Frank to pay the bail via a cryptocurrency machine. This was an unconventional request, and Frank's suspicions grew.
“When he told me to go to the Coinbase machine at Winn-Dixie, I was skeptical,” Frank said. “I couldn't understand how that was part of the legal process.”
Frank finally realized something was wrong after his daughter Lauren, Jay's twin sister, and a friend learned that AI voice cloning scams were on the rise.
Eventually he hung up.
“It's shocking when you get a call like that,” Frank said.
“My son has worked so hard and I was beside myself thinking that it could ruin his career and campaign.”
Jay, who has lectured on such fraud cases as a lawyer, was shocked to find himself a target.
He speculated that the scammers had copied his voice from a recent campaign ad that aired on television a few days before the incident.
“I’ve been focused on AI and its impact on consumers, but nothing can prepare you for when it happens to you,” Jay said.
“They did their research. They didn't use my phone number, which is consistent with the story that I was in prison and didn't have access to a phone.”
Jay was stunned by the sophistication of the scam.
“All it takes is a few seconds of someone's voice,” he said.
“Technology is so advanced that they could have easily extracted my voice from a 15-second campaign ad.
“There is also other video footage of me online, so they may have used that to replicate my voice.”
Jay advocates for changes to AI regulations to prevent such scams from harming others.
“There are three key policy solutions we need,” he said. “First, AI companies must be held accountable if their products are misused.
“Second, companies should require authentication before duplicating someone’s voice. And third, AI-generated content should be watermarked, whether it’s a cloned voice or a fake video, so it’s easily detectable.”
If elected to the Florida House of Representatives, Jay plans to take action against the growing abuses of AI technology, including voice cloning fraud.
He aims to introduce legislation to hold AI companies accountable for abuses and ensure necessary safeguards are put in place, such as voice authentication and watermarking.
“We need clear regulations to stop this type of crime from happening,” Jay said. “This is not just a technology issue, it’s about protecting people from the trauma and financial harm that these scams can cause.
“I would like to see stricter requirements for AI developers to ensure their tools are not used maliciously.”
As AI technology rapidly advances, Jay and Frank hope their story serves as a warning to others to be on guard.
“This shows how important it is to stay calm and think things through carefully,” Frank said. “If something doesn't make sense, we need to listen and ask questions. These scams are getting more sophisticated, but we can't let our guard down.”
