AI voice clone is so realistic that its makers say the ‘potential risks’ are too dangerous to release

The experts were left speechless.

Microsoft researchers have developed an artificial intelligence text-to-speech program with human-level realism.

It’s so realistic that its creators are keeping the high-tech tool “purely a research project” and have not released it for public use.

Microsoft has announced a new text-to-speech tool that’s so realistic it’s not yet safe for the general public to use. OleCNX – stock.adobe.com

The program, called VALL-E 2, is the first AI voice program to achieve human parity. In other words, its output is indistinguishable from human speech.

Until now, more rudimentary voice programs could be identified as AI through subtle nuances in their speech.

Most notably, according to the accompanying research paper, VALL-E 2 is said to “provide exceptional clarity to previously difficult sentences, including complex or repetitive phrases.”

High-performance AI voice clones have reached human levels. Garry Killian – stock.adobe.com

It can also perfectly recreate a voice after hearing just three seconds of audio.

The program “outperforms previous systems in voice robustness, naturalness, and speaker similarity,” the researchers noted.

Its developers have good intentions, intending it as a medical and social aid for people with aphasia and similar conditions that affect speech.

Specifically, the researchers note that VALL-E 2 “has potential uses in educational learning, entertainment, journalism, self-authored content, accessibility features, interactive voice response systems, translation, chatbots, and more.”

But they are well aware of the potential for abuse of such a powerful tool.

“Misuse of this model could pose potential risks, such as spoofing voice identification or impersonating a specific speaker,” the researchers warned.

For this reason, “there are no plans to incorporate VALL-E 2 into products or release it to the public.”

A hyper-realistic AI, VALL-E 2 can replicate speech with human-level believability. Microsoft

Voice spoofing, the practice of faking a voice over the phone or other remote communications, is a growing concern as AI programs become more widely available; Apple has cited it as a top concern amid a rise in phishing.

Scammers mainly target the elderly, but some mothers have received fake ransom calls claiming their children had been kidnapped, with cloned voices convincing enough that they believed the caller was their own child.

Experts like Lisa Palmer, a strategist at consulting firm AI Leaders, recommend that family members and loved ones create closely guarded spoken passwords that can be used over the phone to verify identity whenever a call seems suspicious.
