
'Nightmarish Scenario': Creeps Can Produce AI-Generated Deepfake Nudes of Anyone with a Few Clicks

Online AI chatbots allow users to generate explicit nude photos of real people in just a few clicks, prompting experts to warn of a looming 'nightmare scenario.'

Recent research by Wired revealed a disturbing trend on the messaging app Telegram. Dozens of AI-powered chatbots allow virtually anyone to create deepfake nude images and videos. These bots, reportedly used by an estimated 4 million people each month, can remove clothing from provided photos or generate explicit content depicting individuals engaging in sexual activity.

Deepfake expert Henry Ajder, who first discovered this underground world of Telegram chatbots four years ago, has seen a rapid increase in users actively creating and sharing this type of content, and expressed serious concerns. "These tools are really ruining lives and creating a very nightmarish scenario, mainly for young girls and women. The fact that they are so easy to access and find on one of the biggest apps in the world, on the surface web, is really alarming," Ajder told Wired.

Celebrities such as Taylor Swift and Jenna Ortega have fallen victim to the rise of porn deepfakes, and teenage girls have recently been targeted as well, with reports of deepfake nude photos being used in 'sextortion' incidents. Research even revealed that 40% of U.S. students report deepfakes circulating within their schools.

The proliferation of deepfake sites driven by advancements in AI technology has led to increased scrutiny from lawmakers. In August, the San Francisco Prosecutor's Office filed charges against more than a dozen "undressing" websites. When contacted by Wired, Telegram did not respond to questions about the explicit chatbot content, but the bots and associated channels suddenly disappeared, even as their creators vowed to "create another bot" the next day.

Emma Pickering, head of technology-facilitated abuse and economic empowerment at UK-based domestic violence organization Refuge, emphasized that these fake images can cause psychological harm, including trauma, humiliation, fear, embarrassment, and shame. She noted that this type of abuse is becoming increasingly common in intimate partner relationships, but perpetrators are rarely held accountable.

Read more at Wired here.

Lucas Nolan is a reporter for Breitbart News, covering free speech and online censorship issues.
