San Francisco officials have filed a landmark lawsuit against popular deepfake websites that use artificial intelligence to “undress” images of clothed women and girls.
The city attorney’s office is suing 16 of the most popular AI “undressing” sites, which the complaint says received a total of more than 200 million visits in the first half of 2024 alone.
These websites allow users to upload images of real-life clothed people, which AI then “undresses” and transforms into fake nude images.
“[A]sking her out on a date is a waste of time. [Use our website] to obtain nude photos of her,” one of the deepfake sites states, according to the lawsuit.
According to the lawsuit, the deepfake nudes are created without consent and are being used to blackmail, bully and extort women and girls in California and across the United States.
“This investigation delves into the internet’s darkest corners, and I am terrified for the women and girls who have had to endure this exploitation,” San Francisco City Attorney David Chiu said in a statement.
“This is a big, multifaceted problem that we as a society need to solve as quickly as possible,” he said.
According to the lawsuit, the “undressing” sites violate federal and state laws banning revenge porn, deepfake porn, and child pornography.
The lawsuit also alleges that the defendants violated California’s unfair competition law because “the harm to consumers substantially outweighs the benefits associated with this conduct.”
The City Attorney’s Office is seeking civil penalties and the takedown of deepfake websites, as well as measures to prevent site owners from creating deepfake porn in the future.
The lawsuit references a February 2024 incident in which five students were expelled from a California middle school after creating and sharing AI-generated nude images of 16 eighth-graders.
According to the lawsuit, victims of deepfake nudes said they “feel like they had no choice about what happened to them or what happened to their bodies.”
Another woman said she and her family lived in “despair and constant fear that at any moment these images could resurface and be seen by countless others.”
The lawsuit is the first to tackle deepfake nude generators head-on.
As the AI industry grows, deepfakes – images generated and manipulated by AI – are becoming more mainstream.
AI-generated images can spread misinformation quickly.
In 2023, a deepfake of Pope Francis wearing a white Balenciaga puffer jacket went viral, and many believed it to be real until news outlets refuted it.
But deepfakes are often malicious.
Fake images of nude children were reportedly appearing at the top of search results on Microsoft’s and Google’s search engines, according to an NBC News report from March.
Non-consensual deepfake nudes of celebrities like Taylor Swift are circulating on the web.
AI-generated nude images are also used in sextortion schemes, in which victims are coerced into paying money to prevent the fake images from being made public.
“We need to be clear that this is not innovation, this is sexual abuse,” Chiu said. “We all need to do our part to crack down on bad actors who use AI to exploit and abuse real people, including children.”
