Teen girls file a lawsuit against Elon Musk’s xAI regarding sexual images of them created by the Grok chatbot.

Teenage Girls Sue xAI Over Grok Chatbot

Three teenage girls are taking legal action against Elon Musk’s xAI, claiming that the Grok chatbot enabled the creation of explicit images using their photos, which then circulated on social media.

The lawsuit, filed in the Northern District of California, reveals that one suspect arrested in December had gathered images and videos of over 18 girls, many from the same school, using Grok to “undress” minors.

These manipulated images ended up on platforms like Telegram and Discord. One example was an Instagram photo altered so that a girl appeared without her blue bikini; such images fueled the exchange of AI-generated sexual content among the perpetrators, the complaint notes.

Annika Martin, an attorney representing the plaintiffs, said that the children’s school and family photos were transformed into abusive material by the AI tools of a multibillion-dollar company and then exploited by predators. It seemed, she said, as though Musk and xAI had designed Grok to generate sexual content without regard for the impact on the children and adults involved.

“These girls’ lives were shattered by a devastating loss of privacy and a deep sense of violation that no child should ever experience,” she added.

xAI has not yet commented on the lawsuit.

Back in January, Musk had commented on the backlash Grok received for generating explicit images, asserting, “I’m not aware of any images of naked minors that Grok has produced. Literally zero.” He also mentioned that the chatbot is programmed to avoid creating anything illegal or depicting real people in revealing clothing, though he acknowledged that hackers could interfere with the system.

In November, Grok introduced a “spicy” mode, allowing the creation of content deemed “Not Safe for Work,” typically sexual or violent imagery.

According to the lawsuit, one plaintiff, referred to as Jane Doe 1, received a direct message on Instagram in December alerting her to explicit photos online that depicted her in sexualized poses, along with AI-generated videos showing her undressing completely.

One image was created from a photo of her at a homecoming event, and another appeared to be an altered version of her yearbook photo that made her look topless. She reported experiencing severe psychological distress, including nightmares, trouble eating and sleeping, and anxiety around attending school.

The complaint also mentioned that she received a link to a Discord server containing explicit images and videos of at least 18 other underage girls, several of whom she recognized from her school.

Authorities have begun a criminal investigation into the suspect arrested in December, though specific charges were not detailed. Two other minors involved in the lawsuit discovered that their images had also been manipulated into abusive content.

The plaintiffs are seeking damages for violations of child pornography laws, arguing that altering photos of real children to create sexual content constitutes child pornography.

Moreover, the lawsuit claims that xAI intentionally allowed its chatbot to produce sexual images of minors in order to profit from its AI technology. The incident could cause lasting psychological harm to the victims, who may receive lifetime notifications from the National Center for Missing and Exploited Children indicating that “the criminal defendants possessed, received, and distributed CSAM files depicting them.”
