According to an unredacted version of the lawsuit released Tuesday, Snapchat targeted underage users even as employees internally discussed how to handle the crisis without causing panic. The complaint says the company failed to properly warn users about the scale of the rampant “sextortion scheme.”
The new details were revealed in a complaint first filed last month by New Mexico Attorney General Raul Torrez.
The lawsuit alleges that the photo-sharing app, popular with children, is a major platform for online sex offenders who coerce minors into sending graphic images and then use those images as blackmail.
According to the latest complaint, a member of the company's trust and safety team wrote in a November 2022 email that internal data showed Snap received “approximately 10,000 sextortion reports from users each month.” Employees described the situation as “extremely alarming.”
According to the complaint, another employee responded that while the data was “a lot,” it likely represented only “a small fraction of the abuse” actually occurring on the app, because sextortion is an “embarrassing issue” for users to report.
“It's disappointing to see Snap employees raise so many red flags that continue to be ignored by executives,” Torrez said in a statement Tuesday.
The unredacted lawsuit also includes an internal Snap marketing brief from December 2022 acknowledging that “sexting and sending nudes have become common practices” among users and could “result in disproportionate consequences and serious harm.”
The document called on Snap to provide information about the risks to users “without creating fear in Snapchat users,” according to the complaint.
“We cannot tell our viewers not to send nudes. This approach is likely wasteful, ‘tone-deaf’ and unrealistic,” the document says. It instead suggested advising users that, if they do send such images, the photos should not show their face, tattoos, piercings, or other identifying physical characteristics.
A Snap spokesperson said the app has “built-in safety guardrails” and was designed with “intentional design choices to make it difficult for strangers to discover minors on our service.”
“We have evolved our safety mechanisms and policies, from leveraging advanced technology to detect and block specific activity, to unfriending suspicious accounts, to working with law enforcement and government agencies, and we continue to do so,” the spokesperson said in a statement.
“We take great care in our work here, and it pains us when bad actors abuse our services,” the statement added.
Snapchat, known for its disappearing messages, is one of several social media apps that have drawn the ire of lawmakers for failing to protect children online.
As reported by the Post, Snap broke ranks with other social media companies by supporting the Kids Online Safety Act, a bipartisan bill that would impose a legal “duty of care” on companies to ensure their apps do not facilitate child sexual abuse and other online harms.
In March 2022, a Snap consultant warned company employees that the “ephemeral nature of Snap” could lull young users into a “false sense of privacy.”
Elsewhere in the New Mexico complaint, a Snap executive emailed a colleague in 2022 expressing concern that, although the company does not allow children under 13 to use Snapchat, it has no ability to “actually verify” a user's age.
“[T]he app, like many other platforms, does not use an age verification system, so any child who knows how to enter a fake birthday can create an account,” the executive said.
In August 2022, Snap employees discussed the importance of taking steps to “ensure that user reports of grooming and sextortion do not continue to leak.”
Other employees responded to the email, with one noting that a particular user's account had received “75 different reports mentioning nudity, minors, and extortion since October 2021,” yet the account was still active.
At one point, a fed-up Snap employee vented that the app was “now being overrun by this sextortion.”
“We've been twiddling our thumbs and wringing our hands for the past year,” the employee said, according to the complaint.
Last December, the New Mexico Attorney General's Office sued Meta, the parent company of Facebook and Instagram, for failing to protect children from sex offenders on its apps.
