West Virginia files a lawsuit against Apple regarding iCloud’s supposed involvement in the spread of child pornography.

West Virginia Attorney General Sues Apple Over iCloud Child Pornography Claims

On Thursday, the Attorney General of West Virginia sued Apple, alleging that the company’s iCloud service has become, in the words of its own internal communications, “the largest platform for distributing child pornography.”

Republican Attorney General J.B. McCuskey said Apple appears to put user privacy ahead of the safety of children. According to his office, the lawsuit is the first instance of a government agency taking action over child sexual abuse material on Apple’s platform.

In a statement, McCuskey emphasized the gravity of the situation, saying, “These images are a permanent record of a child’s trauma, and each time that material is shared or viewed, that child is re-victimized. This behavior is despicable, and Apple’s inaction is inexcusable.”

In response, Apple said it has implemented a range of features designed to prevent children from sending or receiving explicit images, and that it continuously innovates to address evolving threats and keep children safe.

The company mentioned its industry-leading parental controls, including Communication Safety, which automatically intervenes on children’s devices when nudity is detected in Messages, Shared Photos, AirDrop, and even during live FaceTime calls. These measures, according to Apple, are designed with safety and privacy in mind.

Apple had considered scanning images for abusive material but ultimately dropped the idea over concerns about user privacy and the potential for governments to misuse the technology for censorship or surveillance. A 2020 text message from Apple’s former anti-fraud chief, cited in the complaint, warned that these priorities had made Apple “the largest platform for distributing child pornography.”

The lawsuit was filed in Mason County Circuit Court and seeks both statutory and punitive damages. It also requests that the court compel Apple to enhance its methods for detecting harmful content and to design more secure products.

In contrast, competitors such as Google and Microsoft use a database provided by the National Center for Missing and Exploited Children to match uploaded photos against digital fingerprints of known child sexual abuse material.

Prior to 2022, Apple took a different approach entirely: files uploaded to iCloud were not scanned, but because they were not end-to-end encrypted, law enforcement could access them with a warrant. Plans for end-to-end encryption were reportedly shelved for years after FBI complaints, highlighting the tension between privacy and law enforcement needs.

In August 2021, Apple announced NeuralHash, a system intended to balance user privacy with the need to detect abusive material by scanning images on the device before upload. It drew criticism over the risk of false reports and concerns that it could open the door to government surveillance.

Apple delayed NeuralHash and ultimately abandoned it in December 2022, around the same time it introduced an optional end-to-end encryption setting for iCloud.

The state’s complaint contends that NeuralHash fell short of other tools and could be easily bypassed. Apple does not scan images uploaded to iCloud; instead, it relies on the “Communication Safety” feature that blurs sensitive content on children’s devices.

Federal law requires U.S.-based tech companies to report abuse cases to the National Center for Missing and Exploited Children. According to figures cited by the state, Apple reported 267 incidents in 2023, compared with 1.47 million from Google and 30.6 million from Meta Platforms.

The state’s allegations echo those in a similar class action filed against Apple in federal court in California late last year. In that case, Apple has moved to dismiss the suit, citing immunity under Section 230 of the Communications Decency Act, which broadly shields internet companies from liability over user-generated content.
