Apple Fails to Prevent Spread of Child Pornography on iCloud

A 27-year-old victim of child sexual abuse is suing Apple for more than $1.2 billion, accusing the company of failing to implement its own system to detect and remove child pornography from its iCloud service.

According to a New York Times report, Apple is facing a major lawsuit seeking more than $1.2 billion in damages from the tech giant. The suit, filed over the weekend in U.S. District Court in Northern California, alleges that Apple failed to protect victims of child sexual abuse by not implementing its own system to identify, remove, and report child sexual abuse material (CSAM) stored on the company's popular cloud storage product, the iCloud service.

The complainant, a 27-year-old woman from the Northeast who is using a pseudonym due to the sensitive nature of the case, was a victim of child sexual abuse from an early age. The abuse was carried out by a relative who photographed it and shared the images with others online. The woman continues to receive notifications, sometimes dozens a day, from law enforcement informing her that individuals charged with child pornography-related crimes have been found in possession of the illegal images.

In 2021, Apple announced a tool called NeuralHash, which would allow the company to scan for known child sexual abuse images by comparing the digital signatures (hashes) of photos stored in users' iCloud accounts against a database of known illegal images. However, the company quickly abandoned the system in the face of criticism from cybersecurity experts who warned that it created a backdoor into the iPhone and could enable government surveillance.
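In rough terms, that kind of hash matching works like the minimal sketch below. It is illustrative only: NeuralHash is a proprietary perceptual hash designed to survive resizing and recompression, whereas the cryptographic SHA-256 digest used here merely stands in for the idea of a digital signature, and the names KNOWN_HASHES, fingerprint, and flag_matches are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical fingerprint database of the kind a clearinghouse
# maintains for known illegal images. The entry below is a dummy
# placeholder, not a real value.
KNOWN_HASHES = {"0" * 64}

def fingerprint(path: Path) -> str:
    # Stand-in for a perceptual hash: a SHA-256 digest of the file bytes.
    # (A real system would use a hash robust to cropping and re-encoding.)
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    # Flag any stored photo whose fingerprint appears in the known set.
    return [p for p in photo_dir.glob("*.jpg")
            if fingerprint(p) in KNOWN_HASHES]
```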

The lawsuit alleges that by introducing and then abandoning the NeuralHash system, Apple broke its promise to protect victims of child sexual abuse and allowed illegal content to spread on its platform. The suit seeks to change Apple's practices and compensate a group of 2,680 potential victims who are eligible to join the case. Under the law, victims of child sexual abuse are entitled to at least $150,000 in damages, so if a jury finds Apple liable, total damages could exceed $1.2 billion.

The lawsuit is the second of its kind, following one filed in August by a 9-year-old girl in North Carolina who was sent child sexual abuse videos through an iCloud link and encouraged to film and upload nude videos of herself. Apple filed a motion to dismiss the North Carolina case, citing Section 230 of the Communications Decency Act, which provides technology companies with legal protections for content posted by third parties on their platforms.

A recent decision by the U.S. Court of Appeals for the Ninth Circuit held that the Section 230 shield applies only to content moderation and does not provide blanket liability protection, so the outcome of these lawsuits could have significant implications for the technology industry. The ruling has raised hopes among plaintiffs' lawyers that tech companies can be challenged in court over how they handle illegal content on their platforms.

Apple defended its practices, saying it is committed to fighting predators who endanger children while maintaining the security and privacy of its users. The company has introduced safety tools to curb the spread of newly created illegal images; for example, the Messages app includes features that can warn children about adult content and report harmful material to Apple.

But critics argue that Apple is putting privacy and profit above the safety of child sexual abuse victims. For years, the company has reported significantly less abusive content than its peers, flagging only a fraction of what Google and Facebook report.

The plaintiff in this case decided to sue Apple because she believed the company had given child sexual abuse victims false hope by introducing and then abandoning the NeuralHash system. As an iPhone user herself, she feels that Apple has chosen privacy and profit over the well-being of people like her, who have suffered tremendously from the spread of the illegal images.

Read more at the New York Times here.

Lucas Nolan is a reporter for Breitbart News, covering free speech and online censorship issues.
