While it remains an open question whether social media, which has created the most expansive public square in history, has been good for humanity, it has given businesses and organizations an easy way to get their messages out and made it easier than ever for like-minded people to connect and form communities that span the globe.
The benefits to society have certainly been tempered by some negative effects. Scams, frauds, screen addiction, biased censorship, and other evils have been rampant in this medium at various times. There is one problem in particular that may worsen as a result of social media: child exploitation.
Over the past two years, numerous state governments have launched investigations into social media companies following widespread reports of inappropriate and sexual content being pushed onto children’s accounts.
One of the first recent investigations was conducted by the New Mexico Attorney General’s Office in late 2023. The investigation sought to examine how Meta platforms (Facebook and Instagram) served content to children.
Using test profiles posing as children (including both teens and pre-teens), state officials found that their accounts were being exposed to pornography and were also being targeted by online predators.
According to reports in The Verge, the test account, posing as a 13-year-old girl, managed to amass 6,700 followers on Instagram, most of them adult men. Three to four times a week, the account received messages containing “photos and videos of genitals, including exposed penises.”
The media outlet said the “dummy” account also reported a number of predatory accounts to the offending platform, but Meta determined that none had violated its community standards.
As a result of the investigation, the New Mexico Attorney General filed a lawsuit against Meta, alleging that “Facebook and Instagram platforms serve as breeding grounds for child predators to target children for human trafficking, distribution of sexual images, grooming and recruitment.”
“The lack of age verification allows teens and pre-teens to easily register unrestricted accounts. Once registered, Meta targets them with harmful and inappropriate content,” the lawsuit alleges.
A Meta spokesperson responded by suggesting that, contrary to the evidence, the company’s technology and tools are sufficient to find and expose predators on its platform.
“We use advanced technology, employ child safety experts, report content to the National Center for Missing & Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators,” the spokesperson said.
Just a few days later, Katherine Blunt of The Wall Street Journal published her own months-long investigation into similar activity on Meta’s platforms. In an interview with Marketplace Tech, she noted how quickly explicit content appeared on her test account.
Blunt said it was “pretty astonishing” how quickly Instagram’s algorithm began recommending sexual content to children as well as adults.
Not only that, but sexually explicit content is also being monetized, she noted.
Facebook’s algorithms reportedly recommended groups with explicitly offensive names like “Little girls” and “Beautiful boys” to Blunt’s dummy accounts. Some of the groups the researchers found were labeled “Incest.” When the researchers reported the groups internally to Meta, the social media giant reportedly said the groups did not violate its community standards.
Blunt reported that Meta has formed an internal task force to focus on these issues and manually remove “problematic” accounts at scale. The company reiterated its response to the New Mexico investigation, saying it has used technology to flag how often users view certain groups or engage with children’s accounts. Meta also claimed to be removing hashtags related to pedophilia.
But Meta’s fight to control predators is complicated by an entirely different issue: parents profiting from sexual content featuring their children.
Troublingly, according to a February 2024 report in The Wall Street Journal, Meta has knowingly allowed parents to profit from this despicable practice.
According to the report, certain “parentally-controlled minor accounts” were allegedly selling material to adult male audiences, including photos of children in revealing clothing, exclusive chat sessions, and used clothing such as leotards and cheerleader costumes.
The report further alleges that Meta staff knew these parents were having sexually-charged conversations about their children, and sometimes even went so far as to get the parents to respond to sexually-charged messages sent to them by subscribers.
The New York Times conducted its own investigation into parent-run accounts and noted that they can make up to $3,000 per post. The report said branded posts on Instagram are boosted by the platform’s algorithm, exposing the accounts to even more predators.
These reports were so damning that Democratic Senator Maggie Hassan wrote letters to TikTok, X, and Meta demanding that they reveal whether the platforms were monetizing the girls’ accounts and whether they knew that children were getting around age restrictions. When contacted by The New York Times about the allegations, Meta said it planned to use its monetization tools to identify accounts exhibiting suspicious behavior and restrict their access to subscription content.
Perhaps not surprisingly, these reassurances from Meta did little to ease people’s fears, and research into online child abuse activities continued. It didn’t take long for researchers to learn that little had changed, despite Meta’s promises.
The Wall Street Journal conducted a separate study to test Meta’s response. In a joint effort with Professor Laura Edelson of Northeastern University, the Journal once again created a new test account posing as a 13-year-old.
Their research found that within the first three to four minutes, Instagram began pushing adult sexual content to the teen account, and it took only another 20 minutes for the algorithm to surface promotions for explicit adult creators in the feed, some of whom were selling nude photos.
The outlet reported that similar tests on other platforms, including Snapchat and TikTok, did not produce the same results for accounts held by minors.
“Even the adult content on TikTok appears to be much less explicit than the teen content on [Instagram],” Edelson said.
In response, Meta defended itself even more vigorously than it had against the previous accusations.
“This was an artificial experiment that doesn’t match the reality of how teenagers use Instagram,” spokesman Andy Stone said.
“As part of our long-standing work on youth issues, we have begun efforts to further reduce the amount of sensitive content teens may encounter on Instagram, and have significantly reduced that amount over the past few months,” he said.
Rick Lane, a tech policy expert and child safety advocate, told Blaze News that companies like Meta, Reddit, and Google have known about these dangers since their inception, and the situation is only getting worse.
Government intervention
In March 2023, before most of these reports had surfaced, the Utah state government banned anyone under the age of 18 from using social media without parental permission.
The Utah Social Media Regulation Act implemented restrictions such as age verification on social media, a ban on advertising to minors, and a social media curfew that bars minors from accessing sites between 10:30 p.m. and 6:30 a.m.
The law also requires social networks to give parents access to their teenage children’s accounts, according to a CNN report.
In March 2024, Florida followed suit, banning social media for those under 14; 14- and 15-year-olds are also required to obtain parental permission to use the platforms. As these threats persist, determining the appropriate government response may be difficult. Many observers believe it is time for governments to impose absolute age restrictions on social media, just as they do for nicotine, alcohol, and other regulated products.
“There will always be bad actors and predators on these social media sites, but Meta has long been pushing sexualized content to all of its users,” explained Return’s Peter Gietl.
“Instagram seems to be degenerating into softcore pornography for many users and I absolutely support a minimum age restriction, not just because of the sexual content but also because of the extremely harmful impact it can have on the mental health of pre-teens and teens,” he added.
BlazeTV contributor and mother Sara Gonzalez was adamant that she would not allow her children to access social media, saying the dangers of these platforms were “clear” and that the companies were “not doing enough at all” to prevent sexual material from reaching children.
“The real problem in this country is that parents are willing to give their impressionable young children access to such dangerous content for the sake of convenience,” she added.
But Lane said he didn’t believe the solution was to ban children from social networks, despite the reality of child sexual exploitation online; instead, he called for sites to build in age-appropriate features and safety measures from the start.
As long as children continue to be exploited and groomed online, calls for age-restriction measures will continue.
It’s time for a national discussion about the costs and benefits of children’s use of social media. Given all the problems it has caused, and the fact that previous generations managed just fine without it, it is hard to see what benefits it actually offers young people.