Social media is deeply broken. Bad actors are exploiting it while the platforms turn a blind eye. Children are suffering, and our leaders are not taking any serious steps to address the problem.
The House of Representatives recently concocted a legally dubious bill that would force the owners of TikTok to sell the company, which would be removed from app stores if the sale does not happen quickly. It is a well-intentioned, bad bill that aims to achieve its objectives by stifling free speech and putting the government's thumb on the scale of free enterprise. Nevertheless, the bill passed with bipartisan support and was signed by President Biden. And, unsurprisingly, it was quickly challenged in the courts, where it will continue to be litigated for years to come.
There is also a bipartisan effort underway to repeal Section 230 of the Communications Decency Act, a 1996 provision that very simply states that platforms are generally not liable for user-generated content. Repealing it would pave the way for a nebulous future in which every platform is liable for everything said on it.
We enjoy being able to comment on news articles. We enjoy sharing our opinions online with friends, family, and strangers. Repealing Section 230 would throw all of this into question, because platforms would suddenly be legally liable for what their users say. Anyone who disliked criticism posted by a platform's users could sue the platform for enabling that criticism to be voiced.
That legal risk is too great for any company to bear, and it would chill exactly the kind of healthy dialogue and debate the First Amendment is intended to protect, dialogue that underpins our democracy.
There is no doubt that we should be wary of the Chinese Communist Party's direct or indirect influence over addictive apps used by billions of people. There is equally no doubt that social media platforms should act responsibly, rather than hide behind Section 230's exemptions while editorializing content and churning out dopamine-inducing feeds for the general public. Following the letter of the law while blatantly trampling on its spirit lets the worst actors produce the most offensive content imaginable, content that violates every standard of decency and jeopardizes Section 230's good-faith protections. The platforms know it.
Dozens of children have died attempting dangerous TikTok challenges such as the blackout challenge and eating laundry pods, and Facebook and Instagram algorithms designed to reinforce users' biases are allegedly steering predators toward child-oriented content.
Suicide is the second leading cause of death among young people ages 13 to 17. Research has linked image-based, feedback-driven social media such as Instagram to the depression that underlies suicide, even among users who limit the time they spend on social platforms.
More fundamental change is needed before we legislate who owns what, who is legally responsible for what, and where it is said. Platforms should never be allowed to target children, make money from them, or exploit them with algorithms. Platforms should not treat children like a commodity to be sold to the highest bidder.
My new platform, Hedgehog, does not allow minors at all. Why shouldn't other companies do the same? Perhaps because they make billions of dollars a year targeting kids.
Many states have taken steps to limit children's access to social media. But this runs into the same problem seen in the NetChoice cases recently heard by the Supreme Court: a single state can implement laws that effectively set national and even global standards. A patchwork approach won't work; it will open up more dangerous loopholes to exploit and make compliance and competition harder.
I believe in limited government, limited regulation, and free markets. But as long as we have a government and expect it to legislate, this is the time and the place. Congress must act to protect children with reasonable limits on age of use, time spent, ad targeting, and the exploitation of children by social media companies. Focus on real harms, not on what is politically popular.
Instead, Congress is, once again, dead wrong, producing election ads instead of actual policy. And while the adults argue, the algorithms keep harming kids.
John Matze is the founder and former CEO of Parler, now Hedgehog.
Copyright 2024 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.