Meta’s decision to axe fact-checking system, adopt Musk-like policy is a big ‘win’ for free speech: Experts

Meta's decision to lift content restrictions and replace its fact-checking program with a system similar to X's Community Notes has been hailed by experts as a major “victory” for free speech.

While some critics remain skeptical that Meta's reforms will lead to substantive change, MRC Free Speech America Vice President Dan Schneider told Fox News Digital that supporters of the First Amendment should take the news as a victory.

"The changes [Meta CEO Mark Zuckerberg] has implemented so far have replaced some of the most radical people in Silicon Valley with people like Joel Kaplan and Kevin Martin, the number two and number three at the company," Schneider said. "It was systematic and long-term. These are big wins."

Chris Mattmann, UCLA's chief data and artificial intelligence (AI) officer, told Fox News Digital that Zuckerberg "should be celebrated" for the changes coming to Facebook, Instagram and Threads.

Meta CEO Mark Zuckerberg announced Tuesday that the company is introducing a new fact-checking system similar to Elon Musk's Community Notes on X. (Chris Unger/Zuffa LLC | Jonathan Lahr/NurPhoto | Andrew Harnik/Getty Images)

"Without Elon [Musk] buying Twitter, renaming it X, and [Donald] Trump's election, [this may not have happened]," he said.

But not everyone was thrilled with the news. Fact-checking organizations, liberal media commentators and other critics have scoffed at the claims of political bias and suggested that Meta has abdicated its responsibility for content moderation. The New York Times even highlighted fact-checkers who balked at Meta's claims.

Scott Baradel, author of Trust Signals: Brand Building in a Post-Truth World, compared Meta's decision to removing the referees from the field and hoping the players still play fairly. He told Fox News Digital that the move "raises serious questions about whether big tech companies are retreating from their responsibility to balance free speech with the need for public trust in the digital age."

"Mark Zuckerberg's words are lofty, and he's certainly right that third-party fact-checking has issues with bias, but to be honest, he's probably choosing the path of least resistance in the wake of Trump's victory," he continued.

Meta's third-party fact-checking program was introduced after the 2016 election and, executives said, was used primarily to moderate content and rein in misinformation on the platform amid "political pressure," though they admitted the system had gone "too far."

Social media apps on an iPhone home screen. (Kurt "CyberGuy" Knutsson)

The program has since drawn the ire of conservatives, who accused the platform of politically driven censorship and cited several examples in which content was suppressed. These include the New York Post's bombshell report about Hunter Biden's laptop, as well as certain content regarding the coronavirus, which Zuckerberg has since admitted was a mistake made under pressure from the Biden White House.

“We went to independent third-party fact checkers,” Joel Kaplan, Meta's chief global affairs officer, told FOX News Digital in an interview Tuesday morning. “They can basically fact-check everything they see on the platform, so it's become clear that there's too much political bias in what they choose to fact-check.”

Mattmann, who previously served as chief technology officer at NASA's Jet Propulsion Laboratory (JPL), said there is some credence to accusations of left-wing bias and inaccuracy among Meta's fact-checkers. But he said another takeaway was Zuckerberg's decision to stop demoting content that fact-checkers have flagged or rated.

Kaplan told Fox News Digital that Meta will also change some of its own content moderation policies, particularly those he felt were "too restrictive and don't allow enough discussion of sensitive topics like immigration, transgender issues, and gender."

The Meta platform appears on a smartphone screen with the Meta logo in the background on August 9, 2024 in Chania, Greece. (Nicholas Cocobris/NurPhoto via Getty Images)

Kaplan also revealed that Meta's current automated systems make "too many mistakes," removing content that "doesn't even violate our standards."

Judah S. Engelmayer, CEO and president of HeraldPR, told Fox News Digital that the problem with Meta and other major technology platforms, whether ongoing or resolved, is that fact-checkers have worked with the platforms to carry out censorship, sometimes based on personal opinions or ideological agendas.

"For example, the debate over whether the coronavirus originated in a Chinese lab should never have been censored just because some people thought it was offensive or politically sensitive," he said.

"Determining whether a virus is deadly, or whether we need vaccines or masks, requires scientific debate and evolving data. Silencing opposing views based on fact-checkers' sense of what is best for the public undermines free speech," Engelmayer continued.

The New York Times caused a stir when it featured fact-checkers disagreeing with comments by Meta CEO Mark Zuckerberg. (New York Times building photo courtesy of CAMERA | Zuckerberg photo courtesy of Kent Nishimura)

Mattmann said platforms will get better as companies like Meta move toward more of an "open systems mindset" and "shine a light" on their internal processes. Previously, Meta limited the reach of content that had been rated poorly by fact-checkers or that contained specific keywords.

Mattmann suggested that by moving to an approach similar to Community Notes, platform users will see more content and gain a deeper understanding of why moderation decisions are made, rather than relying on the "context" supplied by fact-checkers.

The main contrast Mattmann highlighted is Community Notes' "globally reviewable and transparent" approach, which lets readers see who flagged a piece of content and join the discussion about why it was flagged.

"The difference is [with independent fact-checking organizations] you can't see the profiles. With Community Notes, people can look at the attribution and say, 'These people edited this,' and they can go to X and look it up. So it's really based on open-source thinking. And I actually think that's what ultimately wins out," he said.

Still, Mattmann said Meta could improve on X's approach by offering users even more transparency.
