New York CNN —
Meta is overhauling Facebook and Instagram’s content moderation policies, eliminating fact checkers and replacing them with user-generated “Community Notes,” similar to Elon Musk’s X, CEO Mark Zuckerberg announced on Tuesday. The move is part of a series of sweeping changes that will dramatically alter how posts, videos, and other content are handled online.
The changes come just before President-elect Donald Trump takes office. Trump and other Republicans have accused Zuckerberg and Meta of what they view as censorship of right-wing voices.
“Fact checkers are so politically biased that they’re destroying more trust than they’ve built,” Zuckerberg said in a video announcing the new policy on Tuesday. “What started as a movement to be more inclusive has gone too far, increasingly being used to shut down voices and shut out people who think differently.”
But Zuckerberg acknowledged the “trade-offs” of the new policy, noting that more harmful content would appear on the platform as a result of the content moderation changes.
Joel Kaplan, Meta’s newly appointed head of global affairs, told Fox on Tuesday that Meta’s partnership with third-party fact checkers was “well-intentioned” at the outset, but that there was “too much political bias” in how it was carried out.
The announcement comes amid a broader ideological shift to the right within Meta’s leadership, as Zuckerberg seeks to improve relations with Trump before the president-elect takes office later this month. Just a day earlier, Meta announced that Trump ally and UFC CEO Dana White would join the company’s board along with two other new directors. Meta also plans to donate $1 million to Trump’s inaugural fund, and Zuckerberg has said he wants to play an “active role” in technology policy discussions.
Kaplan, a prominent Republican who was elevated to the company’s top policy role last week, acknowledged that Tuesday’s announcement was directly related to the change of administration.
“There’s no question that things have changed over the past four years,” he said. “We’ve seen a lot of social and political pressure toward more speech and less censorship, and we have a real opportunity now. We have a new administration and a new president who is a strong champion of free expression, and that brings change.”
The Real Facebook Oversight Board, an external accountability group whose name is a play on the company’s official Oversight Board and whose members include academics, lawyers, and civil rights advocates such as early Facebook investor Roger McNamee, said the policy changes represent Meta going “full MAGA.”
“Meta’s announcement today is a retreat from a healthy and safe approach to content moderation,” the group said in a statement, calling the changes “political pandering.”
The moderation changes mark a stunning reversal in the way Meta handles false and misleading claims on its platform.
The company launched an independent fact-checking program in 2016 after allegations that it failed to prevent foreign actors from using its platform to spread disinformation and sow discord among Americans. In the years since, it has continued to grapple with the spread of controversial content on its platform, including election misinformation, anti-vaccine narratives, violence, and hate speech.
The company built up safety teams, introduced automated programs to filter out or reduce the visibility of false claims, and created a kind of independent supreme court for thorny moderation decisions, known as the Oversight Board.
But now Zuckerberg is following in the footsteps of fellow social media leader Musk. After Musk acquired X, then known as Twitter, in 2022, he dismantled the company’s fact-checking teams and made user-generated context labels, called Community Notes, the platform’s only vehicle for correcting false claims.
Meta said it is ending its partnership with third-party fact checkers and will roll out a similar Community Notes feature across its platforms, including Facebook, Instagram, and Threads.
“I think Elon played a very important role in moving the discussion and getting people to focus again on freedom of expression. It was really constructive and productive,” Kaplan said.
The company also plans to adjust the automated systems that scan for policy violations, which it says have resulted in “too much content being censored that shouldn’t have been.” Those systems will now focus only on illegal and “high severity” violations such as terrorism, child sexual exploitation, drugs, fraud, and scams. Other concerns will have to be reported by users before the company evaluates them.
Zuckerberg said Tuesday that Meta’s complex systems for moderating content have mistakenly resulted in the mass removal of non-violating content from its platforms. If the systems get something wrong even 1% of the time, for example, that could affect millions of the company’s more than 2 billion users.
“We’ve reached a point where there are too many mistakes and too much censorship,” Zuckerberg said.
However, Zuckerberg acknowledged that the new policy could create new challenges for content moderation.
“The reality is that this is a trade-off,” he said in the video. “It means we’re going to catch less bad stuff, but we’ll also reduce the number of innocent people’s posts and accounts that we accidentally take down.”
The company is also lifting content restrictions on certain topics, such as immigration and gender identity, and removing limits on the amount of political content users can see in their feeds.
As part of the changes, Meta will relocate its trust and safety team responsible for content policy from California to Texas and other U.S. locations. “I think it helps build trust to do this work in a place where there’s less concern about bias on the team,” Zuckerberg said.
This is a developing story and will be updated.