Oversight Board expresses concern over Meta's removal of fact-checking

  • The independent board said Meta made the changes to how it handles offensive and potentially harmful posts “hastily,” and that the company must mitigate the human rights risks they pose.
  • In January, Mark Zuckerberg announced that Meta would introduce “Community Notes,” a crowdsourced feature similar to X’s, to monitor the accuracy of posts on its platforms in the US.

SAN FRANCISCO: Meta's independent Oversight Board expressed concern Tuesday that the company's recent decision to drop fact-checking on its Facebook platform could threaten human rights.
Meta's surprise announcement in January that it would end its fact-checking program in the US drew sharp criticism from disinformation researchers, who warned it could open the floodgates to false reports.
Now the board, which acts as the final court for Meta content-moderation disputes, said in a statement released Tuesday that the social media giant's announcement of policy changes and measures to handle offensive and potentially harmful posts was “rushed.”
“People have the right to express conflicting opinions,” said board co-chair Helle Thorning-Schmidt.
“People must also be protected from harm.”
Meta CEO Mark Zuckerberg announced the news in a dramatic policy shift that analysts saw as an attempt to appease then-President-elect Donald Trump, who has equated fact-checking with censorship.
As Meta rolls out moderation changes globally, the board believes it is critical for the tech giant to minimize the human rights risks that could arise from reduced or no fact-checking.
The Oversight Board made 17 recommendations, including one that Meta evaluate the effectiveness of community notes versus third-party fact-checking, “especially in situations where the rapid spread of false information creates risks to public safety.”
Meta had hired third-party fact-checkers, including AFP, to debunk misinformation spread on its platforms.
Zuckerberg said Meta's Facebook and Instagram platforms will instead use X-style “Community Notes” in the United States to monitor the accuracy of posts.
Community Notes is a crowdsourced moderation tool that X (formerly Twitter) has promoted as a way for users to add context to posts, but researchers have repeatedly questioned its effectiveness in combating falsehoods.
“You wouldn't rely on just anyone to stop your toilet from leaking, but Meta is now looking to rely on just anyone to stop the spread of misinformation on its platforms,” Michael Wagner of the University of Wisconsin-Madison's School of Journalism and Mass Communication told AFP when Meta announced the change.
“Asking people to volunteer to police false claims posted on Meta's multi-billion dollar social media platforms is an abdication of social responsibility.”
While Meta has promised to abide by the board's rulings on appeals of its decisions to remove or keep posts, the tech company is not obligated to follow its policy recommendations.