Groups express concern over Meta’s change in fact-checking policy

Started by bosman, 2025-01-10 10:48



Groups express concern over Meta's change in fact-checking policy, warn of dangers of misinformation in Nigeria.

Advocacy groups led by the National Online Safety Coalition's #FWDwithFacts campaign have raised alarm over Meta's recent decision to end partnerships with third-party fact-checkers.
The move, they say, could exacerbate the spread of misinformation and hate speech, especially in African countries like Nigeria.
Speaking on behalf of the coalition on Friday in Abuja, Ms. Shirley Ewang highlighted the serious implications of Meta's decision.
"We note that social media platforms  such as Facebook and WhatsApp, with tens of millions of Nigerian users, remain  at the heart of the country's information ecosystem and can be  weaponized without adequate fact-checking and content moderation," she  said.
Ms. Ewang emphasized that fact-checking should not be seen as censorship but as an essential safeguard to protect society from harm. She urged Meta to immediately reinstate its fact-checking programs.
"The stakes are too high to allow misinformation to  spread unchecked," she  said.

Background
Meta recently announced a shift from third-party fact-checking to a "Community Notes" system, which allows users to collaboratively add context to potentially misleading posts on platforms like Facebook, Instagram, and Threads.
Meta CEO Mark Zuckerberg said the change is intended to reduce excessive censorship, arguing that the previous reliance on independent fact-checkers led to the removal of harmless content and hindered free speech.

"Meta's Director of Global  Affairs, Joel Kaplan, highlighted the success  of Community  Notes on a platform like X, formerly  Twitter. "We've seen this approach work  at X, where they  let their community decide when posts are potentially misleading and need more context, and  where people  from a  wide range of perspectives decide what  kind of context is helpful  to other  users."
"We think this could be a better way  to achieve our original  goal, which is to provide people with information about what they're  seeing, and  a way that's less  subject to bias," he  said.
However, the policy change has sparked significant controversy, with critics expressing concerns that removing professional fact-checkers could allow the uncontrolled spread of harmful content.
The impact of disinformation in Nigeria
  • The coalition cited examples of past disinformation that has caused real-world harm, including the ethnic conflict in Plateau State in 2018 and the falsehoods that sowed discord during Nigeria's 2023 elections.

  • Ewang stressed that the absence of fact-checking initiatives could leave a dangerous vacuum, leading to the spread of falsehoods, social division, and potential loss of life.

  • The coalition called on African governments to demand greater transparency from technology platforms regarding their strategies for countering disinformation.

  • It also called for laws to hold technology companies accountable for harmful content and for partnerships with civil society to promote media literacy. "African governments must act now to protect their citizens and preserve the integrity of democracy," she added.

