Meta’s Shift to Community Notes Raises Concerns Over Misinformation and Job Losses in Africa
by News Ghana

Meta’s decision to replace its fact-checking program with community-driven notes on its platforms—Instagram, Facebook, and Threads—has sparked significant concern among content moderation companies, fact-checkers, and civil rights activists in Africa.
Announced on January 7, 2025, the move has been described as a potential setback for the fight against misinformation in the region, particularly as it could lead to widespread job losses in countries like Kenya, Nigeria, Egypt, and South Africa, where many workers are employed by content moderation firms.
Meta’s new approach will empower users to flag potentially misleading content and add context to posts, replacing the third-party fact-checkers it has relied on since 2016. This decision follows claims that the previous fact-checking system had been used to “censor” content, a shift that could disrupt established efforts to combat the spread of disinformation.
Critics, including Emmanuel Chenze, the COO of African Uncensored, warn that the absence of a strong fact-checking framework could lead to significant consequences in Africa. “We’ve seen the mess caused by the lack of fact-checking initiatives during the 2017 election period,” said Chenze, pointing to the disinformation campaigns and the influence of entities like Cambridge Analytica. “Now, we risk returning to that situation without a proper framework to address misinformation.”
The decision is also expected to impact organizations like PesaCheck, which has long depended on funding from Meta. In 2023, Meta contributed 43% of PesaCheck’s funding, and the loss of this financial support could hinder its ability to tackle harmful content and safeguard public discourse.
“I fear the implications of this shift, especially when it comes to job losses for content moderators and the reduced capacity of organisations to do their critical work,” Chenze added. “It’s a grim outlook for a region already battling the spread of misinformation.”
While some view Meta’s new community notes system as a way to encourage user engagement and transparency in content moderation, critics argue that in regions where misinformation is often politically charged, this approach could allow false narratives to go unchecked. In politically sensitive countries, the system might be manipulated by social or political interests, allowing misleading content to spread.
This move also signals a shift in Meta’s relationship with third-party fact-checking organizations like Africa Check and PesaCheck, which rely heavily on the tech giant’s funding. Without this financial backing, these organizations may struggle to address harmful content, further exacerbating the misinformation crisis in Africa.
Content moderation companies in Africa, particularly in countries like Kenya, face an uncertain future. Firms such as Sama and Majorel, which had worked with Meta on content moderation, have already pulled out of the business. Sama, which previously flagged harmful content, now focuses on AI data labeling, leaving many moderators without work.
The wider implications of this shift may also affect Meta’s standing with international regulatory bodies. The European Union’s Digital Services Act, which mandates that platforms like Meta address illegal content or face fines, could impose additional scrutiny on the company’s new approach to content moderation. As Meta plans to roll out community notes in the U.S., the European Commission has indicated it will be closely monitoring the situation.
As Meta moves forward with its community-driven content moderation model, the concerns over misinformation, job security, and the future of fact-checking in Africa are unlikely to dissipate. Many are questioning whether this approach can truly replace the existing safeguards against harmful content.