In June, Global Witness and Foxglove found that Meta continued to approve Amharic ads aimed at Ethiopian users that included hate speech and calls to violence. Facebook has been implicated in spreading hate speech and fueling ethnic violence in Ethiopia’s ongoing conflict.
Crider argues that Facebook needs to invest more in its moderation practices and protections for democracy. He worries that even the threat of a ban will allow the company to deflect responsibility for problems it has left unresolved.
“I think ultimately the moment any regulator looks at Facebook and it looks like it’s going to make them do something that might cost them some money, they start howling about censorship and present a false choice between an essentially unmoderated and unregulated Facebook or no Facebook at all,” he says.
And Crider says there are things the company can do, including “glass-breaking” measures like deprioritizing its heavily promoted live videos, limiting the reach of inflammatory content, and banning election-related ads in the run-up to the vote.
Mercy Ndegwa, Director of Public Policy East and Horn of Africa at Meta, told WIRED that the company “has taken extensive steps to help us crack down on hate speech and inflammatory content in Kenya, and we are intensifying these efforts before the election.” She acknowledged, however, that “despite these efforts, we know there will be examples of things we miss or delete by mistake, because both machines and people make mistakes.” Meta did not respond to specific questions about how many content moderators it has who speak Swahili or other Kenyan languages, or about the nature of its discussions with the Kenyan government.
“What the researchers did was stress test Facebook’s systems, and they showed that what the company was saying was bullshit,” says Madung. The fact that Meta allowed the ads onto the platform despite a review process “raises questions about its ability to handle other forms of hate speech,” Madung says, including the vast amount of user-generated content that does not require prior approval.
But banning Meta platforms, Madung says, won’t get rid of misinformation or ethnic tensions, because it doesn’t address the root cause. “This is not a mutually exclusive question,” he says. “We need to find a middle ground between heavy-handed approaches and real platform accountability.”
On Saturday, Joseph Mucheru, Cabinet Secretary for Internet and Communications Technologies (ICT), tweeted, “The media, including social media, will continue to enjoy FREEDOM OF THE PRESS in Kenya. It is unclear what legal framework NCIC plans to use to suspend Facebook. The government is on record. We are NOT shutting down the Internet.” Bridget Andere, an Africa policy analyst at the digital rights nonprofit Access Now, agrees that there is currently no legal framework that would allow the NCIC to order Facebook’s suspension.
“Platforms like Meta have completely failed in dealing with misinformation, disinformation, and hate speech in Tigray and Myanmar,” Andere said. “The danger is that governments will use this as an excuse to shut down the internet and block apps, when instead they should push companies to invest more in human content moderation, in a way that is ethical and respects human rights.”
Madung similarly worries that regardless of whether the government decides to suspend Facebook and Instagram now, the damage is already done. “The effects will be seen at a different time,” he says. “The point is that there is now an official precedent, and it can be referred to at any time.”