Facebook risks being suspended in Kenya if it does not adhere to policies to prevent the spread of hate speech, a government agency has announced.
The National Cohesion and Integration Commission (NCIC) says it has written to Meta, the company that owns Facebook, demanding a response to allegations of weak controls in moderating content on its platform ahead of next month's elections.
The NCIC was responding to the findings of a report by the advocacy groups Global Witness and Foxglove, which indicated that Facebook had failed to moderate content on its platform because of weak controls.
"If Facebook doesn’t comply with requirements we have set out within seven days, we will recommend they suspend their operations. We will not allow Facebook to jeopardise our national security”, NCIC's Dr David Makori said.
The commission, which was established in the wake of the violence that followed the 2007 election, does not have powers to suspend Facebook but can only make recommendations to the authorities if the firm fails to comply with guidelines as outlined in the law.
The Global Witness and Foxglove report says that the social media giant failed to detect advertisements with inflammatory content on its platform published in English and Swahili.
During the investigation, researchers submitted 20 adverts in English and Swahili containing hateful language that had originally been used around the 2007 elections.
The report says all but one of the adverts submitted were approved; the sole rejection, an advert in English, was turned down for not complying with Facebook's guidelines on hate speech.
Global Witness says the adverts were never actually published on Facebook, but it was concerned that they had passed the approval process without detection.
Facebook has not responded to the findings of the report.
In July, the technology firm was reported to have taken down 37,000 accounts for promoting hate speech and 42,000 for violating its violence and incitement policies in the run-up to the August election.
The social media platform also said it had rejected 36,000 political adverts for not complying with its transparency rules.
Facebook's Director of Public Policy, East and Horn of Africa, Mercy Ndegwa, said enhanced controls had been put in place on the platform, making it easier to identify and remove content that could lead to election-related violence.