Content moderators working for Meta, the parent company of Facebook, have accused a contractor of disregarding threats made against them by Ethiopian rebels, according to court documents filed on December 4. The accusations come as part of a legal case involving the dismissal of dozens of moderators in Kenya.
The moderators, previously employed by Sama, a Kenya-based firm contracted to moderate Facebook content, claim they were targeted by the Oromo Liberation Army (OLA) rebel group for removing graphic content linked to the group. Despite receiving direct threats, the moderators allege their concerns were dismissed by Sama, which initially accused them of fabricating the threats before eventually investigating the claims. One moderator was reportedly relocated to a safehouse after being publicly identified by the OLA.
The moderators are among 185 individuals who sued Meta and its contractors last year. The group alleged that they lost their jobs with Sama for attempting to form a union and were subsequently blacklisted from applying for similar roles with Meta's new contractor, Majorel.
Abdikadir Alio Guyo, one of the affected moderators, said in his affidavit that the OLA had threatened the moderators directly. “They told us to stop removing their content from Facebook or else we would face dire consequences,” Guyo said. Another moderator, Hamza Diba Tubi, described receiving a message listing moderators’ names and addresses, and said he now lives in fear.
Sama declined to comment on the allegations, while Meta and the OLA did not respond to requests for comment.
The court documents also revealed that Meta ignored recommendations from experts it had hired to address hate speech in Ethiopia. Supervisors overseeing the moderators expressed frustration at having to review inflammatory content that did not violate Meta’s policies, describing an “endless loop of hateful content.”
The accusations against Meta extend beyond threats to moderators. In a separate case filed in 2022, Meta was accused of allowing violent and hateful content to proliferate on Facebook, exacerbating Ethiopia's civil war between federal forces and Tigrayan regional authorities.
The ongoing legal battle could set a precedent for how Meta works with content moderators worldwide. Out-of-court settlement talks collapsed in October, and the case has brought renewed attention to the risks faced by individuals tasked with moderating graphic and inflammatory content.
The OLA, which has been accused of killing civilians following failed peace talks in 2023, remains a contentious force in Ethiopia's Oromiya region, where the group claims to represent the grievances of the Oromo community.
As the case unfolds, Meta faces increasing scrutiny over its approach to content moderation and the safety of its contractors in conflict zones.