NAIROBI, Kenya — Nathan Nkunzimana, overcome with emotion, recounted his time watching distressing videos as a content moderator for a Facebook contractor. For eight hours a day he viewed disturbing content, including child abuse and the killing of a woman. The toll was immense; some of his overwhelmed colleagues would scream or cry.
Now, Nkunzimana is one of nearly 200 former employees in Kenya suing Facebook and its local contractor, Sama. The lawsuit has significant implications for social media moderators worldwide and is the first known legal challenge of its kind outside the United States, where Facebook reached a settlement with moderators in 2020.
The moderators were employed at Facebook's outsourced content moderation hub in Nairobi, Kenya, where they screened and removed illegal or harmful content posted by users across Africa, in line with Facebook's community standards and terms of service. Alleging poor working conditions, including inadequate mental health support and low pay, the moderators, who come from several African countries, are seeking a $1.6 billion compensation fund. Laid off by Sama earlier this year despite a court order extending their contracts, they now face uncertainty and financial strain while grappling with the traumatic images they were exposed to.
Facebook and Sama have defended their employment practices amid the legal challenge. The moderators say they are in despair as their funds dwindle and work permits expire, haunted by the distressing content they encountered. Nkunzimana, a 33-year-old father of three from Burundi, likened content moderation to soldiers taking a bullet for Facebook users: moderators were constantly exposed to content depicting violence, suicide, and sexual assault, and worked to ensure its removal.
Initially, Nkunzimana and his fellow moderators took pride in their work, viewing themselves as community heroes. But the disturbing content reopened past traumas, particularly for those who had fled political or ethnic violence in their home countries. The moderators received little support and worked in a culture of secrecy: they were required to sign nondisclosure agreements and forbidden to bring personal items, such as phones, to work.
After finishing his shift, Nkunzimana would return home exhausted and isolate himself in his bedroom, trying to forget what he had seen. Even his wife did not know the nature of his work. Now he shuts himself away to avoid his children's questions about why he is no longer employed and why they may struggle to afford school fees. Moderators earned a monthly salary of $429, with non-Kenyans receiving a small expat allowance on top of that.
Nkunzimana said Sama, the U.S.-based contractor that handled Facebook's moderation in Nairobi, did little to provide post-traumatic counseling, and that the counselors it hired were ill-equipped to deal with what his colleagues were going through. Now, without access to mental health care, he finds solace in attending church.
Meta, Facebook's parent company, has said its contractors are contractually obligated to pay their employees above the industry standard in the markets where they operate and to provide on-site support from trained practitioners. It declined to comment on the Kenya case.
Sama said the salaries it offered in Kenya were four times the local minimum wage, and that a significant percentage of its employees were living below the international poverty line before being hired. It asserted that all employees had unlimited access to one-on-one counseling without fear of repercussions. Sama also called the court decision extending the moderators' contracts confusing and said a subsequent ruling pausing that decision means it has not yet taken effect.
Sarah Roberts, an expert in content moderation at the University of California, Los Angeles, said such work can cause significant psychological harm, but that people in lower-income countries may accept the risks in exchange for a job in the tech industry. She described the outsourcing of sensitive moderation work to countries like Kenya as an exploitative industry that takes advantage of global economic inequality, causing harm while evading responsibility by attributing employment to third-party contractors.
Roberts, an associate professor of information studies, said concerns have been raised about the quality of mental health care provided to moderators and the confidentiality of their therapy. What sets the Kenya case apart, she said, is that the moderators organized and pushed back against their working conditions, bringing unprecedented visibility to the issue. Facebook settled with moderators in the United States, but if similar cases arise in other jurisdictions, companies may not find it so easy to settle.
Facing accusations of allowing hate speech to circulate, Facebook established moderation hubs around the world, including in Kenya, staffed with content moderators fluent in various African languages. Those moderators soon found themselves reviewing graphic content that directly affected their own communities.
Fasica Gebrekidan, for example, had fled the war in her native Tigray region of Ethiopia, only to confront the conflict again at work: as a moderator she had to view "gruesome" videos and other content related to the fighting, including instances of rape. For each video, she had to watch the first and last 50 seconds before deciding whether it should be removed.
Fasica, who initially felt grateful for the job, quickly became disillusioned. The work was torturous for her, and she now finds herself without income or a permanent home. She is unable to pursue her passion for writing and worries that the distressing images will haunt her indefinitely.
Fasica blames Facebook for the lack of adequate mental health support and fair pay, and the local contractor for exploiting her and then letting her go. She believes Facebook should know what is happening and show that it cares about its moderators. The fate of the moderators' complaint now rests with the Kenyan court, with the next hearing scheduled for July 10. Fasica is frustrated by the uncertainty; some moderators have already given up and returned to their home countries, but that is not yet an option for her.