Facebook content moderators in Kenya are describing their work as “torture,” as they are exposed to disturbing and graphic content while on the job. One employee, Nathan Nkunzimana, shared his experience of having to view videos depicting child abuse and murder.
These moderators, who work for a Facebook contractor, spend eight hours a day filtering and removing harmful content to protect users from such graphic material. The challenging and emotionally taxing nature of the work often leaves them overwhelmed, with some going so far as to scream or cry.
This disturbing revelation has led to a lawsuit filed by the moderators, which could potentially have widespread implications beyond Kenya.
A group of approximately 200 former Facebook content moderators in Kenya, including Nkunzimana, has filed a lawsuit against Facebook and local contractor Sama. The lawsuit centers on the working conditions faced by content moderators and could have significant consequences for social media moderators globally.
This legal action marks the first known court challenge outside of the United States, where Facebook previously reached a settlement with moderators in 2020. The outcome of this case could pave the way for improved working conditions for content moderators worldwide.
Former employees from Facebook’s outsourced content moderation hub in Nairobi, Kenya, are demanding a compensation fund of $1.6 billion. These content moderators from various African countries claim that they had to endure substandard working conditions, including inadequate mental health support and low wages.
Recently, they were terminated by Sama, the local contractor handling their employment, despite a court order stating that their contracts should be extended until the resolution of the case. The moderators allege that both Facebook and Sama have failed to comply with the court’s directive.
Both Facebook and Sama have defended their employment practices in response to the allegations made by the content moderators. They maintain that they have provided appropriate working conditions and support for their moderators, ensuring compliance with local labor laws.
The companies argue that they have taken steps to address the mental well-being of their content moderators, including providing access to counseling resources. Facebook and Sama are likely to present their arguments and evidence in court as the legal proceedings continue to unfold.
The content moderators involved in the lawsuit face uncertainty regarding the duration of the case and express their despair over dwindling financial resources and expiring work permits.
As they continue to grapple with the traumatic and disturbing images they were exposed to during their employment, the lack of resolution in their legal battle adds to their distress. The mental toll of their work lingers, compounding their already challenging circumstances.
Nkunzimana, a father of three from Burundi, emphasized the important role of content moderators like himself in ensuring a safe browsing experience for Facebook users.
He pointed out that behind the scenes, moderators are diligently reviewing and assessing the content to determine its appropriateness for the platform. Their responsibility is to scrutinize whether the material adheres to Facebook’s guidelines, protecting users from harmful and inappropriate content.
Nkunzimana’s statement highlights the critical yet often overlooked contribution of content moderators in maintaining the integrity and safety of the platform.
Nkunzimana compares the role of content moderators to that of soldiers, willing to take a bullet for the sake of Facebook users.
These moderators are exposed to disturbing and harmful content, such as videos depicting killings, suicides, and sexual assault, as they diligently work to ensure that such content is swiftly removed from the platform.
Their difficult task entails enduring the emotional toll of witnessing these distressing materials and making crucial judgments to safeguard the well-being of Facebook’s user community.
Nkunzimana’s analogy underscores the sacrifice and bravery content moderators show in their commitment to upholding the platform’s standards and protecting its users.
Initially, Nkunzimana and his fellow moderators took pride in their work, considering themselves “heroes to the community.”
However, the repeated exposure to distressing content had a profound impact, particularly for those who had fled political or ethnic violence in their home countries. Past traumas were reignited, and the moderators found themselves lacking support and functioning within a culture of secrecy.
To maintain confidentiality, moderators were required to sign nondisclosure agreements, while personal belongings like phones were prohibited in the workplace.
After enduring their shifts, Nkunzimana would return home physically and emotionally drained, isolating himself in his bedroom in an attempt to forget the disturbing images he had encountered.
Even his wife remained unaware of the true nature of his job, highlighting the challenges these moderators faced in finding understanding and support outside of their workplace.
Nkunzimana now shuts himself in his room, avoiding his sons’ questions about why he is no longer working and why the family is struggling to afford school fees.
As a content moderator, he earned a monthly salary of $429, with non-Kenyans receiving a small expat allowance in addition.
According to Nkunzimana, Sama, the U.S.-based Facebook contractor, did little to provide adequate post-trauma counseling for moderators at its Nairobi office.
He revealed that counselors were poorly trained and ill-equipped to handle the emotional and psychological toll experienced by his colleagues.
With a lack of mental health care support, Nkunzimana has turned to his faith and actively engages in church activities for emotional solace and support.
Meta, the parent company of Facebook, has stated that its contractors are contractually obligated to pay their employees above the industry standard in the respective markets they operate in.
Additionally, Meta emphasizes the importance of on-site support provided by trained practitioners to assist content moderators in coping with the challenges they face. These measures are intended to ensure the well-being and support of content moderators in their critical role in maintaining a safe online environment.
Meta has declined to provide a comment specifically on the ongoing case in Kenya involving the content moderators. As a result, the company has refrained from offering any direct statement or response regarding the allegations made by the moderators and the lawsuit against Facebook and its contractor.
Sama, in an email response to the Associated Press, stated that the salaries it offered in Kenya were four times higher than the local minimum wage. The company further noted that prior to their employment, a significant number of employees were living below the international poverty line (less than $1.90 a day). Sama claimed that it provided all employees with unlimited access to one-on-one counseling sessions, ensuring that they could seek support without any fear of reprisals.
Regarding the recent court decision to extend the moderators’ contracts, Sama referred to it as “confusing” and contended that a subsequent ruling has halted its implementation.
Sarah Roberts, an expert in content moderation at the University of California, Los Angeles, acknowledged that such work has the potential to be highly psychologically damaging.
However, she suggested that job-seekers in lower-income countries might be willing to undertake the risk in exchange for a job within the tech industry.
According to experts like Roberts, the outsourcing of sensitive content moderation work to countries such as Kenya reflects a broader pattern of an exploitative industry capitalizing on global economic inequalities.
This industry benefits from utilizing cheap labor while evading responsibility for the harmful effects it can have on workers.
By outsourcing the work to third-party contractors, the firms involved can distance themselves from direct employment and the associated liabilities. This practice raises ethical concerns, as it perpetuates the cycle of economic inequity and places the burden on vulnerable workers without adequate support or recourse.
Roberts, an associate professor of information studies, also expressed concerns about the quality of mental health care provided to content moderators and the confidentiality of therapy.
She highlighted that in cases like the one in Kenya, where moderators are organizing and pushing back against their working conditions, there is unusual visibility and potential for significant impact.
Unlike in the United States, where it is common for companies to settle such cases, this may not be as easy in other jurisdictions.
Facebook established moderation hubs worldwide, including in Kenya, in response to accusations of allowing hate speech to circulate in countries like Ethiopia and Myanmar, where conflicts resulted in significant loss of life.
Content moderators hired by Sama in Kenya were sought for their fluency in various African languages, but the graphic and disturbing content they had to review often hit close to home. That was particularly true for Fasica Gebrekidan, who worked as a moderator while the conflict in Ethiopia’s Tigray region, her homeland, was still unfolding.
Gebrekidan had to watch and evaluate graphic videos related to the war, including instances of rape, further adding to the emotional toll she experienced personally and professionally.
Fasica Gebrekidan, who had initially felt grateful for the job opportunity, soon lost that sentiment. Instead, she described her experience as a form of torture.
Having fled the war in her native Tigray region in Ethiopia, she found herself confronted with the very content related to the conflict that she had been trying to escape. The emotional toll of having to watch graphic videos and disturbing content related to the war was overwhelming for her and her fellow moderators.
The initial gratitude turned into a sense of despair and torment as they had to confront the horrors they had experienced firsthand.
Fasica Gebrekidan now finds herself without a source of income or a stable place to call home. The emotional toll of her content moderation job has left her feeling unable to function normally. As a former journalist, she used to find solace in writing, but she can no longer bring herself to write, even as a way of expressing her emotions.
Fasica worries that the disturbing content she had to view during her time as a content moderator will continue to haunt her indefinitely.
During her conversation with the Associated Press, she struggled to turn her attention away from a painting across the café depicting a distressed figure rendered in vivid red. The image deeply unsettled her.
She blames Facebook for the low pay and the lack of proper mental health care, and accuses the local contractor of exploiting her and then letting her go.
Fasica holds both Facebook and the contractor responsible for failing to address the well-being of content moderators like herself.
Fasica expressed her belief that Facebook should be aware of the challenges and difficulties faced by content moderators and should demonstrate genuine concern for their well-being. In her view, the company should take responsibility for the treatment of its moderators.
The outcome of the moderators’ complaint rests in the hands of the Kenyan court, with the next hearing scheduled for July 10. The prolonged uncertainty surrounding the case is frustrating for Fasica and her colleagues. While some moderators have chosen to give up and return to their home countries, Fasica is not yet able to pursue that option for herself.