The Impact of Content Moderation on Facebook Contractors in Kenya

Content moderation is essential to maintaining a safe and healthy online environment, but the toll it takes on the people tasked with the work cannot be overlooked. Facebook content moderators in Kenya have come forward to describe the challenges they face and the impact the job has had on their well-being. These moderators, employed by a local contractor called Sama, are now seeking redress through a lawsuit against Facebook and Sama.

Content moderators play a crucial role in ensuring that social media platforms like Facebook adhere to their community standards and terms of service. They review and screen posts, videos, messages, and other content to identify and remove any illegal or harmful material. This includes graphic content depicting violence, sexual assault, and other disturbing acts. Their work is vital in protecting users from exposure to harmful content and maintaining the integrity of the platform.

While content moderation is necessary, it comes with significant challenges and risks. Facebook content moderators in Kenya have described their work as “torture,” highlighting the emotional and psychological toll it takes on them. These moderators spend hours each day watching videos and reviewing posts that showcase graphic and disturbing content. They witness acts of violence, abuse, and even murder, all in an effort to ensure the safety of Facebook users.

Nathan Nkunzimana, a former content moderator, recalled watching videos of child molestation and murder, often leaving him on the verge of tears. The exposure to such horrific content can lead to severe emotional distress and trauma. Many moderators have reported feeling overwhelmed, with some even experiencing episodes of screaming or crying as a result.

More than 200 former Facebook content moderators in Kenya have filed a lawsuit against Facebook and Sama, its local contractor. The moderators are seeking a $1.6 billion compensation fund, alleging poor working conditions, a lack of mental health care, and inadequate wages. They contend that Facebook and Sama failed to provide the support and treatment their employees needed, at serious cost to the moderators' mental health.

The moderators allege that they were subjected to harsh working conditions, including long hours and exposure to disturbing content without proper psychological support. They claim that Sama ignored a court order to extend their contracts until the case is resolved, leaving them jobless and without any income. The moderators insist that these companies have neglected their duty of care towards their employees, resulting in significant harm.

The nature of content moderation can have severe consequences for the mental health of those involved. Many of the moderators in Kenya have experienced a resurgence of past traumas, especially those who fled political or ethnic violence in their home countries. Constant exposure to violent and distressing content can trigger anxiety, depression, and post-traumatic stress disorder (PTSD).

Nkunzimana, a father of three from Burundi, likens content moderation to soldiers taking a bullet for Facebook users. He and his colleagues watch harmful content depicting killing, suicide, and sexual assault, ensuring its removal from the platform. Initially, Nkunzimana and others took pride in their work, feeling like heroes protecting the community. However, the lack of support and the culture of secrecy surrounding their job took a toll on their mental well-being.

One of the key grievances raised by the moderators is the lack of proper mental health support. They say Sama did not prioritize professional trauma counseling for employees at its Nairobi office, and that the counselors it did provide were ill-equipped to handle the emotional toll of the work. In the absence of adequate care, some moderators have turned to religious practice to cope with the distressing nature of their job.

Additionally, the moderators argue that they were not adequately compensated for the challenges they faced. The salary for content moderators was as low as $429 per month, with non-Kenyans receiving a small expat allowance on top of that. Sama defends its employment practices, stating that the salaries it offered in Kenya were four times the local minimum wage. However, the moderators assert that this is still insufficient given the nature of their work and the toll it takes on their mental well-being.

The lawsuit brought by Facebook content moderators in Kenya could reverberate well beyond the country. It is the first known court challenge of its kind outside the United States, where Facebook settled with moderators in 2020. The outcome could set a precedent for content moderators globally, underscoring the need for better working conditions, mental health support, and fair compensation.

Sarah Roberts, an expert in content moderation at the University of California, Los Angeles, emphasizes the potential psychological damage associated with such work. She acknowledges that individuals in lower-income countries may be willing to take on the risks in exchange for an office job in the tech industry. However, this outsourcing of sensitive work to countries like Kenya raises concerns about exploitation and the lack of responsibility taken by the companies involved.

The plight of Facebook content moderators in Kenya brings attention to the urgent need for change in the industry. Companies like Facebook must prioritize the well-being of their content moderators and ensure they receive adequate support, counseling, and compensation. The mental health impact of content moderation cannot be underestimated, and it is crucial to provide resources and care for those exposed to distressing content on a daily basis.

Furthermore, the outsourcing of content moderation to lower-income countries requires greater scrutiny and adherence to ethical standards. Companies should not exploit global economic inequities but instead take responsibility for the well-being of their employees, regardless of their location. The visibility and organization of the moderators in Kenya have created an opportunity to address these issues and push for change in the content moderation industry.

The lawsuit filed by Facebook content moderators in Kenya sheds light on the challenges and risks associated with content moderation. The emotional toll it takes on these individuals cannot be ignored, and it is crucial for companies like Facebook to prioritize the well-being of their moderators. Adequate mental health support, fair compensation, and better working conditions are essential to protect the mental well-being of those tasked with maintaining a safe online environment. The outcome of this lawsuit may have far-reaching implications for content moderators worldwide, paving the way for a more compassionate and responsible industry.

First reported by AP News.
