More Facebook Content Moderators Join Legal Battle Claiming Harm to Mental Health


Nineteen Facebook content moderators have filed a class-action complaint against Facebook and Cognizant Technology Solutions, a contractor hired by Facebook, alleging personal injury from the content they are required to view at work and from the lack of resources available to help them cope with the resulting trauma.

The complaint says they are suing to protect themselves and other moderators from the danger of psychological trauma caused by the defendants’ failure to provide a safe workplace. “Every day, Facebook users post millions of videos, images, and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, racist violence and murder,” the complaint states. “From their cubicles during the overnight shift in Cognizant’s Tampa and Phoenix offices, Plaintiffs witnessed thousands of acts of extreme and graphic violence.”

The lawsuit was filed on March 12 in a circuit court in Hillsborough County, Florida, and will be heard by Judge Mary S. Scriven. The Tampa Cognizant facility has allegedly been referred to in the press as a “sweatshop” and is considered the worst-performing site in North America, averaging a 92 percent accuracy rate against a 98 percent target. The plaintiffs also claim they face “relentless pressure” to better enforce Facebook’s community standards, which change frequently.

Other content moderators, including a man in Ireland and employees in California, have previously filed lawsuits against Facebook, Cognizant, or other contracted companies. The issue has also drawn media attention, with The Wall Street Journal calling Facebook content moderation the “worst job in technology” and The Verge publishing an in-depth report. According to the complaint, a Facebook moderator told The Guardian, “You’d go into work … turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off.”

The complaint says Facebook helped draft workplace safety standards intended to protect content moderators from trauma, including psychological screening, counseling, and altering the resolution and audio of videos to make them less graphic. The standards also call for moderators to work in pairs and to take additional breaks. However, the employees claim the defendants have not implemented any of these safety standards.

The plaintiffs in the case also argue they are often the first to see and report emergency situations and should be considered first responders. “Plaintiffs and the other content moderators, at a minimum, deserve the same protections as other first responders, which includes workers’ compensation/health coverage for the PTSD caused by the working conditions,” the complaint states.

The content moderators are asking the defendants to provide tools and mental health support, to establish a medical monitoring fund for testing and treatment, and to compensate those who have lost wages or incurred medical expenses because of their employment.