Facebook Moderator Alleges Toxic Content Gave Her PTSD

October 3, 2018

A woman who worked for several months as a Facebook content moderator has sued, alleging that “constant and unmitigated exposure to highly toxic and extremely disturbing images” gave her post-traumatic stress disorder (PTSD). The putative class action, filed in California state court, names as defendants both Facebook and Pro Unlimited, Inc., a company that contracts with Facebook to provide content moderation services and is the plaintiff’s nominal employer. Facebook has thousands of content moderators worldwide who review millions of potentially offending posts daily.

To bolster its case, the complaint quotes a former Facebook moderator’s account to a Guardian reporter: “You go into work at 9 a.m. every morning, turn on your computer, and watch someone have their head cut off.” According to a Facebook spokesperson, the company is aware of the problem, takes moderator wellness seriously, provides extensive support, and requires the companies it contracts with to do likewise. Facebook’s director of global training adds that the company explains to new hires what the job entails and exposes them to graphic content only gradually. The complaint acknowledges that Facebook helped craft industry standards for content moderators, but it maintains the company has failed to implement those standards.
