Two former Facebook content moderators have joined a lawsuit against the tech giant, alleging they suffered psychological trauma and symptoms of post-traumatic stress disorder caused by reviewing violent images on the social network.
The lawsuit, which seeks class action status, alleges that Facebook violated California law by failing to provide thousands of content moderators with a safe workplace. The moderators were exposed to murders, suicides and beheadings that were live streamed on Facebook, according to the lawsuit.
Former Facebook content moderators Erin Elder and Gabriel Ramos signed onto the amended lawsuit, which was filed Friday in a California superior court. The suit was originally filed in September by Selena Scola, a former Facebook content moderator who worked as a contractor at the tech company from June 2017 to March 2018.
“This case has uncovered a nightmare world that most of us did not know about. The trauma and harm that the plaintiffs, and others who do content moderation work, have suffered is inestimable,” Steve Williams, a lawyer at the Joseph Saveri Law Firm, which is representing the content moderators, said in a statement. “The fact that Facebook does not seem to want to take responsibility, but rather treats these human beings as disposable, should scare all of us.”
Facebook denied the allegations in a November court filing. It has argued that the case should be dismissed.
Scola was an employee of PRO Unlimited, a Florida staffing firm that worked with Facebook to police content. The original suit named PRO Unlimited as a defendant, but the staffing company was dropped from the amended filing.
Elder worked as a Facebook content moderator from March 2017 to December 2017 through PRO Unlimited and Accenture, another staffing firm. She “has experienced nightmares, hypervigilance around children, depression, and [a] pervasive sense of helplessness about her work as [a] content moderator,” the lawsuit states.
Ramos, who was employed by PRO Unlimited, Accenture, Accenture Flex and US Tech Solutions, worked as a Facebook content moderator from June 2017 to April 2018. He also suffered symptoms of PTSD after viewing images and videos of graphic violence, according to the amended lawsuit.
Concerns about the working conditions of Facebook content moderators have escalated recently amid reports of the toll those jobs are taking on workers. The social network, which has 15,000 content reviewers, outsources content moderation work to staffing firms such as Cognizant, Accenture and Genpact.
At the same time, Facebook has been under pressure to prevent hate speech, violence and other offensive content from spreading throughout the social network. It’s defended its use of contract workers for the job and has pledged to improve support for content moderators.
Facebook didn’t immediately respond to a request for comment.
Originally published on www.cnet.com on February 28, 2019.