Two former Facebook content moderators have joined a lawsuit against the tech giant, alleging they suffered psychological trauma and symptoms of post-traumatic stress disorder caused by reviewing violent images on the social network.
The lawsuit, which seeks class action status, alleges that Facebook violated California law by failing to provide thousands of content moderators with a safe workplace. The moderators were exposed to murders, suicides and beheadings that were live streamed on Facebook, according to the lawsuit.
Former Facebook content moderators Erin Elder and Gabriel Ramos signed onto the amended lawsuit, which was filed Friday in a California superior court. The suit was originally filed in September by Selena Scola, a former Facebook content moderator who worked as a contractor at the tech company from June 2017 to March 2018.
“This case has uncovered a nightmare world that most of us did not know about. The trauma and harm that the plaintiffs, and others who do content moderation work, have suffered is inestimable,” Steve Williams, a lawyer at the Joseph Saveri Law Firm, which is representing the content moderators, said in a statement. “The fact that Facebook does not seem to want to take responsibility, but rather treats these human beings as disposable, should scare all of us.”
Facebook denied the allegations in a November court filing. It has argued that the case should be dismissed.
Scola was an employee of PRO Unlimited, a Florida staffing firm that worked with Facebook to police content. The original suit named PRO Unlimited as a defendant, but the staffing company was dropped from the amended filing.
Elder worked as a Facebook content moderator from March 2017 to December 2017 through PRO Unlimited and Accenture, another staffing firm. She “has experienced nightmares, hypervigilance around children, depression, and [a] pervasive sense of helplessness about her work as [a] content moderator,” the lawsuit states.
Ramos, who was employed by PRO Unlimited, Accenture, Accenture Flex and US Tech Solutions, worked as a Facebook content moderator from June 2017 to April 2018. He also suffered symptoms of PTSD after viewing images and videos of graphic violence, according to the amended lawsuit.
Concerns about the working conditions of Facebook content moderators have escalated recently amid reports of the toll those jobs are taking on workers. The social network, which has 15,000 content reviewers, outsources content moderation work to staffing firms such as Cognizant, Accenture and Genpact.
At the same time, Facebook has been under pressure to prevent hate speech, violence and other offensive content from spreading throughout the social network. It’s defended its use of contract workers for the job and has pledged to improve support for content moderators.
Facebook didn’t immediately respond to a request for comment.
Originally published on www.cnet.com on February 28, 2019.