Two former Facebook content moderators have joined a lawsuit against the tech giant, alleging they suffered psychological trauma and symptoms of post-traumatic stress disorder caused by reviewing violent images on the social network.
The lawsuit, which seeks class action status, alleges that Facebook violated California law by failing to provide thousands of content moderators with a safe workplace. The moderators were exposed to murders, suicides and beheadings that were live streamed on Facebook, according to the lawsuit.
Former Facebook content moderators Erin Elder and Gabriel Ramos signed onto the amended lawsuit, which was filed Friday in a California superior court. The suit was originally filed in September by Selena Scola, a former Facebook content moderator who worked as a contractor at the tech company from June 2017 to March 2018.
“This case has uncovered a nightmare world that most of us did not know about. The trauma and harm that the plaintiffs, and others who do content moderation work, have suffered is inestimable,” Steve Williams, a lawyer at the Joseph Saveri Law Firm, which is representing the content moderators, said in a statement. “The fact that Facebook does not seem to want to take responsibility, but rather treats these human beings as disposable, should scare all of us.”
Facebook denied the allegations in a November court filing. It has argued that the case should be dismissed.
Scola was an employee of PRO Unlimited, a Florida staffing firm that worked with Facebook to police content. The original suit named PRO Unlimited as a defendant, but the staffing company was dropped from the amended filing.
Elder worked as a Facebook content moderator from March 2017 to December 2017 through PRO Unlimited and Accenture, another staffing firm. She “has experienced nightmares, hypervigilance around children, depression, and a pervasive sense of helplessness about her work as a content moderator,” the lawsuit states.
Ramos, who was employed by PRO Unlimited, Accenture, Accenture Flex and US Tech Solutions, worked as a Facebook content moderator from June 2017 to April 2018. He also suffered symptoms of PTSD after viewing images and videos of graphic violence, according to the amended lawsuit.
Concerns about the working conditions of Facebook content moderators have escalated recently amid reports of the toll those jobs are taking on workers. The social network, which has 15,000 content reviewers, outsources content moderation work to staffing firms such as Cognizant, Accenture and Genpact.
At the same time, Facebook has been under pressure to prevent hate speech, violence and other offensive content from spreading throughout the social network. It’s defended its use of contract workers for the job and has pledged to improve support for content moderators.
Facebook didn’t immediately respond to a request for comment.
Published on www.cnet.com, February 28, 2019.