
Facebook faces complaints from more former content moderators in lawsuit

February 28, 2019 | Source: www.cnet.com

Two former Facebook content moderators have joined a lawsuit against the tech giant, alleging they suffered psychological trauma and symptoms of post-traumatic stress disorder caused by reviewing violent images on the social network. 

The lawsuit, which seeks class action status, alleges that Facebook violated California law by failing to provide thousands of content moderators with a safe workplace. The moderators were exposed to murders, suicides and beheadings that were live streamed on Facebook, according to the lawsuit. 

Former Facebook content moderators Erin Elder and Gabriel Ramos signed onto the amended lawsuit, which was filed Friday in a California superior court. The suit was originally filed in September by Selena Scola, a former Facebook content moderator who worked as a contractor at the tech company from June 2017 to March 2018. 

“This case has uncovered a nightmare world that most of us did not know about. The trauma and harm that the plaintiffs, and others who do content moderation work, have suffered is inestimable,” Steve Williams, a lawyer at the Joseph Saveri Law Firm, which is representing the content moderators, said in a statement. “The fact that Facebook does not seem to want to take responsibility, but rather treats these human beings as disposable, should scare all of us.”

Facebook denied the allegations in a November court filing. It has argued that the case should be dismissed.

Scola was an employee of PRO Unlimited, a Florida staffing firm that worked with Facebook to police content. The original suit named PRO Unlimited as a defendant, but the staffing company was dropped from the amended filing.

Elder worked as a Facebook content moderator from March 2017 to December 2017 through PRO Unlimited and Accenture, another staffing firm. She “has experienced nightmares, hypervigilance around children, depression, and [a] pervasive sense of helplessness about her work as [a] content moderator,” the lawsuit states.

Ramos, who was employed by PRO Unlimited, Accenture, Accenture Flex and US Tech Solutions, worked as a Facebook content moderator from June 2017 to April 2018. He also suffered symptoms of PTSD after viewing images and videos of graphic violence, according to the amended lawsuit. 

Concerns about the working conditions of Facebook content moderators have escalated recently amid reports of the toll those jobs are taking on workers. The social network, which has 15,000 content reviewers, outsources content moderation work to staffing firms such as Cognizant, Accenture and Genpact. 

At the same time, Facebook has been under pressure to prevent hate speech, violence and other offensive content from spreading throughout the social network. It’s defended its use of contract workers for the job and has pledged to improve support for content moderators.

Facebook didn’t immediately respond to a request for comment. 
