(image: an iPhone screen showing Twitter and Facebook icons)

Facebook to Pay Massive $52 Million Settlement to Traumatized Content Moderators


The internet is very often a terrible place. Sure, there are cat videos, goat videos, weird videos, and more cat videos, but there are also many horrible people saying and doing horrible things. If you know your way around the net, you can avoid much of this grossness. But for some people, that’s not an option, because it’s their job to moderate the very worst of the net. And now, some of them are getting compensated for the very real trauma and PTSD that job brought on.

In a massive victory for moderators, Facebook has settled a class-action lawsuit brought on behalf of current and former moderators who were tasked with removing the most graphic and disturbing posts on the platform. Facebook has agreed to pay $52 million to the moderators in compensation for mental health issues developed on the job.

The settlement covers over 11,250 moderators, who were contracted from a variety of outside firms after the 2016 election. (The fact that these were subcontractors and not even Facebook employees is a whole OTHER issue.) In the face of criticism over how Facebook handled its content, these moderators were brought in from across the country and given the unenviable job of removing offensive content from the site.

According to The Verge, which ran a big story that brought many of the Facebook contractor moderation issues to light, moderators for one of the firms, Cognizant, were paid as little as $28,800 annually to view images of rape, murder, suicide, and more. This was, to say the least, a terrible job and a dangerous environment for moderators who were expected to view this content daily with no regard for their mental health, and were subject to difficult working conditions.

So in September of 2018, Selena Scola led the first class-action lawsuit against Facebook, alleging that having to sift through content which included “broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder” caused PTSD so severe that even touching a computer mouse could trigger her. And she wasn’t the only one who had to do this.

Now, Facebook has agreed to compensate all members of the class with a minimum of $1,000, but moderators who can show a mental health diagnosis related to the unsafe environment which Facebook allowed can receive more. Anyone with mental health diagnoses can receive up to $6,000 and those who submit evidence of other injuries and costs could get up to $50,000.

This settlement is important, not just because it compensates the people who were traumatized by this work, but because it acknowledges that moderation and exposure to this kind of content is traumatic. Even today, it’s too easy for people to say something like “it’s just the internet, it’s not real,” but viewing and moderating an endless stream of horrors and hate is a very real danger to mental health, and it’s not a job anyone should have to do without proper counseling and safety measures in place.

What the Facebook moderators had to see, and likely still have to see, is sickening, but not surprising given, well, everything about the internet. It’s a mirror of the whole world, including the very worst parts of it. And the people who look into that horrible pit need to be protected and compensated for it.

(via: Boing Boing, image: Pexels)



Jessica Mason
Jessica Mason (she/her) is a writer based in Portland, Oregon with a focus on fandom, queer representation, and amazing women in film and television. She's a trained lawyer and opera singer as well as a mom and author.