Facebook’s content moderators take their battle for better conditions to the Irish parliament

14 May 2021

The question of how content moderators are treated is facing Facebook once again, after a moderator told an Irish parliamentary committee that the company does not do enough to protect the people who do this work.

Isabella Plunkett, who works for Covalen, told the committee that content moderators lack adequate access to mental health resources. Covalen allows an hour and a half of “wellness time” each week, for example, but its “wellness coaches” are not mental health professionals and are not equipped to help moderators process the traumatic content they regularly handle. Plunkett told the committee that these wellness coaches sometimes suggested activities like painting or karaoke.

“The content is awful, it would affect anyone,” she said. “No one can be okay watching graphic violence seven to eight hours a day.” She also said that contracted content moderators should be afforded the same benefits and protections as Facebook’s own employees, including paid sick leave and the ability to work from home.

A spokesperson said Facebook is “committed to working with our partners to provide support” to people reviewing content. “Everyone who reviews content for Facebook goes through an in-depth training program on our Community Standards and has access to psychological support to ensure their wellbeing,” the spokesperson added. “In Ireland, this includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment. We are also employing technical solutions to limit their exposure to potentially graphic material as much as possible. This is an important issue, and we are committed to getting this right.”

The company has agreed to make changes, introducing tools that allow moderators to watch videos in black and white and with the sound muted, to limit the graphic violence they are exposed to. It also added a feature that lets them skip directly to the relevant parts of longer videos, saving review time. In parallel, the company has invested in artificial intelligence to automate more of its content moderation.

Facebook is now expected to answer whether these measures are sufficient to protect content moderators: the committee will ask representatives of Facebook and its affiliates to appear at another hearing to face questions about the treatment of their workers.
