Lively and in-depth discussions of city news, politics, science, entertainment, the arts, and more.
Hosted by Larry Mantle
Airs Weekdays 10 a.m.-12 p.m.

PTSD, Harassment And Health Hazards: Inside The Working Conditions of Facebook’s Content Moderation Centers




The Facebook logo is displayed during the F8 Facebook Developers conference
Justin Sullivan/Getty Images


In 2017, Facebook began opening content moderation sites in cities across America, including Phoenix, Austin, and Tampa, with the ambitious goal of improving its moderation accuracy by delegating moderation duties to people familiar with American culture and slang.

What happened and what continues to happen inside those moderation sites is nothing short of a nightmare.

Facebook, which outsources its content moderation to companies like Cognizant, sets demanding production goals, including accuracy targets as high as 98 percent, that often push content moderators to their physical and emotional brink. One content moderator who worked for Cognizant at its Tampa facility had a heart attack at his desk last year and died.

Workers inside these facilities are paid as little as $28,800 a year, and each day get only two 15-minute breaks and a 30-minute lunch, plus a nine-minute "wellness" break for when they feel overwhelmed by the content they're moderating.

With regular exposure to some of the most graphic content on the internet, ranging from animal and child abuse to torture and murder, most workers subsequently develop post-traumatic stress disorder and related conditions.

In addition to moderating traumatizing content, workers at these facilities also have to contend with an unsafe and hostile working environment. There have been multiple reports of sexual harassment at the Cognizant facilities, as well as numerous verbal and physical fights among employees. On top of that, workers have found pubic hair, fingernails, and other bodily waste at their desks, and the Cognizant facility in Phoenix has been dealing with a bed bug infestation for the past three months.

Facebook says it is working on the problem, will conduct a thorough audit of its content moderation partner, and will institute other changes to ensure the well-being of its contractors.

How do you think Facebook should handle content moderation? Is content moderation even possible without severe mental health consequences? Are you a current or former Facebook content moderator? Give us a call at 866-893-5722.

With guest host Libby Denkmann

Guest:

Sarah T. Roberts, author of "Behind the Screen: Content Moderation in the Shadows of Social Media" (Yale University Press, June 2019); assistant professor of information studies at UCLA; co-director of the UCLA Center for Critical Internet Inquiry