Lively and in-depth discussions of city news, politics, science, entertainment, the arts, and more.
Hosted by Larry Mantle
Airs Weekdays 10 a.m.-12 p.m.

Are big tech companies providing the workplace services their content moderators need?




The Facebook website is displayed on a laptop computer on May 9, 2011 in San Anselmo, California.
Justin Sullivan/Getty Images


Are content moderators at risk of psychological trauma? A lawsuit filed by Selena Scola, a former content moderator at Facebook’s headquarters in Menlo Park, California, claims she developed post-traumatic stress disorder after sifting through “highly toxic and disturbing images” that violated Facebook’s terms of use.

Scola is one of roughly 7,500 content moderators around the globe who help remove graphic violence, images of self-harm, hate speech, and other content that violates Facebook's rules. According to the complaint, the job also entails watching videos and livestreams of “child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder.”

The suit, filed in San Mateo County Superior Court, asks for a Facebook-funded medical monitoring program to help maintain a safe workplace. The fund would provide content moderators with medical testing and psychiatric treatment. The lawsuit is seeking class action status. We discuss.

We reached out to Facebook for comment and, as of the recording of this segment, had not received a response. If Facebook responds, we will update this segment.

Guests:

Steve Williams, attorney at Joseph Saveri Law Firm, which represents the plaintiff

Emanuel Maidenberg, clinical professor of psychiatry and biobehavioral sciences at UCLA