Lively and in-depth discussions of city news, politics, science, entertainment, the arts, and more.
Hosted by Larry Mantle
Airs Weekdays 10 a.m.-12 p.m.

How should Facebook define and deal with sexist hate speech?

A screenshot of the Facebook homepage.


Bowing to pressure from activists and advertisers, Facebook says it will try to police misogynistic content posted by users. What's still unclear is how the social media company will define offensive content and what steps will follow.

A recent campaign by women's rights groups drew widespread attention to Facebook pages promoting violence against women. Some pages included graphic photos of abused women displayed right alongside clickable ads. That proximity proved too close for comfort for advertisers, including Nissan, which pulled its ads from the site.

Now Facebook says it will treat anti-women pages the same way it treats racist content and other postings defined as hate speech. Critics of the move are concerned about a private company policing speech and defining "hate speech." Facebook has received flak for removing pages that promote atheism in the Middle East and for taking down images of breastfeeding mothers. As private entities, social media companies are not obliged to protect free speech, only their own bottom line.

Which is more valuable for Facebook: providing a free-for-all venue for user-generated content, or one with a low tolerance for distressing or controversial subjects?

Jaclyn Friedman, Executive Director, Women, Action and the Media (WAM!); WAM! is an advocacy group that helped spearhead a campaign asking Facebook to treat misogynistic content as it treats hate speech

Jillian York, Director for International Freedom of Expression, Electronic Frontier Foundation, an advocacy group that specializes in speech and privacy issues on digital platforms