Lively and in-depth discussions of city news, politics, science, entertainment, the arts, and more.
Hosted by Larry Mantle
Airs Weekdays 10 a.m.-12 p.m.

Down The Rabbit Hole: The Dark Side Of YouTube’s Automated Recommendation System




The dark side of YouTube
Dado Ruvic / Reuters


Earlier this week, an article in the New York Times outlined the “rabbit hole effect” of YouTube’s algorithms and reported that the platform may recommend videos of young children to viewers who had previously watched sexually themed content.

Researchers attribute this to what some studies call a “rabbit hole effect”: an online viewing pattern in which platforms like YouTube lead people to incrementally more extreme topics or videos, which then hook viewers in. However, YouTube has stated that removing its automated recommendation system altogether, which it says drives up to 70 percent of views by suggesting what users should watch next, would only hurt the content creators who rely on it.

Shortly after the New York Times article was published, YouTube released a statement on its official blog outlining recent policy and algorithm updates meant to protect minors and families. The new policies include restricting live-streaming features for younger minors unless they are clearly accompanied by an adult, as well as limiting recommendations of content that features minors in risky situations.

Earlier this year, YouTube also announced that it would turn off comments on nearly all videos featuring kids, following a February controversy in which pedophiles were leaving inappropriate comments on children’s videos.

Today on AirTalk, we’ll discuss the role technology and online platforms like YouTube play in content moderation and public safety with the co-director of the Center for Scholars & Storytellers as well as one of the researchers who ran an experiment on how YouTube’s algorithms direct its users.

Should parents be responsible for the content featuring their children that they put online, or does YouTube need to step in? What do you think? Give us a call at 866-893-5722.

Guests:

Jonas Kaiser, affiliate researcher at Harvard’s Berkman Klein Center for Internet & Society; he is one of three researchers who conducted an experiment on how YouTube’s algorithms direct its users; he tweets @JonasKaiser

Sierra Filucci, editorial director at Common Sense Media, a media literacy nonprofit in San Francisco; one of her areas of expertise is educating parents on media and social media use

Suresh Venkatasubramanian, professor of computing at the University of Utah and a member of the board of directors for the ACLU Utah; he studies algorithmic fairness; he tweets @geomblog