News and culture through the lens of Southern California.
Hosted by A Martínez
Airs Weekdays 2 to 3 p.m.

What Facebook did to media literacy in America

This thumbs up or "like" icon at the Facebook main campus in Menlo Park, Calif., may soon have a neighbor. Founder Mark Zuckerberg said Tuesday that the social network soon would test a long-requested "dislike" type of button.
Robyn Beck/AFP/Getty Images


It's been said that hindsight is 20/20.

While our view of what's transpired in this country in recent months may not be crystal clear, over the past week many have questioned the proliferation of fake news, concerned that inaccurate stories widely circulated on sites like Facebook might have influenced voters in the presidential election.

On Monday, the social network updated the language of its Facebook Audience Network Policy, and Google banned fake news sites from using its online advertising service. For thousands, however, these changes come too late; it's decidedly more difficult to halt the spread of misinformation after the fact.

The bruising election has pitted the public against esteemed journalistic institutions, leading some to give more credence to partisan posts from pages like Freedom Daily and Addicting Info. Given these conditions, one is led to wonder: will news consumers ever agree on a set of facts, or is our media literacy past the point of no return?

Take Two asked two guests:


Facebook and Google might have been the vehicles in which people came across these stories, but then it's the people who like them and the people who circulate them who give them greater exposure. Laura, how much responsibility do we the public bear for this?

I think Facebook needs to do a better job, but I also think the news media can do a better job and the public can do a better job. Look at where those stories are taking you: what are those sites? Are they credible? Can you do more research to verify them and see whether any other outlets are reporting the same thing? That's a hard thing to ask of the broader public, which is why I think the solution starts with Facebook.

A large percentage of the population might do that, Alexios, but you'll still have people who keep spinning these stories out. The concern is that the voices of truth get outweighed or outnumbered.

Yeah, I'm sympathetic to the challenges that Facebook faces in trying to police content on the network. For one, people could be gaming the system: Facebook does allow you to report content as fake, but some users may flag posts simply because they don't like what's in them, not because the posts are actually fake.

Another thing I'd mention is that Facebook inherits the problems of hoaxes and confirmation bias that preceded it. We had fake rumors and gossip long before the internet; old wives' tales existed before, right? In a way, Facebook just makes them more visible and measurable to everyone.


(Questions and answers have been edited for clarity and brevity.)