In the aftermath of the Parkland, Florida school shooting last month, the top trending video on YouTube was a conspiracy theory claiming that a Marjory Stoneman Douglas High School student survivor who appeared in interviews was actually an actor.
The video reached 200,000 views before YouTube removed it from its site.
But YouTube’s community guidelines on uploading conspiracy theories and misinformation have been murky, as has its process for determining which hoaxes to crack down on. This week, YouTube CEO Susan Wojcicki unexpectedly announced during a SXSW panel with WIRED editor-in-chief Nicholas Thompson that the platform now plans to debunk erroneous content by pairing videos with "information cues," such as text from Wikipedia.
While the video-sharing site has received praise for taking steps to fight fake news, critics say that Wikipedia, which is a battleground of information in and of itself, is just a band-aid for the very algorithms that disseminate misinformation on YouTube. We speak with a WIRED reporter who has been following the story as well as an expert on fake news and media.
We reached out to Google’s press team and received this response from a YouTube spokesperson:
We’re always exploring new ways to battle misinformation on YouTube. At SXSW, we announced plans to show additional information cues, including a text box linking to third-party sources around widely accepted events, like the moon landing. These features will be rolling out in the coming months, but beyond that we don’t have any additional information to share at this time.
With guest host Libby Denkmann.
S. Shyam Sundar, professor of media studies at Penn State and co-director of the university's Media Effects Research Laboratory; Sundar received a grant from the National Science Foundation in 2017, with colleague Dongwon Lee, to develop machines that can detect fake news