Regardless of who produced the low-budget video “Innocence of Muslims,” the international domino effect it set off clearly illustrates the power of social media websites like YouTube. But with power comes responsibility. Does Google, which owns YouTube, have a responsibility to monitor the millions upon millions of videos uploaded by users for content that may be offensive to viewers?
The free speech issues involved in answering this question get complicated very quickly. Every person has a different threshold for what they find offensive, and it would be impossible to filter all offensive content from the Internet. "We work hard to create a community everyone can enjoy and which also enables people to express different opinions," a YouTube spokesman maintained in a statement. "This can be a challenge because what's OK in one country can be offensive elsewhere. This video -- which is widely available on the Web -- is clearly within our guidelines and so will stay on YouTube."
What basic responsibilities, if any, do Google and other Internet companies have when it comes to monitoring potentially offensive videos and other content on the Internet?
Dan Gillmor, digital media professor at the Walter Cronkite School of Journalism & Mass Communication at Arizona State University