Summary:

Google says it blocked viewers in Egypt and Libya from seeing a controversial video clip on YouTube, after the video was allegedly linked to violence in both of those countries. But should Google be censoring content without even a request from a government or court?

After violent attacks on Americans in both Egypt and Libya — including an attack in Libya on Tuesday that killed the American ambassador to that country — Google said on Wednesday that it has restricted access to a controversial YouTube video about the Prophet Muhammad that has been linked to the violence. According to a statement from the company, the video is still available on the YouTube website, but viewers from both Libya and Egypt are unable to see it. While this may be a goodwill gesture by the search giant aimed at helping to douse the flames of anti-American violence in the Middle East, it raises a number of questions about the company’s willingness to censor certain types of content even when it has not been asked to do so by a government or court. What other things might Google decide to block, and from whom?

The clip that is being blocked is a 14-minute section of a longer film called “The Innocence of Muslims,” which reportedly shows a fictional attack by Muslims on a Christian family, followed by an account of the origins of the Islamic religion that portrays the prophet Muhammad as a fraud and a womanizer. Other fictional and/or humorous treatments of the prophet’s life have provoked violence in the past: author Salman Rushdie was the target of a fatwa, in effect a death sentence, issued in 1989 over his novel “The Satanic Verses,” and offensive cartoons about the prophet that ran in a Danish newspaper in 2005 were linked to a series of attacks and deaths.

It’s not clear the video is connected to the attack

In this case, the video clip has been connected to the death of U.S. ambassador Christopher Stevens, who was killed on Tuesday, along with three other members of his diplomatic staff, in an attack on the U.S. consulate in Benghazi. And in a statement released to the news media, Google made it clear that this is the main reason it decided to block access to the video for viewers in Egypt and Libya (attacks linked to the clip also occurred in Cairo). Said the company:

“This video — which is widely available on the Web — is clearly within our guidelines and so will stay on YouTube. However, given the very difficult situation in Libya and Egypt, we have temporarily restricted access in both countries. Our hearts are with the families of the people murdered in yesterday’s attack in Libya.”

However, while many of the reports from mainstream media sources about the deaths in Libya have linked them to the video, CNN has said that the attack on the consulate was actually planned well in advance by members of an extremist group connected to al-Qaeda, and was not directly connected to the clip, according to the news network’s sources. As more than one person has pointed out, blocking access to a video from a specific country is also quite easy to get around, even for a technically challenged viewer — and as Google itself noted, the offending video is available on any number of other websites apart from YouTube. So why bother censoring it?
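
For a sense of why that kind of block is so easy to sidestep, here is a minimal sketch of how country-level restriction is generally enforced. This is an illustration of the general technique, not YouTube’s actual implementation; the lookup table and function names below are hypothetical.

```python
# Illustrative sketch only (not YouTube's actual code): country-level blocking
# generally works by mapping the viewer's apparent IP address to a country code
# and refusing to serve the clip when that country is on a blocklist.

BLOCKED_COUNTRIES = {"EG", "LY"}  # Egypt and Libya, per Google's statement

# Hypothetical stand-in for a real IP-geolocation (GeoIP) lookup.
FAKE_GEOIP_TABLE = {
    "198.51.100.7": "EG",  # pretend this address geolocates to Egypt
    "203.0.113.9": "LY",   # pretend this address geolocates to Libya
    "192.0.2.44": "US",    # pretend this address geolocates elsewhere
}

def country_from_ip(ip_address: str) -> str:
    """Return the two-letter country code the request appears to come from."""
    return FAKE_GEOIP_TABLE.get(ip_address, "UNKNOWN")

def can_serve_video(ip_address: str) -> bool:
    """Serve the clip only if the viewer's apparent country is not blocked."""
    return country_from_ip(ip_address) not in BLOCKED_COUNTRIES

print(can_serve_video("198.51.100.7"))  # False: request appears to come from Egypt
print(can_serve_video("192.0.2.44"))    # True: request appears to come from elsewhere
```

Because the only signal is the viewer’s apparent IP address, routing traffic through a proxy or VPN endpoint in another country defeats the check entirely, which is why a restriction like this is more of a gesture than a technical barrier.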

In the past, Google has fought hard against attempts by governments in countries such as Turkey to censor the content on YouTube, and in many cases those countries have responded by blocking the website entirely (as Afghanistan said it had done on Wednesday in response to the Muhammad video). The company maintains a database of these kinds of requests from governments as part of its “transparency report,” and even when it does agree to remove certain kinds of content from either YouTube or its search results — as it does in countries like Germany, where certain pro-Nazi content is illegal — it does so under protest.

Should Google alone be making the decision to censor?

Jillian York, the director for international freedom of expression at the Electronic Frontier Foundation, said in an email to me that allowing even controversial videos like the Muhammad clip to remain online was an important principle for Google and YouTube to uphold, despite the connection to violence:

“It definitely troubles me… I think it’s wrong of Google to play Internet police here. They shouldn’t censor without a court order.”

The Libyan video case reinforces how much control companies like Google and YouTube have over what kinds of content we can see and when, and more importantly where. Even Twitter said earlier this year that it has the ability to block access to specific tweets on a country-by-country basis — although the company said that it would only exercise that power as a last resort when asked to do so by a court or government. As we’ve discussed before, this kind of control over information in the hands of a few corporate information gatekeepers raises a host of important questions about freedom of speech in a digital age.

Google’s decision to block the video clip may have been made with the best of intentions, but if the connection between the violence and the video is as flimsy as it seems — and if no government, court or other external authority has requested that it be censored — then why take this kind of step in the first place? All it does is highlight the fact that the company can remove or block content any time it wishes to, regardless of whether doing so is ethically or legally justifiable. And that is a troubling prospect indeed.

Post and thumbnail images courtesy of Flickr users Hoggarazzi and Petteri Sulonen

  1. Um, “censoring” is typically applied to governmental entities. Private businesses and corporations such as Google are free to show, or not show, whatever they want.

    1. Exactly. If Google wants to pull a clip, they can do that. If you want to call it “censoring,” that’s your term for it.

  2. When did journalists and the EFF stop having a soul? Does it mean nothing that these people have families, and that Google is trying to respect them right now?

  3. The answer lies somewhere in Google’s terms of service. Maybe the video breached specific guidelines about “hatred,” or maybe the ToS reserve the right to do what they want. Google is not a government.

  4. Will they also not provide links to Jackass-type videos? Anyone who searches for examples of violence can find them, even though the person doing the searching may be only ten years old. Some things do not need to be protected under free speech laws.

  5. Reblogged this on #Hashtag – Thoughts on Law, Technology, the Internet, and Social Media and commented:
    Should Google be censoring videos just because they are linked to violence?
    Google says it blocked viewers in Egypt and Libya from seeing a controversial video clip on YouTube, after the video was allegedly linked to violence in both of those countries. But should Google be censoring content without even a request from a government or court?

  6. It was definitely an impulsive but useless move, as the majority of radicals haven’t even watched the video itself; it was just one more pretext to show their hatred for people of other beliefs, and specifically for the U.S.

  7. YouTube removes videos they don’t like all the time. Is it wrong? Yes, I think it is – but Google can do whatever they want because they own the company. But until you yourself have had a video taken down by them, you’ll never understand how violating it feels when it happens.

    And if you’ve never had a video taken down – then you’ve never really tried to do anything.

  8. Great question you pose here. As I listened to reports of violence directed at our embassies over the past couple of days, I wondered whether YouTube would take action to curate (or censor, depending on your viewpoint) their content.

    Yes, YouTube/Google does have the right to accept, reject, and delete any content as they see fit. As the mother of a teenager (and a mom who regulates content), I am quite familiar with the pile of junk that’s on YouTube, and I am careful to ensure my kid’s not contributing to it and is consuming content that has some value. There is a lot of educational material there that has great merit.

    I think the question is really about whether the company should introduce some QUALITY CONTROL over what gets dumped on their site. It’s much easier and cheaper not to do so and simply let anyone upload whatever they like, without regard to the value or accuracy of that content or the likelihood that it might incite violence. How about those videos of kids being bullied, beaten, and even killed? My point is, that stuff simply shouldn’t be there. There has to be SOME QUALITY STANDARD.

    Recent research shows that the majority of teens use YouTube as their main source for news and entertainment. But, really, how credible is YouTube as a news source? After all, network television has SOME quality standards in programming. iTunes maintains some quality standards, too. So why shouldn’t YouTube set a bar on standards for content? I think there is a big difference between censorship and curatorship, but the bottom line is, a media company does have the right (and some might say the obligation) to control quality.

  9. Very interesting.

    When an artist offends Muslims, our government condemns the artist. But when an artist offends Christians, our government FUNDS the artist (see Robert Mapplethorpe and Andres Serrano)! And funny how the PC-police who want to curb “hate speech” when it offends Muslims are perfectly happy to cheer that speech if it offends Christians (again, see Mapplethorpe and Serrano).

    Gotta love it!

  10. What a weakly written argument: the video in question is a direct attack on the beliefs of over 1.4 billion people, and is filled with fallacies and hate speech. It not only violates Google’s ToS, but also U.S. hate speech laws.

  11. Let judges decide what can be published, a posteriori, if there are complaints. Otherwise we are admitting censorship, and it is the end of democracy as we know it.

