
Summary:

The removal of the Facebook page belonging to film critic Roger Ebert and the blocking of content from a protest group in Britain have raised questions about the site’s censorship of content. We treat Facebook as a semi-public space, but it is controlled by a private company.

Updated: The benefits of being on Facebook are fairly obvious by now: you can connect to friends and family and share things with them no matter where they are — and it’s all free! This quasi-public space is also owned and controlled by a corporate entity, however, and it has its own views about what kinds of behavior should be allowed. That inevitably raises questions about whether the site is engaging in what amounts to censorship — questions that resurfaced this week after a page belonging to film critic Roger Ebert disappeared, and a group of protesters in Britain found their content blocked. Who is watching the watchmen?

Ebert triggered a storm of criticism on Monday with his response to the death of Jackass co-star Ryan Dunn, who was killed in a single-vehicle accident early Monday morning. Police said that speed was likely a factor in the crash, and there have also been suggestions that the TV actor may have been drinking before the incident. Ebert — who took to Twitter after a cancer operation led to the loss of his lower jaw, and now has 475,000 followers — posted that “friends don’t let jackasses drive drunk,” a comment that drew attacks from Dunn’s co-stars and from celebrity blogger Perez Hilton.

The film critic later tweeted that his page had been removed (even though his comments on Twitter about Dunn never actually appeared there), to be replaced by an error message stating that the page had been removed due to violations of Facebook’s terms of use, which ban any content that is hateful, threatening or obscene or that attacks an individual or group. In response, Ebert said his page was harmless and asked: “Why did you remove it in response to anonymous jerks? Makes you look bad.”

Facebook later said that the page was taken down “in error” and it was reinstated. But as Jillian York of the Electronic Frontier Foundation and Global Voices Online noted in a blog post, it’s not clear what kind of error led to the page being removed. Was it taken down automatically after being flagged as abusive? York — who has written in the past about Facebook removing pages set up by political dissidents in the Middle East and elsewhere — says the company has denied removing pages automatically. So was there human error involved? And if so, what steps is Facebook taking to prevent that in the future?

If critics of his Twitter comments attacked Ebert’s page by repeatedly flagging it, they effectively took the same approach some governments have taken in trying to shut down dissent: Foreign Policy magazine columnist Evgeny Morozov said recently that he knows of at least one government that flags dissident group pages as pornography in order to get them removed. Facebook has also removed pages in the past that were seen as anti-Islam or anti-Israel — in some cases reinstating them later — and has taken down more innocuous content as well, such as pages about the benefits of breastfeeding.

And it’s not just taking down pages that Facebook users are concerned about: According to a blog post from one of the organizers of a recent public anti-government protest in Britain, a number of users reported that Facebook not only blocked them from linking to a website set up by the group, but from linking to a blog post about it as well. A spokesman for the social network said this too was an error that was later corrected — but, again, what kind of error it was isn’t clear. Nor is it clear what criteria Facebook uses to make these decisions.

As the British blogger notes in his post on the incident, Facebook is “increasingly the space within which people receive their information, including civic information.” We are living more and more of our public lives and getting more of our information through networks such as Facebook, and while that can be a very powerful thing — as we’ve seen with events such as the Arab Spring uprisings in Tunisia and Egypt — it also means that more of our information is being filtered by a corporate entity, with its own desires and rules, not all of which are obvious. The implications of that are profound.

Update: Facebook spokesman Barry Schnitt responded to this post via email, and his comments about the specific incidents mentioned (both in the post and in the comments here) are below. They have been edited for length.

  • Ebert — “Difficult to say. Perhaps the reviewer misinterpreted [the term] “jackass” or the other fan comments as a concerted personal attack on a private individual. Perhaps they just meant to push “ignore” and hit the wrong button. In the end, we responded to the mistake pretty quickly.”
  • Britain/j30strike — “The domain jstrike30.com, when originally registered, redirected to an affiliate site that’s associated with spam and other malicious content. We block many of these sites to protect users. In this case, we blocked the affiliate site and, when we found that this domain redirected to it, we blocked it as well. Soon after being registered, however, the domain was changed to host the non-spammy content that’s there now. Our system didn’t pick up on this, and it remained blocked. As Dave explained, we’re already working on improvements to prevent this type of mistake in the future.”
  • Third Intifada — “As we explained publicly around this controversy, the page was reported many many times but we kept it up because it was an explicitly peaceful protest, even though it used a word — “intifada” — associated with violence in the past. However, the page eventually became overrun with calls for violence and was removed.”
  • Breastfeeding — “No policy against this at all. I’d encourage you to search the site. There are dozens of groups and thousands, if not millions, of breastfeeding photos to be found. We have removed some pictures of naked women who happen to be holding a baby. And, of course, I’m sure we’ve made some mistakes.”
  • Trademark comment — “Abuse of DMCA and other intellectual property notice procedures is a challenge for every major Internet service and we take it seriously. We have invested significant resources into creating a dedicated team that uses specialized tools, systems and technology to review and properly handle intellectual property notices. This system evaluates a number of factors when deciding how to respond and, in many cases, we require the reporter to provide additional information before we take action. As a result of these efforts, the vast majority of intellectual property notices that we receive are handled without incident. However, we are always striving to improve our practices. If your reader sends his info, I’ll have someone look into it.”
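
Schnitt’s j30strike explanation amounts to a blocklist keyed on a link’s final redirect target, with a cached answer that was never refreshed after the domain changed hands. A minimal sketch of that mechanism follows — every name and domain here is hypothetical and illustrative, not Facebook’s actual system:

```python
# Hypothetical sketch of redirect-based link blocking with a stale cache.
# BLOCKED_HOSTS, REDIRECTS, and the domains are all invented for illustration.

BLOCKED_HOSTS = {"spammy-affiliate.example"}

# Simulated redirect table standing in for real HTTP redirect resolution.
REDIRECTS = {"j30strike.example": "spammy-affiliate.example"}

_cache = {}  # original host -> final host; never invalidated automatically


def final_host(host, refresh=False):
    """Follow redirects to a terminal host, caching the answer.

    Because the cache is only refreshed on demand, a host can stay
    blocked even after its redirect changes -- the failure mode
    described above for the j30strike domain.
    """
    start = host
    if not refresh and start in _cache:
        return _cache[start]
    seen = set()
    while host in REDIRECTS and host not in seen:  # guard against loops
        seen.add(host)
        host = REDIRECTS[host]
    _cache[start] = host
    return host


def is_blocked(host, refresh=False):
    """A link is blocked if its final redirect target is on the blocklist."""
    return final_host(host, refresh) in BLOCKED_HOSTS
```

In this toy model, the domain is correctly blocked while it redirects to the spam host; once the redirect is removed, the cached verdict keeps the block in place until an explicit re-check — exactly the kind of mistake Schnitt says they are working to prevent.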

Post and thumbnail photos courtesy of Flickr user David Reece


  1. The first sentence of this article is pretty much what email is.

    1. Not really, but I can see your point — and just imagine if the corporation that sends your email decided not to send certain messages because of their content.

  2. Sadly, we now live in a society where 5% of the population dictates to the rest of us what we can see, say, and hear. Special interest groups representing every minority “flavor” out there (I’m talking beyond even ethnicity) are extorting our civil rights with their loud mouths, access to media, and propaganda machines. I got permanently banned from HuffPo for daring to submit a comment calling Al Sharpton a racist ambulance chaser. They all claim they hate “intolerance”, yet they won’t tolerate an opposing stance. Sad times.

  3. Lucian Armasu Tuesday, June 21, 2011

    This is why we need decentralized Facebook-like and Twitter-like services. These types of services are becoming increasingly important, and yet their operators could take down your account without you being able to do much about it.

    1. I agree, Lucian — the problem is that those services exist (or close to it with Diaspora and Status.net) but no one wants to use them because all their friends are on Facebook and Twitter :-)

      1. I think it’s more like people don’t want to use these because they need to be set up and that isn’t so easy for non-techy individuals.

        I still stand by the centralized social network concept. It just needs to be run fairly with a bias toward free speech. It’s something my partner and I are working on.

    2. FYI, Dave Winer is working on (and he and others are already using) a tool called Blork: http://scripting.com/stories/2011/04/05/gettingStartedWithBlork.html

      To Mathew’s point that these tools aren’t being widely used because everyone is on Twitter and Facebook, that’s probably always going to be true. So our alternatives shouldn’t be separate social networks where one basically has to start all over. The alternatives need to be such that one can still feed content to Twitter and Facebook, where all of one’s friends are, while maintaining control over that content on one’s own server or by some other method. So the end result would be being on Twitter and Facebook without ACTUALLY being there. Know what I mean?

  4. francine hardaway Tuesday, June 21, 2011

    The control Facebook has over communications is of increasing concern to me. Never mind the obvious, like shutting down pages; I don’t think I am seeing people with political views opposite my own, or people FB doesn’t think I “should” see. I really notice this because sometimes I see ONLY my Arizona friends, and not those from SV. I don’t want my feed curated; I have friends in both environments for a reason.

    1. Control over your feed is definitely an important thing to have.

  5. I think that “WE” need to have competitors to all of the Big Co silos like FB and Twitter. Until there is viable competition, members feel they have no choice if they disagree with the Big Co. In my view these competitors should be owned fully or partially by the “Public”. In a way these applications would be “Public Utilities” whose primary goal would be to provide services in the “interest” of the public.
    We have been working on bringing this concept into reality and will have some announcements shortly. We have also been working on several applications that have the possibility of becoming “Public Utilities”. The social network is in alpha here: http://social.maia68.com. There are some bugs, and we are trying to fix them as fast as we can.
    We are also working on some things here http://www.maia68.com and here http://www.kleemi.com

    1. Frankly, I think that is a horrible idea. The government has a huge incentive to censor and would if it were treated like a public utility. Maybe you mean something else by that, I don’t know. Centralized control is the problem, and while I have concerns about Facebook that keep me off the site, it’s way better than letting the government control it. Facebook can’t do much to hinder its competition, while the government could outlaw competition like it did with the Post Office, which was used to censor.

  6. A parallel internet will be a need soon. Facebook and Twitter are both needed, and both unreliable.

  7. William Carleton Wednesday, June 22, 2011

    It’s good that things like this happen, because they warn all of us not to be lulled into the sense that the proprietors of Facebook would ever allow the service to be turned into a public commons. There is some irony in the fact that the Facebook service illustrates many of the things a digital commons might be about, but, whatever that future is, Facebook won’t be it. Too centralized; too lacking in imagination; too bound to the service of a tired, corrupt industry (advertising) that the commercial side of a truly social media would free us from.

    But it’s a private business, so the only fair remedy to the problem is for individuals to choose with their eyes open.

    The far more difficult problem, I think, is what to do about data centers and cloud services that really do function as de facto public utilities, when (a) they don’t like content they are hosting, and (b) government wants to search or seize a server and doesn’t know how to distinguish between the suspect and its virtual neighbors? That’s also infrastructure for . . . well, everything: commerce, education, the arts, politics, everything.

  8. Vadim Lavrusik Wednesday, June 22, 2011

    Hi Mathew, You ask several times about the criteria that are used, and our terms of service are public and available: https://www.facebook.com/terms.php?ref=pf

    We seek to create a safe environment for all users and investigate reports from users about violations of the terms. In those cases, the Pages are sometimes frozen while the claim is investigated, to determine whether malicious behavior is taking place. Calling it “censorship” is not only misleading, but grossly inaccurate.

    1. Where in your terms of service does it list the criteria used?

      There pretty clearly was no violation of copyrights or privacy in Ebert’s post. Apparently, you pulled it based on what you thought might be offensive content. That IS censorship.

    2. Vadim, don’t be so quick to run from the term! Censorship is in fact what one would naturally do to enforce terms of service that pertain to content, which yours do. Suggestion: just be very clear about what the criteria are at Facebook for the censorship that obviously can occur (your terms state that you can deny service to someone who violates the standards).

    3. Thanks for the comment, Vadim — I’m aware of the terms of service, but it’s still not clear to me which of those were infringed by either Roger Ebert or the British group in question, or for that matter which are infringed by pages or groups devoted to breastfeeding and the like.

      And while censorship may not be the term you might choose for what happens when you remove that kind of content, I think in purely functional terms that is what is occurring — even if there is justification for it, or if it breaches Facebook’s terms of service etc.

    4. Grossly inaccurate? Ahem. When Facebook takes down the Pages of Roger Ebert and Sarah Palin (in her case, last year, it was just a note), it’s still censorship. Because they’re famous, and their stories are publicized, it gets fixed. But when it’s an activist or ordinary user, the accounts often remain down, and Facebook’s persistent notice that “this decision cannot be appealed” means that users are unlikely to even attempt to get their accounts back.

  9. Thanks for the post. I work on Facebook’s User Operations team focusing on our content policies and how we enforce them. I know that there are lots of questions and some confusion about our policies and practices around content. We’ve spent a lot of time trying to be transparent on our site (http://www.facebook.com/communitystandards), on our blog (https://www.facebook.com/blog.php?post=403200567130), and through the press (http://www.nytimes.com/2010/12/13/technology/13facebook.html). However, we can always do more, and we’d really appreciate your help in that effort.

    Generally, we’re very proud of the environment on Facebook. We want Facebook to be a place where people can openly express their views and opinions, even if others don’t agree with them. That said, there are some types of content we don’t allow. For example, we remove hate, threats, pornography, attacks on private individuals, and other kinds of abuse when they’re reported to us. We also work to prevent spam. You’ll find a more complete list of what we don’t allow, as well as an explanation of the community we’re trying to build, on the Community Standards page mentioned above. We’ve developed these policies with guidance from our Safety Advisory Board and other recognized experts from around the world.

    Unfortunately, there aren’t many examples for Facebook to follow in this area. We simply face a challenge that few companies have ever faced – protecting a service that’s under constant attack and through which over 500 million people share more than 30 billion pieces of content every month. Our international team of reviewers works across offices in California, Texas, Ireland, and India to review hundreds of thousands of pieces of reported content every day. Our engineers build systems that classify over ten billion actions (suspicious logins, friend requests, etc.) and pieces of content (messages, Wall posts, etc.) every day. The hundreds of people and sophisticated systems that enforce our policies stand in stark contrast to the vast majority of the rest of the Internet.

    Of course, no system is perfect, and we do sometimes make mistakes like the two you mention above. Usually, it’s because a system didn’t work as intended or someone simply made a wrong call. In the case of the Page, one of our reviewers mistakenly identified it as an attack on a private individual, which we don’t allow. In the case of the link, our system mistakenly identified it as redirecting to a known spam website.

    As I mention above, there are some common misconceptions about our policies and practices. For example, we found long ago that the number of reports on a piece of content is a very unreliable signal for determining abuse. In fact, in many cases, the number of reports a piece of content receives is inversely related to the accuracy of those reports (imagine how many reports a Page criticizing Justin Bieber might get, for example). Because of this, number of reports has no influence on our decision about content. None. Instead, we evaluate it based on whether it violates our policies.

    We strongly believe that what is more important than any mistakes is our response to them. When we do make a mistake, we act quickly to fix it (both of the above issues were resolved within hours), apologize, and improve our processes so that it doesn’t happen again. In fact, our team has already started discussing improvements to prevent cases like the above in the future. Still, though, we believe our error rates rival those of any company in any industry.

    We take all of these responsibilities very seriously. We don’t pretend to have all the answers, though, and we welcome constructive feedback on how to improve our approach. Please don’t hesitate to contact my colleagues in PR if you have any suggestions. Thanks.

    1. Thanks for the comment, Dave — I know that the job of moderating billions of pages and comments can’t be easy. And I’m encouraged to hear you say that the raw number of complaints or flagging of content has no bearing on whether content is removed. But I’m still curious about why Ebert’s page would have been blocked even for a short time, and why the protesters in Britain would have found their links even to an innocuous blog post blocked.

      I’m also curious about how Facebook interprets community standards, since that kind of thing differs so much from country to country and community to community. I’d be interested in hearing more about that — you can reach me at mathew@gigaom.com.

    2. One of our Facebook Pages was removed without warning because somebody complained to Facebook that it violated their trademark. We couldn’t find a single lawyer who thought the complaint was valid, but Facebook presumes we are guilty until proven innocent. The user operations team basically told us to come back once we had a legal judgement. Censorship and arbitrary standards regarding trademark and copyright enforcement are real risks when doing business on someone else’s platform. Dave I appreciate your comments and would be very grateful if you’d review our case. Thanks.

      1. Thanks for pointing that out, Tom — that’s a real minefield for companies, I would think, and for Facebook as well.

    3. Here’s a question: Why, when someone’s account is taken down, do they receive a message stating that the decision “cannot be appealed.” Surely, in the case of a Sarah Palin, Justin Bieber, or Roger Ebert, the problem is easily solved, but the vast majority of individuals who lose their accounts for a TOS violation–real or in error–don’t know how to appeal the decision, and are informed by your User Operations team that they cannot do so. How about fixing that?

      1. That’s a great point, Jillian.

  10. Why does Facebook’s fine print claim it owns all data its users put into it? Why does Facebook summarily close accounts and destroy social networks after acting as judge, jury, and executioner, with zero due process of law, then keep all the data on the accounts for its own use? Why is there no due process of law in the first place? Because Facebook is above the law, and is evil. Why do they think they can close accounts in the first place? If some material somebody posted is objectionable, why don’t they just delete that and leave the account going? Why don’t they restrict posting privileges after multiple violations? Where’s the appeal process when somebody’s account is completely closed and can’t even log on to send an appeal?

    Save them the trouble and close your account now and boycott Facebook forever!

  11. Joseph Payton Thursday, June 23, 2011

    Facebook has every right to censor anything they would like. It is private property.

    Are we forgetting that everyone who uses the service is doing so willingly and without any cost or ownership? (unless of course, they own actual shares of the company)

    In social networking the users are not the consumer, they’re the product.

    1. WORD. For more on this, here’s a great piece: http://technosociology.org/?p=131

    2. Good point, Joseph — and of course Facebook has the right to do whatever they wish. I just think more people should keep that in mind.

  12. Sean A. Flesch Thursday, June 23, 2011

    I see a lot of complaints (e.g. http://bit.ly/mnceWF) from nursing moms about breastfeeding photos and even personal pages being deleted. Facebook needs to better train its people on this particular issue. Even if the pages come back, it’s not easy to fight the decision, not to mention the time the user loses while their page is down.

  13. Facebook is not a public space. It’s the online equivalent of Disneyland – a massive, corporate-run, private space with lots of rules and controls that, if broken, can result in your immediate expulsion. One thing that’s always bugged me about Facebook is that you can only “like” something and not “dislike” it (hence the Disney analogy). I prefer to live outside the walls of Facebook, just as I’d rather visit New York, Chicago, New Orleans, Paris, Barcelona, Beijing, Delhi, or a thousand other great places before I’d ever set foot in Disneyland.

  14. Several commenters have noted that because Facebook is a private company it has the right to do whatever it wants. But that is not fully the case under US law.

    A good analogy is the rights and responsibilities of a landlord. Depending on where you live, landlords have a set of responsibilities, including avoiding discrimination based on race or religion, keeping the unit in good repair, not entering the unit without notice, and following a process for eviction.

    These laws were put in place because renters organized to create these legal rights. The same process could happen with services like Facebook, which acts as a landlord from which we rent virtual space. There is a good argument to be made that we need to create the online equivalent of tenants rights.

    A summary of California laws:
    http://www.dca.ca.gov/publications/landlordbook/index.shtml

    1. aslevin,

      let me assume that you eat with your arse, as your comment clearly shows that you are a clueless Facebook drone.

      Facebook owns whatever you post on Facebook. Period. No ifs, ands, or buts.

      You cannot even delete your account.

      Prove my two points wrong and I will call you daddy.

      Otherwise, your theory does not apply here.

  15. CassieSunset Monday, June 27, 2011

    1. Censorship is abhorrent.
    2. Privacy in Facebook is a joke.
    3. The real question: who is using ‘Facebook accounts database’ to keep records on individuals?
    4. History shows us that governments can’t be trusted with our personal details.

    Maybe it is time for people everywhere to insist on their rights or dump social media sites that can’t or won’t do the right thing…

    Personally, I wouldn’t touch Facebook with a barge pole. I strongly urge people everywhere to place a very high value on their own personal details or personal photos, and be very wary of pasting those details anywhere on the internet.

  16. I run a well-established blog which is 10 years old. Recently I learned that Facebook members cannot link to my idiotprogrammer blog because it is “abusive.” I have no idea what that means or how my site could possibly be interpreted as abusive. Even I can’t link to my own blog! Facebook has a mechanism for reporting a block you think is mistaken; however, a week has gone by and I have received no word about it. I for one would like to know WHY FB regards my site as offensive. I have several thousand posts; I seriously doubt that any of the posts are offensive — unless someone reported it for political reasons (I am a liberal and climate change activist).

    If someone reported my site as offensive, I’d be curious to know which FB member did so. If I had already posted the link on my own profile, shouldn’t Facebook be obligated to notify me that it has taken measures against me?

    The Facebook help pages don’t mention anything about how to get this kind of situation resolved.

    Finally I find it interesting that Dave Willner (the commenter who worked in FB User Operations) said “Please don’t hesitate to contact my colleagues in PR if you have any suggestions. Thanks.”

    At the same time, he didn’t leave contact information or how to find out information about why a member’s own domain might have been blocked. As well-mannered as the response seems to be, it seemed to reassert the right of Facebook to arbitrarily block websites without needing to explain itself or answer to the site owner.

    I don’t even have a firm idea about whether FB will contact me with more information or make a
    decision or leave me permanently in the dark.

    Robert Nagle
    http://www.imaginaryplanet.net/weblogs/idiotprogrammer/
    idiotprogrammer at gmail

  17. You get what you pay for.

    Free services have certain restrictions. They have the right.
