6 Comments

Summary:

Internet censorship is once again in the news after a federal judge ruled that a proposed Washington law aiming to prevent child prostitution likely conflicts with the Communications Decency Act. A bigger question is why free speech still reigns online except when copyright is involved.


Even child prostitution, it seems, isn’t a good enough reason to force internet service providers to monitor the content they host. Citing conflicts with the Communications Decency Act, as well as various constitutional problems (including First Amendment concerns), a federal judge recently issued a preliminary injunction barring the state of Washington from enforcing a law that would force services like Backpage.com to personally verify the age of individuals offering sexual services in classified ads. The law would have made sites like Backpage.com, a notorious marketplace for sex sales, criminally responsible if their ads led to sexual abuse of a minor.

The decision is probably the right one given the language of the law, but this case should serve as a sign that something’s wrong in how we prioritize online content. Why does the entertainment industry get a stick with which to beat web sites while victims of child prostitution are left empty-handed?

Laws should encourage free speech online

Whether we’re talking about classified ads, obscene material or copyright, the arguments on both sides are generally the same. Those who propose laws typically see a criminal enterprise and claim it’s time to regulate the platforms that enable these crimes to take place. Opponents say the bills will have unduly burdensome effects on providers by forcing them to monitor every piece of content that hits their servers. Alternatively, they say, such bills will chill free speech by encouraging providers to drastically limit the types of content they host in order to avoid the burden of monitoring.

The last time a proposed law — the Stop Online Piracy Act — tried to force web sites and service providers to monitor content proactively, the companies and web users it would have affected reacted fiercely. They were so outraged they blacked out parts of the web and launched crowdsourced movements to write new internet constitutions and influence internet policy. Fair enough.

Due to the nature of how the internet and web operate, it’s easy to side with those wanting to protect providers and web sites. Unless they’re actively encouraging the criminal behavior that laws want to regulate, it’s difficult to hold sites and services accountable for the activity (and complaints) of potentially millions of users.

Hence Section 230 of the Communications Decency Act, which generally exempts service providers from liability for user-provided content, even when providers are notified such content might be obscene or otherwise illegal. This is the statute on which the judge in the aforementioned case — brought by Backpage.com and the Internet Archive — centered his decision to enjoin enforcement of Washington’s anti-child-prostitution law. (For a good explanation of the extent of this immunity from liability, and a lengthy hypothetical application to Wikipedia content, check out this 2006 article from the Harvard Law Review.)

Unless we’re talking about copyright

However, as anyone even casually familiar with the Digital Millennium Copyright Act knows, not all content is created equal. That act’s widely cited “safe harbor” provision actually restricts a great deal of the immunity the CDA would normally provide sites like YouTube against claims of copyright infringement. In fact, the CDA expressly excludes intellectual property law from the scope of its coverage.

Under the DMCA, when service providers receive notice of allegedly infringing content, they must either undertake the effort to determine whether it’s legally infringing or simply take the content down unless and until the user who posted it rebuts the purported rights holder’s claim. This process can be terribly burdensome on service providers that don’t simply want to act as a rubber stamp for censorship by removing whatever content is contested. Indeed — as Google has shown time and time again — there are a lot of false, or at least questionable, claims filed under the DMCA.

If it works for copyright, why not prostitution?

It’s difficult to comprehend why it’s acceptable to impose burdens on service providers and potentially chill free speech in the name of preventing copyright infringement, but not in the name of preventing prostitution. Why should Facebook, for example, be forced to act upon a claim about someone posting a video without permission but not about someone trying to sell a minor for sex?

To be clear, the proposed law in Washington might be a bit extreme in all but requiring service providers to attempt to verify in person the ages of the advertised escorts. For a variety of reasons — including the global nature of the web and questions about jurisdiction — this is probably infeasible. The Washington law is also far too broad, potentially covering everyone from Backpage.com (the lead plaintiff in the case) to co-plaintiff and intervenor the Internet Archive.

It might not be infeasible, however, to require sites and service providers to somehow examine claims of child prostitution the way they do copyright claims. (I suspect many already do in some cases, and almost all web site terms of service grant them permission to remove objectionable content.) And if laws were rewritten to cover only sections of sites that advertise “escort services” or other clear euphemisms for prostitution, that’s certainly less burdensome than imposing requirements across every piece of content on the web.

Regulating content on the internet is a complex issue, and attempts to do so in a meaningful manner often skirt the bounds of what’s constitutional. It’s unclear what methods for fighting a problem such as child prostitution would be both effective and legal. But it’s also debatable whether the DMCA is a fair or effective law. If Congress thinks it’s all right to suspend concerns about free speech when it comes to the background song in a YouTube video, maybe doing so for allegations of child abuse isn’t such a crazy idea.

Image courtesy of Shutterstock user Rugierro S.

  1. This isn’t a “proposed” law in Washington state. It was passed by the legislature and signed by the governor. It’s now being challenged in the courts as you correctly report.

  2. “Why should Facebook, for example, be forced to act upon a claim about someone posting a video without permission but not about someone trying to sell a minor for sex?”

    Because the content providers’ lobbies care about the video posting but not the sex selling.

    1. Derrick Harris Monday, July 30, 2012

      Exactly.

  3. I do not understand the practical basis of this type of enforcement.
    My understanding is that if I rent a server (or server space) to post content, I am doing nothing more than renting space.
    If the server provider is doing nothing more than providing space for content, for free or for a fee, why should they be considered to be in the position of performing policing duties?

    Would this not be the same as an individual or a business renting commercial or private property (space) to an individual or to groups? I know of no rental businesses or individual landlords who are in the business of acting as police. Granted, most commercial buildings provide security, but that is for the protection and safety of the renters and their patrons.

    If the commercial landlord is renting to a bookstore, a music store, or a call service, are they going to monitor every item in the stores for legal content, or even the call service for prostitution? No, they are going to collect their rents as long as there is no physical destruction to the real estate.

    How are either of these situations any different from one another?

    1. Derrick Harris Friday, August 3, 2012

      The short answer is that they’re not too different (in theory, at least), which is why enforcement of WA’s law was enjoined. When it comes to copyright, though, if the provider gets notice of allegedly illegal material, it has to act.

      1. Free PUA Bootcamp! Wednesday, August 8, 2012

        What they should do is bring Section 230 in line with the DMCA for defamation and other content. It’s absurd that people can “Google-bomb” someone and ruin their reputation while Google can’t be sued for republishing any lie published by someone with an axe to grind. The right to reputation used to be valued so much that we could challenge someone to a DUEL TO THE DEATH if they defamed us. The DMCA proves that the takedown system works just fine. All someone has to do to put content back up is to file a counter-notice if they want. Even more chilling is that a CA ruling gives “home court” to the PLAINTIFF in a copyright case. If you violate someone’s copyright, they can sue you in THEIR jurisdiction because the injury occurred there, according to one ruling in the California federal courts.


Comments have been disabled for this post