People are lining up to sue sites like Yelp and Ripoff Report over their users’ misbehavior, but courts continue to slam the door in their faces.
A new report shows the sites’ traditional legal shield is still strong, but that some are trying to use intellectual property laws to crack it.
In “2011 State of the Law Regarding Website Owner Liability for User-Generated Content,” Internet lawyer Catherine Gellis offers a helpful update on websites’ ongoing effort to fight off lawsuits over content created by their users.
Gellis found that websites’ core legal shield (Section 230 of the Communications Decency Act) continued to gain traction as courts last year again confirmed that businesses like auctioneers and consumer review sites can’t be sued over what their users do.
The legal shield, created in 1996 to ensure that the fledgling Internet economy was not brought down by lawsuits, works by ensuring websites are not responsible for obscene, defamatory or criminal acts of their users. The shield stays up as long as the sites don’t take an active part in their users’ activity — if they do, they lose their immunity and become instead content creators who can be sued like anyone else. Overall, Gellis notes the shield may even be getting stronger — recent cases show sites like Yelp and Roommates.com are protected even if they curate content.
While the ongoing strength of Section 230 is good news for Internet companies, the bad news is that plaintiffs are trying even harder to use intellectual property law as a backdoor around it.
What this means in practice is that aggrieved individuals are gussying up libel complaints as copyright or trademark cases. Doctors and dentists, for instance, have been trying to use copyright law to force websites to take down negative reviews.
Such attempts to short-circuit websites’ legal shields are hardly new, of course. In Australia, for instance, a man sued Twitter last week after a media personality reportedly defamed him in a tweet. The lawsuit, which is likely to fail, is part of a long-running effort by companies and individuals to make Internet companies responsible for what appears on their platforms. Here is Gellis’ conclusion:
Parties looking to hold someone accountable for content will always be tempted to “shoot the messenger”; future cases will necessarily continue to explore the bounds of just how bulletproof they are.