
Summary:

The British government is considering once again whether to apply mandatory filters to block all adult content on the internet. But what’s the point of campaigning for a technological solution when the technology itself doesn’t work properly?


Britain is, once again, looking at the possibility of applying pressure on internet users to filter out pornography — a policy loved by politicians and disliked by internet providers that, like a cat with a hairball, comes up every few months.

As the BBC reported last week, the U.K. government is circulating a discussion paper which looks at a range of censorship options, including full, mandatory adult content blocks for everyone. But the preferred option appears to be a system that is one step down from a total ban. The paper proposes a service that — by law — automatically puts a content filter into place for new internet subscribers, but gives them the option to lift the block later.

The latest system, called “active choice-plus”, is aimed at reaching a compromise.

It would automatically block adult content, but would set users a question, along the lines of whether they want to change this to gain access to sites promoting pornography, violence and other adult-only themes.
This is partly based on “Nudge” theory, a US concept which states that persuasion, rather than enforcement, can be an effective way of changing behaviour. Downing Street has set up a unit to explore such ideas.
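To make the mechanism concrete, here is a minimal sketch of what a default-on, opt-out setting along “active choice-plus” lines might look like. It is purely illustrative: the Subscriber class and activation_prompt function are invented for this example and do not correspond to any ISP’s actual sign-up system.

```python
# Hypothetical sketch of an "active choice-plus" flow: the adult-content
# filter is on by default, and the new subscriber is asked an unavoidable
# question at activation about whether to lift it.
from dataclasses import dataclass

@dataclass
class Subscriber:
    name: str
    adult_filter_enabled: bool = True  # default-on: filtered unless the user opts out

def activation_prompt(subscriber: Subscriber, wants_to_lift_filter: bool) -> Subscriber:
    """Record the answer to the one-off question asked when the account is set up."""
    if wants_to_lift_filter:
        subscriber.adult_filter_enabled = False
    return subscriber

# A new account starts filtered; this subscriber chooses to keep it that way.
account = activation_prompt(Subscriber("new customer"), wants_to_lift_filter=False)
print(account.adult_filter_enabled)  # True
```

The point of the design is that the burden sits with the adult who wants the filter lifted, rather than with the parent who wants it applied.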

The consultation had already been trailed earlier this year, but even so it has given some civil liberties advocates a reason to get their placards out to protest.

“Giving parents tools is great, but ‘Nanny State’ filtering using government approved technologies is bound to fail the people it is designed to protect,” said Jim Killock, executive director of the Open Rights Group. “This is a Government looking for headline-grabbing solutions to complex problems. They need to think again.”

It’s worth remembering that nobody here is suggesting that children should be forced to stare at hardcore adult videos online for hours on end. This isn’t some mirror of A Clockwork Orange. But still, the pro-censorship campaign has driven on with its family values message, circulating a petition in favor of mandatory blocks that now has 100,000 signatures.

Forget about the rhetoric or the controversy or the guiding philosophy, though. There is a much simpler — and much more real — problem facing the U.K.’s proposals to make porn filters opt-out.

The porn filters they’re talking about are absolutely terrible at doing the job they’re being asked to do.

Here’s how it works

Given that any filters which are put in place will need to be built and operated by ISPs, I contacted Britain’s four biggest internet providers to ask them about the filtering services they have in place already. And the reality is that their methods are all very similar — and widely criticized.

BT, which has more than 6 million subscribers in Britain, told me that it already uses “active choice”, meaning it presents new users with an unavoidable choice over whether to apply parental controls or not. And if people do opt for controls, they’re subject to a software-based system developed by McAfee that runs alongside an ISP-level filtering system, Cleanfeed.

“It filters out 35 categories of content, including porn, adult, suicide, anorexia etc,” said a spokesman. “The blacklist, as such, is developed by McAfee.”

Sky also operates a McAfee-based system, while Virgin Media uses one developed by Trend Micro.

TalkTalk, meanwhile, uses a network-level filtering system called Homesafe, which was launched a little over a year ago. It, too, works on a categorization basis. You can ask it to filter out all sorts of things, based on a range of broad subjects: websites about drugs and alcohol, gambling, games, pornography and more.

The categorization and filtering is based on technology from Huawei Symantec, the joint venture between the Chinese electronics company and the American security company.
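Strip away the branding and these products all rest on the same model: a vendor-maintained database that assigns each website to a broad category, plus a per-household set of categories to block, checked against every request. The sketch below shows that basic shape; the lookup table, household settings and function names are invented for illustration and are not Homesafe’s or McAfee’s actual data structures.

```python
# Rough sketch of category-based filtering of the kind the ISPs describe.
# The categorization table and per-household settings below are invented;
# real deployments rely on a vendor-maintained database covering millions of sites.

SITE_CATEGORIES = {
    "adult-video.example": "pornography",
    "poker.example": "gambling",
    "brewing-forum.example": "drugs and alcohol",
    "mmo-games.example": "games",
}

# Each subscriber line carries its own set of categories to filter out.
HOUSEHOLD_SETTINGS = {
    "line-0001": {"pornography", "gambling"},  # parental controls switched on
    "line-0002": set(),                        # filtering lifted entirely
}

def allow_request(line_id: str, host: str) -> bool:
    """Return True if the household's settings permit fetching this host."""
    blocked = HOUSEHOLD_SETTINGS.get(line_id, set())
    category = SITE_CATEGORIES.get(host)  # None if the site has never been categorized
    return category not in blocked

print(allow_request("line-0001", "adult-video.example"))  # False: blocked category
print(allow_request("line-0002", "adult-video.example"))  # True: controls lifted
```

Whether any of this works depends entirely on the coverage and accuracy of that categorization table.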

But the problem with these services — and TalkTalk’s in particular — is that they are full of holes.

Basic flaws

Last December The Daily Telegraph reported how Homesafe failed to block access to Pornhub, the world’s third-largest pornography provider.

And things haven’t improved: an investigation by PC Pro last week found “basic flaws” that meant it was still possible to access a wide range of pornographic images through nothing more complicated than a simple Google search. And other filters were equally ropey.

TalkTalk’s filters are also hugely inconsistent in what they choose to blacklist. Social network controls bar access to Facebook, Twitter and LinkedIn, but not to Google+, StumbleUpon or reddit – including “subreddits” dedicated to sexual content. Likewise, the photography site Flickr is blocked, but not the “nude” section of fellow photography site 500px.

The reality is not just that TalkTalk’s provisions are weak. It’s that category filters rarely work. They’re either too broadly applied, or not broadly applied enough. They’re arbitrary. They’re easily circumvented. And they’re unable to keep up with change.
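A toy example makes the point. Take a category database of the sort sketched earlier (every host and category here is made up) and the gaps the investigations found fall straight out of the model:

```python
# Toy demonstration of the failure modes of category-based blocking.
# Every host and category below is invented for the example.

CATEGORY_DB = {
    "adult-video.example": "pornography",  # correctly categorized, so it is blocked
    "small-tech-blog.example": "blogs",    # swept up by an over-broad category
}
BLOCKED_CATEGORIES = {"pornography", "blogs"}

def filtered(host: str) -> bool:
    return CATEGORY_DB.get(host) in BLOCKED_CATEGORIES

print(filtered("adult-video.example"))         # True: the obvious case is caught
print(filtered("images.search.example"))       # False: the same pictures reached through a
                                               #        search engine's image results pass through
print(filtered("adult-video-mirror.example"))  # False: a new, uncategorized mirror passes too
print(filtered("small-tech-blog.example"))     # True: a harmless site is blocked along with its category
```

Too broad in one direction and full of holes in the other, which is exactly the pattern the investigations keep finding.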

And it’s not just journalists fishing for gotchas, either; anecdotal experience tells the same story.

As we discovered when we looked into the content filters used by Britain’s mobile operators, these categorizations are often haphazard and extremely broadly applied. It’s the kind of patchwork approach that ends up with GigaOM blocked (because all blogs get blocked as a matter of course) and yet leaves the door open to all kinds of unpleasant content. It’s a lottery. It’s Swiss cheese.

So even if you think the mandatory filters are a good idea, the question has to be whether these filters are worthwhile. And if they are not, what is it exactly that you are advocating? An idea? An ambition? A hope that we can achieve better living through technology?

Or are you just advocating vaporware that settles the mind without ever really fixing the problem?

Nobody wants children to be subjected to inappropriate content, but a false sense of security is no security at all.

Illusionist photograph copyright Shutterstock/Olly


  1. Chris Puttick Monday, July 2, 2012

    Couldn’t agree more about the efficacy of current filters; however there’s an additional issue; we should really be talking about age-appropriate content (all those who aren’t just out to block all that filthy smut!), and age-appropriateness is age-related. A catch-all filter at the connection level is not a solution to that; it either over-filters or under-filters (or both…). A proper solution handles different levels depending on the user being protected.

    We are, as they say, working on it :)

  2. So what they’re saying is you can have sex at 16 but can’t watch other people do it??

  3. I am all for parents being able to protect their children, especially on the internet. But is enforcing everyone as a whole really fair? Why don’t the parents use one of the literally thousands of already available filtering programs and extensions? You can choose between hundreds of free programs or extensions that work pretty decent and usually require very little effort to install or use. Heck, more anti-viruses are even starting to support site filtering in their features.

Comments have been disabled for this post