2014-04-06



Let's look at how the government and internet companies have started filtering portions of the World Wide Web away from you.

You've probably heard of the 'Great Firewall of China', officially known as the Golden Shield Project. It's a censorship program that prevents residents of China from accessing various websites its government deems inappropriate or harmful. The thing is, that's a hell of a long list of sites: Wikipedia, Facebook, YouTube, Twitter, and Google Drive are all on it, and the strictness of the filters is often pointed to as an example of the actions of an overbearing government.

Imagine not being able to access everyday sites like Facebook, just because the government thought you shouldn't. Sounds incredible, doesn't it? Surely something like that could never happen here? Well, while it may be the case that censorship on that scale would never be allowed to pass in the UK, something insidious is happening to our internet access: our government and various businesses are taking it upon themselves to decide what grown adults ought to be able to access online.

The argument for censoring content tends to start with pornography, because it's an easy target. Most people agree that young children ought to be kept away from hardcore porn, and most people also aren't overly keen to argue in favour of free access to it, because it's somewhat embarrassing, isn't it? No-one wants to be the guy complaining about having his masturbatory material taken away. The problem is, pornography is legendarily hard to define, and so in trying to block 'harmful' content, filters inevitably start censoring innocent or useful information into the bargain. Things like sex education sites or websites for domestic violence shelters; the kind of stuff it might be actively harmful to restrict people's access to.

Then there's the fact that once filters are in place, and once the required lists of blacklisted websites or search terms have been compiled, it's very easy for a few extra items to make their way onto that list in a piecemeal way. Even the most well-intentioned internet filter can quickly become a tool of oppression and censorship in the wrong (or even slightly misguided) hands. If the government or Google can stop you from seeing one kind of website, they can potentially stop you from seeing a hundred, and you'd have no way of even knowing about it.

It's scary stuff, and a difficult topic to discuss. Exaggerating the scale of the issue tends to just make rights campaigners look a bit nuts, which makes them easier to dismiss, but it's also important to realise how big a problem compulsory internet filters could become. Let's swallow our embarrassment, and take a look at what's going on...



The government's fumbled filter

For as long as they've been in power, the Conservatives seem to have been trying to institute some kind of internet filtering system or other. At first, it seemed laughable; anyone who knew their way around a computer could see why filters wouldn't work properly, for half a dozen different reasons. However, the idea wouldn't go away.

In April 2012, Prime Minister David Cameron supported a proposal that would see all pornography blocked at ISP level, with users needing to opt in to access adult material, but after a public consultation the idea seemed unworkable, because only about 35% of the public liked it. Then in December 2012, Cameron announced that he would instead be supporting a plan to bring in sophisticated web filters parents could use on their computers to protect their own kids. The idea was that child-locking software would be pre-installed on all new computers sold in the UK, for parents to configure for themselves. Again, tech-savvy types were sceptical, because although that sounded like a good enough idea in theory, it wasn't clear how the government thought it was going to implement the system.

So ISP-level filters were brought back into the picture. In July last year, Cameron again declared that service providers would have to take responsibility for filtering the content their users could access. A voluntary agreement was set up at first, with the idea that legislation to make filters compulsory would follow. So ISPs started bringing in filtering - BT, Virgin, Sky and TalkTalk have all instituted government-endorsed porn filters.

Investigating the Issue

These filters are supposed to block hardcore pornography, but to no-one's surprise - and as an investigation by Newsnight showed - they don't work. Many porn sites are accessible even with the filters turned on and, more worryingly, many non-pornographic sites, including sex education sites, rape crisis centres, and various blogs seemingly picked at random, have been blocked. Slow clap all round.

The government's next move, following that high-profile investigation, is to create a whitelist of websites that ISPs shouldn't block with their porn filters. That list will largely consist of charity and education websites, those set up to provide kids with sensible information about sex and drugs, and those set up to help vulnerable people.

It'll also, apparently, mean that other news and information sites that were 'inadvertently' caught up in the block will be unblocked. This is work that probably ought to have been done before the filters were launched, but maybe it took a BBC investigation to point out that the filters aren't working. If that's the case, let's hope that the BBC keeps an eye on what's going on with these filtering schemes, to avoid any further problems with sites accidentally getting added to a blacklist. It'd be one thing if all of this could be blamed squarely on the Tories - there's an election coming up next year, after all - but the Labour Party has already made it clear that it supports the idea of ISP-level filtering too, and even the Liberal Democrats are reluctant to speak up against it.

For now, it's still possible to opt out of the filters if you're uncomfortable with the idea of the government deciding what you can see online, but that's another thing we'll just have to hope remains the case. At the moment, the legislation doesn't say anything about ensuring an opt-out remains a feature of the filtering. Scary, isn't it?

Tumblr's Clean-up Job

What's scarier is that the government isn't the only party interested in limiting what you can see online. Big companies are getting involved in the content-filtering game too. When Yahoo! bought Tumblr last year, a lot of the site's users were worried that it'd mean things would change. After all, the site is basically made up of copyright-infringing images and user infighting, which a big corporation might be a little wary of paying for. While those users were assured that nothing would change, and that their beloved site would carry on just as it had before, inevitably it didn't.

In July last year, Tumblr introduced a default 'Safe Mode' for users, which filtered out material classified as 'not safe for work' (NSFW). Users were asked to specify whether their blogs were NSFW, and Tumblr staff could also classify blogs that way, in case users didn't feel like getting rid of their entire audience. Another new classification, Adult, would mean that all posts from a blog would be blocked from showing up in search results or tag pages; essentially, their content would only be visible to their followers. Tumblr's mobile app searches were even more locked down, and a range of search terms were blocked completely. Any searches using such terms would turn up no results. Helpfully, the list included things like 'gay' and 'depression'.

As you'd imagine, that didn't go down well with Tumblr users. A blog post from Tumblr creator David Karp (http://staff.tumblr.com/post/5590655...of-you-who-are) attempted to clarify that the settings were being worked on, that Tumblr didn't intend to seem so homophobic, and that users could change their settings in order to see adult material, but the new settings were difficult to find, and the apps still didn't let you search for anything remotely dodgy.

Tumblr users voted with their feet. According to Matt Mullenweg, founder of WordPress, the number of people porting their blogs from Tumblr to WordPress rose from around 400 per hour to more than 72,000 per hour after the Yahoo! deal - that's a staggering number of unhappy customers. In the wake of this, Tumblr's Safe Mode option vanished from the web platform, and a new on/off switch has since been added to the mobile apps, allowing users to easily choose whether they want to see NSFW material. A victory for free speech, then? Maybe, for the time being, though it'll be interesting to see what other kinds of filters Tumblr brings in in future.

Google's Safer Searches

If you're not a Tumblr user, maybe you're not that bothered which search terms it does or doesn't let you use. That's fair enough, but how about Google, which you're far more likely to be using for a range of different purposes? It is also playing with ways to filter its search results, potentially blocking all sorts of content from showing up when you look for it. Google has had a Safe Search option for a long time, which was easily accessible from the image search page and did a reasonably good job of hiding any explicit results from the list.

However, in December, it changed things: most explicit content was now blocked completely, even with Safe Search turned off, and if the search engine spotted users searching for porn-related keywords for the first time, it popped up a dialogue window asking if they were sure. When users and tech sites started paying attention to Google's weird filtering, it issued a statement explaining what was going on, to wit: "We are not censoring any adult content, and want to show users exactly what they are looking for, but we aim not to show sexually-explicit results unless a user is specifically searching for them. We use algorithms to select the most relevant results for a given query. If you're looking for adult content, you can find it without having to change the default setting; you just may need to be more explicit in your query if your search terms are potentially ambiguous."

That sounds perfectly reasonable, and it seems the system is being tweaked all the time; while, in December, typing in the name of a specific porn star (it was for research, don't ask) turned up no results whatsoever (even for her non-porn projects), carrying out that same search now does turn up some results. This suggests Google is refining the effect of its filters, and adjusting them where they're blocking too much. It's still worrying, though. How many of us rely on Google to find whatever information we're looking for, for the purposes of work as well as 'play'? It already uses complex algorithms to find the most relevant content for the terms we're searching for, so if Google starts adding strange, morally motivated filters to refine its search results further, it can pretty much decide to hide some stuff completely.

It's not like Google is likely to be doing any of this for the fun of it. The UK government has repeatedly threatened to take Google to task over the kinds of results its search engine returns, in order to make it more responsible for the content found through its site, so Google has good reason to find a way to filter out objectionable content. It's just that it shouldn't have to be the one to decide. Google, after all, didn't get elected to govern this country.

Fight for Your Rights

The good news is that, for every step the UK government or a company has taken towards censorship, users have made them step back again. That doesn't mean it's not something we ought to be paying attention to, though. If anything, it means we ought to be even more vigilant when anyone starts talking about bringing in new filters to 'protect' us.

If what's kept the internet free so far is people fighting for it, we need to keep fighting. Again, it doesn't do any good to resort to hyperbole here, but freedom of information is important, and letting anyone - whether it's the government or a search engine - take responsibility for the information grown adults are allowed to provide to one another is a dangerous, scary precedent to set.
