Filtered: the system behind what gets censored is hard to pin down

The internet censorship programme you’re not allowed to know about

By Jane Fae

If you thought filtering of terror-related sites was no more than an unfulfilled gleam in the eyes of the Home Office, think again. They’ve been doing it for the best part of five years.

The announcement last autumn about how home internet access might soon be subject to such filtering is not some new initiative, but an extension of one with a significant track record.

Over the past few months, the focus has been on controls applied to domestic internet access. These are, at present, two-fold. There is Cleanfeed, which blocks access to a list of child abuse sites maintained by the Internet Watch Foundation (IWF), as well as, controversially, a lengthening list of sites alleged to encourage digital piracy. And there are the various filtering options that domestic internet service providers (ISPs) have adopted in response to government demands that they save us all from a deluge of online smut.

The set-up with regards to mobile phone filtering and public wi-fi is not dissimilar: local UK-based service providers backed by mostly non-UK filters.

There is, however, a fourth channel of access to the internet – and it is here that police and the Home Office have been intervening to prevent the public from accessing material they believe to be terror-related. This is public estate internet: in schools and universities, hospitals and government buildings. And unlike other internet channels, it's filtered by a mix of companies, including many that are UK-based.

To piece the story together, you need to combine various freedom of information requests and parliamentary responses. This is what you get at the end of it:

  • Between November 2008 and February 2011, the Labour government ran a pilot project where sites identified as hosting material in contravention of existing terror laws would be blocked to 'public estate' access.
  • Sites were flagged initially by the police and, since 2010, by the Counter Terror Internet Referrals Unit (CTIRU), passed to the Crown Prosecution Service (CPS) for evaluation, and then fed to filtering software companies.
  • Following a pause for evaluation between February 2011 and June 2011, the project re-commenced in July 2011.
  • According to official government statements, since 2010 some 5,700 UK-based sites have been taken down, while some 1,000 overseas sites have been filtered.

The Home Office said: "The focus has been on voluntary end-user filtering. Unlike blocking, which occurs at the network level and over which users have no choice, filtering software allows end users to choose to apply filtering at the desktop level.

"The filtering list is provided to companies who supply filtering products across the public estate, including schools and libraries… There is no formal appeal process but if there is concern regarding the filtering of a specific URL containing illegal material, contact should be made with Home Office".

So much the Home Office is happy to confirm. However, on the questions of which sites are on the list and which companies are doing the filtering, it is rather more reticent.

Over the past three years, TJ McIntyre, a lecturer in law at University College Dublin, has doggedly sought answers to a number of questions using freedom of information legislation.

His pursuit boils down to three questions: which sites are being filtered, which companies are doing the filtering and what liabilities would these companies incur if they filtered a site in error?

To date, he has had little success. To most of his questions, the Home Office initially cited exemptions on grounds of law enforcement and national security. When McIntyre challenged this refusal, an official review agreed that the rejection had been over-hasty and that the Home Office had erred – not, that is, by failing to answer his question, but by citing the wrong reason for doing so.

We do know the name of one UK company that was involved in filtering alleged terror sites. In a parliamentary answer given back in April 2009, Vernon Coaker, then a minister at the Home Office, revealed that one of the companies carrying out such filtering was Smoothwall. However, he declined to provide a fuller list. The Home Office has since cited fears that UK companies might be subject to DoS (denial-of-service) and other attacks if their participation in this scheme were revealed.

It is unlikely that we will learn, any time soon, who is taking part in this 'voluntary' scheme – or what they are blocking – although, according to the Home Office, that should not matter: "All material filtered from the public estate is …considered to be illegal under the Terrorism Act 2006, as assessed by the Crown Prosecution Service (CPS)".

That assertion is questionable – on a couple of counts. In July 2013, just a few days after the CTIRU owned up to having filtered 1,000 sites, the CPS claimed: "The scheme has so far seen the review by specialist prosecutors in CTD [counter-terrorism division] of more than 50 submissions from CTIRU".

If only 50-odd submissions have been reviewed against some 1,000 filtered overseas sites, it suggests that not every site added to the filtering list is reviewed by the CPS.

It also raises further questions over the censorship involved. When it comes to obscenity, the UK already operates a system of prior restraint through the back door: films are routinely censored by the British Board of Film Classification on the basis of how the CPS interprets the Obscene Publications Act. Those interpretations have rarely been tested in court and, when they are, the CPS does not always win. It is therefore equally possible that the CPS' evaluation of whether a site would breach the Terrorism Act 2006 is also open to challenge.

There is also the matter of jurisdiction. Back when the government was arguing the need for a new law on possession of extreme porn, the Home Office made very clear that the obscene publications law was inadequate because it could not be used to block websites hosted abroad. New legislation was therefore needed and duly passed. If the obscenity laws could not reach foreign-hosted sites, the same doubt hangs over assessing, under the Terrorism Act 2006, the roughly 1,000 overseas sites now being filtered.

So despite reassurances from the Home Office and CPS, it is not unthinkable that they could make mistakes. This is precisely what happened in 2008, when two academics – Rizwaan Sabir and Hicham Yezza – were arrested and held for six days for possession of a "terror training manual", which was widely available as an academic study tool.

If the owner of a website ever did find out they had been blocked, who is liable – and to what extent?

This question is important, given the announcement last autumn by then-crime and security minister James Brokenshire that the government is preparing to require broadband companies to block extremist websites. That announcement also talked of empowering "a specialist unit" – possibly the CTIRU – to identify and report content that fell within this category.

That is problematic on several grounds. As already highlighted, the main broadband companies use filter providers that are not UK-based: the tangle of a UK police unit providing a list to overseas companies, which is then fed back to UK ISPs, is a liability nightmare.

The alternative and somewhat neater option is to feed such websites through Cleanfeed. The danger of that approach, however, is that the Cleanfeed system has been defended against its staunchest critics on the grounds that it is only ever intended to block child abuse material. A number of child protection experts have expressed fears that using Cleanfeed in this way could seriously damage child protection in the UK.

It is against this background, therefore, that we now await the result of a first-tier tribunal appeal by TJ McIntyre which took place last week. This appeal is against the Home Office decision not to allow him to view Home Office material on the possible liability it and/or the filtering companies would face for wrongful blocking.

So far, the Home Office have claimed that the only relevant material is contained in the licence agreement with those companies. According to McIntyre: "That makes me think that there is some form of indemnity in place whereby the Home Office promises to pay any damages that might be incurred by the companies if the Home Office wrongfully designates a site to be blocked. But of course we don't know that for sure."

The result of that appeal is expected shortly. Depending on the outcome, we might soon be finding out a lot more about the shape of internet filtering to come – and whether the Home Office is secretly indemnifying UK companies against the costs of acting unlawfully.

Jane Fae is a feminist and campaigner on issues of political and sexual freedom.