The Internet Watch Foundation (IWF) is a registered charity[3] based in Cambridgeshire, England. It states that its remit is "to minimise the availability of 'potentially criminal' Internet content, specifically images of child sexual abuse (including child pornography) hosted anywhere, and criminally obscene adult content in the UK". Content inciting racial hatred was removed from the IWF's remit after a police website was set up for the purpose in April 2011.[4] The IWF clarifies on its website that potentially criminal activity is addressed, as content can be confirmed to be criminal only by a court of law. As part of its function, the IWF says that it will "supply partners with an accurate and current URL list to enable blocking of child sexual abuse content". It has "an excellent and responsive national Hotline reporting service" for receiving reports from the public.[5] In addition to receiving referrals from the public, its agents also proactively search the peer-to-peer networks of the deep web to identify potentially illegal images. It can then ask service providers to take down the websites containing the images or to block them if they fall outside UK jurisdiction.[6]

The IWF operates in informal partnership with the police, government, public, and Internet service providers. Originally formed to police suspected child pornography online, the IWF's remit was later expanded to cover criminally obscene material.[8]

The IWF is governed by a Board of Trustees which consists of an independent chair, six non-industry representatives, and three industry representatives. The Board monitors and reviews IWF's remit, strategy, policy and budget to enable the IWF to achieve its objectives. The IWF operates from offices in Cambridge Research Park, near Cambridge.[10]

It has been criticised as an ineffective quango that does not deserve its charity status, for producing excessive numbers of false positives, for the secrecy of its proceedings, and for poor technical implementations of its policies that degraded the response time of the whole UK internet.

During 1996 the Metropolitan Police told the Internet Service Providers Association (ISPA) that the content carried by some of the newsgroups made available by them was illegal, that they considered the ISPs involved to be publishers of that material, and that they were therefore breaking the law. In August 1996, Chief Inspector Stephen French, of the Metropolitan Police Clubs & Vice Unit, sent an open letter to the ISPA, requesting that they ban access to a list of 132 newsgroups, many of which were deemed to contain pornographic images or explicit text.[11]

This list is not exhaustive and we are looking to you to monitor your newsgroups identifying and taking necessary action against those others found to contain such material. As you will be aware the publication of obscene articles is an offence. This list is only the starting point and we hope, with the co-operation and assistance of the industry and your trade organisations, to be moving quickly towards the eradication of this type of newsgroup from the Internet ... We are very anxious that all service providers should be taking positive action now, whether or not they are members of a trade association. We trust that with your co-operation and self regulation it will not be necessary for us to move to an enforcement policy.

The list was arranged so that the first section consisted of unambiguously titled paedophile newsgroups, then continued with other kinds of groups which the police wanted to restrict access to, including alt.binaries.pictures.erotica.cheerleaders and alt.binaries.pictures.erotic.centerfolds.[12]

Although this action had taken place without any prior debate in Parliament or elsewhere, the police, who appeared to be doing their best to create and not simply to enforce the law, were not acting entirely on their own initiative. Alan Travis, Home Affairs editor of the newspaper The Guardian, explained in his book Bound and Gagged that Ian Taylor, the Conservative Science and Industry Minister at the time, had underlined an explicit threat to ISPs that if they did not stop carrying the newsgroups in question, the police would act against any company that provided their users with "pornographic or violent material". Taylor went on to make it clear that there would be calls for legislation to regulate all aspects of the Internet unless service providers were seen to embrace "responsible self-regulation" wholeheartedly.[13]

We are being portrayed as a bunch of porn merchants. This is an image we need to change. Many of our members have already acted to take away the worst of the Internet. But Demon have taken every opportunity to stand alone in this regard. They do not like the concept of our organisation.

Following this, a tabloid-style exposé of Demon Internet appeared in the Observer newspaper, which alleged that Clive Feather (a director of Demon) "provides paedophiles with access to thousands of photographs of children being sexually abused".[14]

During the summer and autumn of 1996 the UK police made it known that they were planning to raid an ISP with the aim of launching a test case regarding the publication of obscene material over the Internet. The direct result of the campaign of threats and pressure was the establishment of the Internet Watch Foundation (initially known as the Safety Net Foundation) in September 1996.[15]

Facilitated by the Department of Trade & Industry (DTI), discussions were held between certain ISPs, the Metropolitan Police, the Home Office, and a body called the "Safety Net Foundation" (formed by the Dawe Charitable Trust). This resulted in the "R3 Safety Net Agreement", where "R3" referred to the triple approach of rating, reporting, and responsibility. In September 1996, this agreement was made between the ISPA, LINX, and the Safety Net Foundation, which was subsequently renamed the Internet Watch Foundation. The agreement set requirements for associated ISPs regarding identifiability and traceability of Internet users; ISPs had to cooperate with the IWF to identify providers of illegal content and facilitate easier traceability.[16]

Demon Internet was a driving force behind the IWF's creation, and one of its employees, Clive Feather, became the first chair of the IWF's Funding Board,[17] with solicitor Mark Stephens the first chair of its Policy Board. The Policy Board developed codes, guidance, operational oversight and a hotline for reporting content.

The Funding Board, made up of industry representatives and the chair of the Policy Board, provided the wherewithal for the IWF's day-to-day activities as set down and required by the Policy Board.

After three years of operation, the IWF was reviewed for the DTI and the Home Office by consultants KPMG and Denton Hall. Their report was delivered in October 1999 and resulted in a number of changes being made to the role and structure of the organisation, and it was relaunched in early 2000, endorsed by the government and the DTI, which played a "facilitating role in its creation", according to a DTI spokesman.[17]

At the time, Patricia Hewitt, then Minister for E-Commerce, said: "The Internet Watch Foundation plays a vital role in combating criminal material on the Net." To counter accusations that the IWF was biased in favour of the ISPs, a new independent chairman was appointed, Roger Darlington, former head of research at the Communication Workers Union.[17]

The IWF's website offers a web-based government-endorsed method for reporting suspect online content and remains the only such operation in the United Kingdom. It acts as a Relevant Authority in accordance with the Memorandum of Understanding (MOU)[18] concerning Section 46 of the Sexual Offences Act 2003 (meaning that its analysts will not be prosecuted for looking at illegal content in the course of their duties).[19] Reports can be submitted anonymously. According to the IWF MOU "If potentially illegal content is hosted in the UK the IWF will work with the relevant service provider and British police agency to have the content ‘taken down’ and assist as necessary to have the offender(s) responsible for distributing the offending content detected." Potentially illegal content includes:

However, almost the whole of the IWF site is concerned with suspected images of child sexual abuse with little mention of other criminally obscene material, also within their remit. Images judged by the IWF to be images of child sexual abuse are blocked.

The Government claimed that they would also be handling images of adult "extreme pornography",[22] which became illegal for people in the UK to possess on 26 January 2009. The IWF includes "extreme pornography" as an example under "criminally obscene content", meaning that they will report material hosted in the UK, or uploaded by someone in the UK, but regarding blocking sites "with those categories, our remit will only go so far as to refer sites hosted in the UK to the appropriate authorities."[23]

The IWF is funded by the European Union and the online industry. This includes Internet service providers, mobile operators and manufacturers, content service providers, telecommunications and filtering companies, search providers, and the financial sector, as well as blue-chip and other organisations that support the IWF for corporate social responsibility reasons.

Through its "Hotline" reporting system, the organisation helps ISPs combat abuse of their services via a "notice and take down" procedure: it alerts them to any potentially illegal content within its remit found on their systems and simultaneously invites the police to investigate the publisher.

Previously, the IWF passed on notifications of suspected child pornography hosted on non-UK servers to the UK National Criminal Intelligence Service, which in turn forwarded them to Interpol or the relevant foreign police authority. It now works with the Serious Organised Crime Agency instead. The IWF does not, however, pass on notifications of other types of potentially illegal content hosted outside the UK.[26]

The IWF compiles and maintains a list of URLs for individual webpages with child sexual abuse content, called the child abuse image content list or CAIC list. A whole website is included on the list only if the entire domain is dedicated to the distribution of child sexual abuse images.[27] It says "every URL on the list depicts indecent images of children, advertisements for or links to such content, on a publicly available website. The list typically contains 500–800 URLs at any one time and is updated twice a day to ensure all entries are still live".[28] Offending UK URLs are not listed as they are taken down very quickly; URLs elsewhere are listed only until they are removed. The list is applied by the ISPs of 95% of commercial Internet customers in the UK. According to the IWF website, blocking applies only to potentially criminal URLs related to child sexual abuse content on publicly available websites; the distribution of images through other channels such as peer-to-peer is a matter for "our police partners", and the IWF has no plans to extend the type of content included on the list.[27]
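The distinction between page-level and domain-level listing described above can be sketched as follows. This is an illustrative model only, not the IWF's actual implementation; all URLs and domain names are invented for the example.

```python
# Hypothetical sketch of URL-level vs domain-level blocklist matching,
# assuming a CAIC-style list of full page URLs plus a much rarer list
# of whole domains dedicated to offending material.
from urllib.parse import urlsplit

BLOCKED_URLS = {"http://example.net/gallery/page1.html"}  # individual pages
BLOCKED_DOMAINS = {"dedicated-site.example"}              # entire websites

def is_blocked(url: str) -> bool:
    """A URL is blocked if it is on the page list, or if its host
    is on the whole-domain list."""
    host = urlsplit(url).hostname or ""
    return url in BLOCKED_URLS or host in BLOCKED_DOMAINS

# A different page on the same host as a listed page is NOT blocked:
print(is_blocked("http://example.net/other.html"))          # False
print(is_blocked("http://example.net/gallery/page1.html"))  # True
print(is_blocked("http://dedicated-site.example/anything")) # True
```

The key design point is that an exact-URL match leaves the rest of the hosting site reachable, which is why listing is done per page except for wholly dedicated domains.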

A staff of four police-trained analysts are responsible for this work,[29] and the director of the service has claimed that the analysts are capable of adding an average of 65-80 new URLs to the list each week, and act on reports received from the public rather than pursuing investigative research.[30]

Between 2004 and 2006, BT Group introduced its Cleanfeed technology, which was then used by 80% of internet service providers.[31] BT spokesman Jon Carter described Cleanfeed's function as "to block access to illegal Web sites that are listed by the Internet Watch Foundation", describing it as essentially a server hosting a filter that checked requested URLs against the IWF list and returned an error message of "Web site not found" for positive matches.[32]

In 2006, Home Office minister Alan Campbell pledged that all ISPs would block access to child abuse websites by the end of 2007.[33] By the middle of 2006 the government reported that 90% of domestic broadband connections were either blocking already or planned to do so by the end of the year. The target for 100% coverage was set for the end of 2007;[34] however, in the middle of 2008 coverage stood at 95%.[35] In February 2009, the Government said that it was looking at ways to cover the final 5%.[36] In an interview in March 2009, a Home Office spokesperson mistakenly thought that the IWF deleted illegal content and did not look at the content it rates.[33][37]

Although the IWF's blacklist causes content to be censored even if the content has not been found to be illegal by a court of law, IWF Director of Communications Sarah Robertson claimed, on 8 December 2008, that the IWF is opposed to the censorship of legal content. In the case of the IWF's blacklisting of cover art hosted on Wikipedia just a few days prior, she claimed that “The IWF found the image to be illegal”, despite the body not having any legal jurisdiction to do so.[38]

In March 2009 a Home Office spokesperson said that ISPs were being pressured to sign up to the IWF's blacklist in order to block child pornography websites, and that there was no alternative to using the IWF's blacklist. Zen Internet previously refused to use the IWF's blacklist, citing "concerns over its effectiveness";[33] however, it quietly joined the foundation in September 2009 while still maintaining those concerns.[39]

R v Walker, sometimes called the "Girls (Scream) Aloud Obscenity Trial", was the first prosecution for written material under Section 2(1) of the Obscene Publications Act in nearly two decades.[41] It involved the prosecution of Darryn Walker for posting a story entitled "Girls (Scream) Aloud" on an internet erotic story site in 2008. The story was a fictional written account describing the kidnap, rape and murder of pop group Girls Aloud.[42] It was reported to the IWF who passed the information on to Scotland Yard’s Obscene Publications Unit. During the trial the prosecution claimed that the story could be "easily accessed" by young fans of Girls Aloud. However, the defence demonstrated that it could only be located by those specifically searching for such material. As a result the case was abandoned and the defendant cleared of all charges.[43][44]

[...] the image in question is potentially in breach of the Protection of Children Act 1978. However, the IWF Board has today (9 December 2008) considered these findings and the contextual issues involved in this specific case and, in light of the length of time the image has existed and its wide availability, the decision has been taken to remove this webpage from our list.


Additionally, many UK Internet users were unable to edit Wikipedia pages unless they were registered with Wikipedia and logged in.[50] This was reportedly because the single blacklisted article caused all Wikipedia traffic from ISPs using the system to be routed through a transparent proxy server. Wikipedia distinguishes logged-out users from one another by their IP address, so it interpreted all logged-out users from a particular ISP as a single user editing massively from the proxy address, which triggered Wikipedia's anti-abuse mechanism and blocked them.[51]
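The collision described above is a general property of per-IP abuse controls behind a shared proxy, and can be sketched in a few lines. This is a simplified model with invented names, not Wikipedia's actual anti-abuse code:

```python
# Minimal sketch of why a shared transparent proxy trips per-IP
# anti-abuse rules: every customer of the ISP reaches the site from
# the same address, so one abusive edit blocks all of them.
blocked_ips: set[str] = set()

def record_abuse(editor_ip: str) -> None:
    # The site can only block the *address*, not the person behind it.
    blocked_ips.add(editor_ip)

def can_edit(editor_ip: str) -> bool:
    return editor_ip not in blocked_ips

PROXY_IP = "198.51.100.1"   # every logged-out user of the ISP appears as this

print(can_edit(PROXY_IP))   # True: nobody blocked yet
record_abuse(PROXY_IP)      # one vandal edits through the proxy...
print(can_edit(PROXY_IP))   # False: now no logged-out user of the ISP can edit
```

Logged-in users were unaffected because Wikipedia identifies them by account rather than by IP address.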

On 14 January 2009 some UK users reported that all of the 85 billion pages of the Internet Archive (Wayback Machine) had been blocked, although the IWF's policy is to block only individual offending webpages, not whole domains.[52] According to IWF chief executive Peter Robbins this was due to a "technical hitch".[53] Because the Internet Archive's web site contained URLs on the IWF's blacklist, requests sent there from Demon Internet carried a particular header, which clashed with the Internet Archive's internal mechanism to convert web links when serving archived versions of web pages.[40] The actual blocked URL which had caused the incident never became publicly known.[40]

Many ISPs implement IWF filtering using a transparent proxy server of their own, unconnected with the IWF.[54] As Plusnet explains: "If the IP address matches that of a server that's used to host one of the websites on the IWF list then your request is diverted to a proxy server."[54] The hosting server itself is not blacklisted; the problem arises from requesting a page from a server which also hosts a listed page. The IWF lists the Internet companies which "have voluntarily committed to block access to child sexual abuse web pages".[55] These companies may use transparent proxies or other techniques.

Using a transparent proxy has the unintended side effect, quite independent of IWF filtering, that traffic appears to destination websites to originate from the proxy's IP address rather than from the user's real IP. Some sites detect the user's IP and adjust their behaviour accordingly.[51] For example, when a user tries to download files from a file-distribution website that restricts free-of-charge usage by enforcing a delay, typically 30 minutes, between downloads, the attempt is interpreted as originating from the ISP's proxy rather than from the user. Consequently, if any user of that ISP has downloaded any file from the site in the last half-hour (very likely for a large ISP), the download is not allowed.[56] This is an unintended consequence of ISPs' use of proxy servers, not of IWF filtering. File-sharing sites distribute files of all types, for example very large Linux distribution files.[57] The use of proxy servers is also reported to have caused the problem with editing Wikipedia (though not the blocking of the offending web page itself) reported above.

IWF filtering has been criticised as interfering with legitimate Internet activity while being ineffective against anyone intending to access objectionable content. One carefully argued discussion, while opposing such things as child pornography and terrorism, points out that filtering has the side effects discussed in this section and would not stop access to material such as images of child sexual abuse, since it does not cover email, FTP, HTTPS, peer-to-peer, Usenet, IRC, or the many other ways of reaching the same content. Because simple encryption systems are readily available, filtering can never stop such material entirely; at best it drives it underground, making it harder to assess and track.[58]

In February 2009 a Yorkshire-based software developer lodged a formal complaint regarding the IWF status as a charity with the Charity Commission, in which he pointed out that "regulating the worst of the internet" was "not really a charitable purpose", and that the IWF existed mainly to serve the interests of ISPs subscribing to it rather than the public. An IWF spokesperson said that the IWF had attained charitable status in 2004 "in order to subject itself to more robust governance requirements and the higher levels of scrutiny and accountability which charity law, alongside company law, brings with it".[59] The IWF is listed by fakecharities.org, "a directory of those so-called charities that receive substantial funding from either the UK or EU governments".[60] It has also been termed a quango by critics, implying poor management and lack of accountability.[61]

Following the IWF's blacklisting of the Wikipedia article, the organisation's operating habits came under scrutiny. J.R. Raphael of PC World stated that the incident had raised serious free-speech issues, and that it was alarming that one non-governmental organisation was ultimately acting as the "morality police" for about 95% of the UK's Internet users.[62] Frank Fisher of The Guardian criticised the IWF for secretiveness and lack of legal authority, among other things, and noted that the blacklist could contain anything and that the visitor to a blocked address may not know their browsing is being censored.[63]

The government believes that a self-regulatory system is the best solution, and the Metropolitan Police also believe that working with ISPs, rather than trying to force them via legislation, is the way forward.[17] The IWF has a list of URLs considered to host objectionable material (distinct from the actual, confidential, blacklist of pages[clarification needed]) which is available to ISPs,[64] but ISPs are not obliged to subscribe to it.

Because the IWF is a "self-appointed, self-regulated internet watchdog, which views user-submitted content and compiles a list of websites that it deems to contain illegal images", questions have been raised regarding the legality of its viewing content that would normally constitute a criminal offence.[37]

The IWF has been criticised for blacklisting legal content and for not telling websites that they are being blocked.[65] In these circumstances the owner(s) of a blocked webpage might not even know they have offending content on their site, which means that the content would remain readily available to anyone outside the UK.

The blacklisting of sites may be concealed by a generic HTTP 404 "page not found" message rather than an explanation that the content has been censored. The exact method of censorship is determined by the implementing ISP; BT, for example, return HTTP 404 pages, whereas Demon return a message stating that the page is censored, and why.[66]

In October 2014, users on Sky Broadband reported very slow and intermittent performance from the image host Imgur.[68] Clicking on an image would typically result in the site appearing to be down. Accessing the site via HTTPS caused images to load normally, because it bypassed the proxy used on sites with blacklisted content.

^ Plusnet say (in the quote referenced in the body of the article): "If the IP address matches that of a server that's used to host one of the websites on the IWF list then your request is diverted to a proxy server." This is only possible if they have a list, not of pages, but of sites being watched.