To ask Her Majesty's Government what will be the commencement date for their plans to ensure that age verification to prevent children accessing pornographic websites is implemented by the British Board of Film Classification.

Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

My Lords, we are now in the final stages of the process, and we have laid the BBFC's draft guidance and the Online Pornography (Commercial Basis) Regulations before Parliament for approval. We will ensure that there is a sufficient period following parliamentary approval for the public and the industry to prepare for age verification. Once parliamentary proceedings have concluded, we will set a date by which commercial pornography websites will need to be compliant, following an implementation window. We expect that this date will be early in the new year.

Baroness Benjamin

I thank the Minister for his Answer. I cannot wait for that date to happen, but does he share my disgust and horror that social media companies such as Twitter state that their minimum age for membership is 13 yet make no attempt to restrict some of the most gross forms of pornography being exchanged via their platforms? Unfortunately, the Digital Economy Act does not affect these companies because they are not predominantly commercial porn publishers. Does he agree that the BBFC needs to develop mechanisms to evaluate the effectiveness of the legislation for restricting children’s access to pornography via social media sites and put a stop to this unacceptable behaviour?

Lord Ashton of Hyde

My Lords, I agree that there are areas of concern on social media sites. As the noble Baroness rightly says, they are not covered by the Digital Economy Act. We had many hours of discussion about that in this House. However, she will be aware that we are producing an online harms White Paper in the winter in which some of these issues will be considered. If necessary, legislation will be brought forward to address these, and not only these but other harms too. I agree that the BBFC should find out about the effectiveness of the limited amount that age verification can do; it will commission research on that. Also, the Digital Economy Act itself made sure that the Secretary of State must review its effectiveness within 12 to 18 months.

My Lords, once again I find this issue raising a dynamic that we became familiar with in the only too recent past. The Government are to be congratulated on getting the Act on to the statute book and, indeed, on taking measures to identify a regulator as well as to indicate that secondary legislation will be brought forward to implement a number of the provisions of the Act. My worry is that, under one section of the Digital Economy Act, financial penalties can be imposed on those who infringe this requirement; the Government seem to have decided not to bring that provision into force at this time. I believe I can anticipate the Minister's answer but, in view of the little drama we had last week over fixed-odds betting machines, we would not want the Government, having won our applause in this way, to slip back into putting things off or modifying things away from the position that we had all agreed we wanted.

Lord Ashton of Hyde

My Lords, I completely understand where the noble Lord is coming from, but what he said is not quite right. The Digital Economy Act included a power for the Government to bring in enforcement with financial penalties through a regulator. However, they decided, and this House decided, not to use that for the time being. For the moment, the regulator will act in a different way. But later on, if necessary, the Secretary of State could exercise that power. On timing and FOBTs, we thought carefully, as noble Lords can imagine, before we said that we expect the date will be early in the new year.

Lord Addington Liberal Democrat

My Lords, does the Minister agree that good health and sex education might be a way to counter some of the damaging effects? Can the Government make sure that is in place as soon as possible, so that this strange fantasy world is made slightly more real?

Lord Ashton of Hyde

The noble Lord is of course right that age verification itself is not the only answer. It does not cover every possibility of getting on to a pornography site. However, it is the first attempt of its kind in the world, which is why not only we but many other countries are looking at it. I agree that sex education in schools is very important and I believe it is being brought into the national curriculum already.

The Earl of Erroll Crossbench

Why is there so much wriggle room in section 6 of the guidance from the DCMS to the AV regulator? The ISP blocking probably will not work, because everyone will just get out of it. If we bring this into disrepute then the good guys, who would like to comply, probably will not; they will not be able to do so economically. All that was covered in British Standard PAS 1296, which was developed over three years. It seems to have been totally ignored by the DCMS. You have spent an awful lot of time getting there, but you have not got there.

Lord Ashton of Hyde

One of the reasons this has taken so long is that it is complicated. We in the DCMS, and many others, not least in this House, have spent a long time discussing the best way of achieving this. I am not immediately familiar with exactly what section 6 says, but when the statutory instrument comes before this House, as an affirmative one to be discussed, I will have the answer ready for the noble Earl.

Lord West of Spithead Labour

My Lords, does the Minister not agree that the possession of a biometric card by the population would make the implementation of things such as this very much easier?

Lord Ashton of Hyde

In some ways it would, but there are problems with people who either do not want to or cannot have biometric cards.

Following the conclusion of their consultation period, the BBFC have issued new age verification guidance that has been laid before Parliament.

Summary

The new code has some important improvements, notably the introduction of a voluntary scheme for privacy, close to or based on a GDPR Code of Conduct. This is a good idea, but should not be put in place as a voluntary arrangement. Companies may not want the attention of a regulator, or may simply wish to apply lower or different standards, and ignore it. It is unclear why, if the government now recognises that privacy protections like this are needed, the government would also leave the requirements as voluntary.

We are also concerned that the voluntary scheme may not be up and running before the AV requirement is put in place. Given that 25 million UK adults are expected to sign up to these products within a few months of launch, this would be very unhelpful.

Parliament should now:

Ask the government why the privacy scheme is to be voluntary, if the risks of relying on general data protection law are now recognised;

Ask for assurance from BBFC that the voluntary scheme will cover all of the major operators; and

Ask for assurance from BBFC and DCMS that the voluntary privacy scheme will be up and running before obliging operators to put Age Verification measures in place.

Lack of Enforceability of Guidance

The Digital Economy Act does not allow the BBFC to judge age verification tools by any standard other than whether or not they sufficiently verify age. We asked that the BBFC persuade the DCMS that statutory requirements for privacy and security were required for age verification tools.

The BBFC have clearly acknowledged privacy and security concerns with age verification in their response. However, the BBFC indicate in their response that they have been working with the ICO and DCMS to create a voluntary certification scheme for age verification providers:

“This voluntary certification scheme will mean that age-verification providers may choose to be independently audited by a third party and then certified by the Age-verification Regulator. The third party’s audit will include an assessment of an age-verification solution’s compliance with strict privacy and data security requirements.”

This voluntary approach is the result of the Digital Economy Act containing no requirement for additional and specific privacy regulation.

While the voluntary scheme described above is likely to be of some assistance in promoting better standards among age verification providers, the “strict privacy and data security requirements” which the voluntary scheme mentions are not a statutory requirement, leaving some consumers at greater risk than others.

Sensitive Personal Data

The data handled by age verification systems is sensitive personal data. Age verification services must directly identify users in order to accurately verify age. Users will be viewing pornographic content, and the data about what specific content a user views is highly personal and sensitive. This has potentially disastrous consequences for individuals and families if the data is lost, leaked, or stolen.

Following a hack affecting Ashley Madison — a dating website for extramarital affairs — a number of the site’s users were driven to suicide as a result of the public exposure of their sexual activities and interests.

For the purposes of GDPR, data handled by age verification systems falls under the criteria for sensitive personal data, as it amounts to “data concerning a natural person’s sex life or sexual orientation”.

Scheduling Concerns

It is of critical importance that any accreditation scheme for age verification providers, or GDPR code of conduct if one is established, is in place and functional before enforcement of the age verification provisions in the Digital Economy Act commences. All of the major providers who are expected to dominate the age verification market should undergo their audit under the scheme before consumers are expected to use the tools. This is especially true given that MindGeek have indicated their expectation that 20-25 million UK adults will sign up to their tool within the first few months of operation. A voluntary accreditation scheme that begins enforcement after all these people have already signed up would be unhelpful.

Consumers should be empowered to make informed decisions about the age verification tools that they choose from the very first day of enforcement. No delays are acceptable if users are expected to rely upon the scheme to inform themselves about the safety of their data. If this cannot be achieved prior to the start of expected enforcement of the DE Act’s provisions, then the planned date for enforcement should be moved back to allow for the accreditation to be completed.

Issues with Lack of Consumer Choice

It is of vital importance that consumers, if they must verify their age, are given a choice of age verification providers when visiting a site. This enables users to choose which provider they trust with their highly sensitive age verification data and prevents one actor from dominating the market and thereby promoting detrimental practices with data. The BBFC also acknowledge the importance of this in their guidance, noting in 3.8:

“Although not a requirement under section 14(1) the BBFC recommends that online commercial pornography services offer a choice of age-verification methods for the end-user”.

This does not go far enough to acknowledge the potential issues that may arise in a fragmented market where pornographic sites are free to offer only a single tool if they desire.

Without a statutory requirement for sites to offer all appropriate and available tools for age verification and log in purposes, it is likely that a market will be established in which one or two tools dominate. Smaller sites will then be forced to adopt these dominant tools as well, to avoid friction with consumers who would otherwise be required to sign up to a new provider.

This kind of market for age verification tools will provide little room for a smaller provider with a greater commitment to privacy or security to survive and robs users of the ability to choose who they trust with their data.

We have already called for it to be made a statutory requirement that pornographic sites must offer a choice of providers to consumers who must age verify; however, this suggestion has not been taken up.

We note that the BBFC has been working with the ICO and DCMS to produce a voluntary code of conduct. A potential alternative solution would be to ensure that a site is only considered compliant if it offers users a number of tools which have been accredited under the additional privacy and security requirements of the voluntary scheme.

GDPR Codes of Conduct

A GDPR “Code of Conduct” is a mechanism for providing guidelines to organisations who process data in particular ways, and allows them to demonstrate compliance with the requirements of the GDPR.

A code of conduct is voluntary, but compliance is continually monitored by an appropriate body which is accredited by a supervisory authority. In this case, the “accredited body” would likely be the BBFC, and the “supervisory authority” would be the ICO. The code of conduct allows for certifications, seals and marks which indicate clearly to consumers that a service or product complies with the code.

Codes of conduct are expected to provide more specific guidance on exactly how data may be processed or stored. In the case of age verification data, the code could contain stipulations on:

Appropriate pseudonymisation of stored data;

Data and metadata retention periods;

Data minimisation recommendations;

Appropriate security measures for data storage;

Security breach notification procedures;

Re-use of data for other purposes.

The BBFC’s proposed “voluntary standard” regime appears to be similar to a GDPR code of conduct, though it remains to be seen how specific the stipulations in the BBFC’s standard are. A code of conduct would also involve being entered into the ICO’s public register of UK approved codes of conduct, and the EDPB’s public register of all codes of conduct in the EU.

Similarly, GDPR Recital 99 notes that “relevant stakeholders, including data subjects” should be consulted during the drafting period of a code of conduct – a requirement which is not in place for the BBFC’s voluntary scheme.

It is possible that the BBFC have opted to create this voluntary scheme for age verification providers, rather than use a code of conduct, because they felt they might not meet the GDPR requirements to be considered an appropriate body to monitor compliance. Compliance must be monitored by a body which has demonstrated:

Their expertise in relation to the subject-matter;

They have established procedures to assess the ability of data processors to apply the code of conduct;

They have the ability to deal with complaints about infringements; and

Their tasks do not amount to a conflict of interest.

Parties Involved in the Code of Conduct Process

As noted by GDPR Recital 99, a consultation should be a public process which involves stakeholders and data subjects, and their responses should be taken into account during the drafting period:

“When drawing up a code of conduct, or when amending or extending such a code, associations and other bodies representing categories of controllers or processors should consult relevant stakeholders, including data subjects where feasible, and have regard to submissions received and views expressed in response to such consultations.”

The code of conduct must be approved by a relevant supervisory authority (in this case the ICO).

An accredited body (BBFC) that establishes a code of conduct and monitors compliance is able to establish its own structures and procedures under GDPR Article 41 to handle complaints regarding infringements of the code, or regarding the way it has been implemented. BBFC would be liable for failures to regulate the code properly under Article 41(4);[1] however, DCMS appear to have accepted the principle that the government would need to protect BBFC from such liabilities.[2]

GDPR Codes of Conduct and Risk Management

Below is a table of risks created by age verification which we identified during the consultation process. For each risk, we have considered whether a GDPR code of conduct may help to mitigate its effects.

Risk: User identity may be correlated with viewed content.
CoC appropriate? Partially.
Details: This risk can never be entirely mitigated if AV is to go ahead, but a CoC could contain very strict restrictions on what identifying data could be stored after a successful age verification.

Risk: Identity may be associated with an IP address, location or device.
CoC appropriate? No.
Details: It would be very difficult for a CoC to mitigate this risk, as the only safe mitigation would be not to collect user identity information.

Risk: An age verification provider could track users across all the websites its tool is offered on.
CoC appropriate? Yes.
Details: Strict rules could be put in place about what data an age verification provider may store, and what data it is forbidden from storing.

Risk: Users may be incentivised to consent to further processing of their data in exchange for rewards (content, discounts etc.).
CoC appropriate? Yes.
Details: Age verification tools could be expressly forbidden from offering anything in exchange for user consent.

Risk: Leaked data creates major risks for identified individuals and cannot be revoked or adequately compensated for.
CoC appropriate? Partially.
Details: A CoC can never fully mitigate this risk if any data is being collected, but it could contain strict prohibitions on storing certain information and specify retention periods after which data must be destroyed, which may mitigate the impacts of a data breach.

Risk: Risks to the user of access via shared computers if viewing history is stored alongside age verification data.
CoC appropriate? Yes.
Details: A CoC could specify that any accounts for pornographic websites which may track viewed content must be strictly separate from, and not in any visible way linked to, a user’s age verification account or data that confirms their identity.

Risk: Age verification systems are likely to trade off security for convenience (no 2FA, auto-login, etc.).
CoC appropriate? Yes.
Details: A CoC could stipulate that login cookies that “remember” a returning user must only persist for a short time period, and should recommend or enforce two-factor authentication.

Risk: The need to re-login to age verification services to access pornography in “private browsing” mode may lead people to avoid using this feature and generate much more data which is then stored.
CoC appropriate? No.
Details: A CoC cannot fix this issue. Private browsing by nature will not store any login cookies or other objects and will require the user to re-authenticate with age verification providers every time they wish to view adult content.

Risk: Users may turn to alternative tools to avoid age verification, which carry their own security risks (especially “free” VPN services or peer-to-peer networks).
CoC appropriate? No.
Details: Many UK adults, although over 18, will be uncomfortable with the need to submit identity documents to verify their age and will seek alternative means to access content. It is unlikely that many of these individuals will be persuaded by an accreditation under a GDPR code.

Risk: Age verification login details may be traded and shared among teenagers or younger children, which could lead to bullying or “outing” if such details are linked to viewed content.
CoC appropriate? Yes.
Details: Strict rules could be put in place about what data an age verification provider may store, and what data it is forbidden from storing.

Risk: Child abusers could use their access to age verified content as an adult as leverage to create and exploit relationships with children and teenagers seeking access to such content (grooming).
CoC appropriate? No.
Details: This risk will exist as long as age verification is providing a successful barrier to under-18s who wish to access such content.

Risk: The sensitivity of content dealt with by age verification services means that users who fall victim to phishing scams or fraud have a lower propensity to report it to the relevant authorities.
CoC appropriate? Partially.
Details: A CoC or education campaign may help consumers identify trustworthy services, but it cannot fix the core issue, which is that users are being socialised into it being “normal” to input their identity details into websites in exchange for pornography. Phishing scams resulting from age verification will appear and will be common, and the sensitivity of the content involved is a disincentive to reporting them.

Risk: The use of credit cards as an age verification mechanism creates an opportunity for fraudulent sites to engage in credit card theft.
CoC appropriate? No.
Details: Phishing and fraud will be common. A code of conduct which lists compliant sites and tools externally on the ICO website may be useful, but a phishing site may simply pretend to be another (compliant) tool, or rely on the fact that users are unlikely to check with the ICO every time they wish to view pornographic content.

Risk: The rush to get age verification tools to market means they may take significant shortcuts when it comes to privacy and security.
CoC appropriate? Yes.
Details: A CoC could assist in solving this issue if tools are given time to be assessed for compliance before the age verification regime commences.

Risk: A single age verification provider may come to dominate the market, leaving users little choice but to accept whatever terms the provider offers.
CoC appropriate? Partially.
Details: Practically, a CoC could mitigate some of the effects of an age verification tool monopoly if the dominant tool is accredited under the Code. However, this relies on users being empowered to demand compliance with a CoC, and it is possible that users will instead be left with a “take it or leave it” situation where the dominant tool is not CoC accredited.

Risk: Allowing pornography “monopolies” such as MindGeek to operate age verification tools is a conflict of interest.
CoC appropriate? Partially.
Details: As the BBFC note in their consultation response, it would not be reasonable to prohibit a pornographic content provider from running an age verification service, as that would prevent any site from running its own tool. However, under a CoC it is possible that a degree of separation could be enforced, requiring age verification tools to adhere to strict rules about the use of data, which could mitigate the effects of a large pornographic content provider attempting to collect as much user data as possible for its own business purposes.

[1] “Infringements of the following provisions shall, in accordance with paragraph 2, be subject to administrative fines up to 10 000 000 EUR, or in the case of an undertaking, up to 2 % of the total worldwide annual turnover of the preceding financial year, whichever is higher: the obligations of the monitoring body pursuant to Article 41(4).”

[2] “contingent liability will provide indemnity to the British Board of Film Classification (BBFC) against legal proceedings brought against the BBFC in its role as the age verification regulator for online pornography.”

The Government has announced the organisations that will sit on the Executive Board of a new national body to tackle online harms in the UK. The UK Council for Internet Safety (UKCIS) is the successor to the UK Council for Child Internet Safety (UKCCIS), with an expanded scope to improve online safety for everyone in the UK.

The Executive Board brings together expertise from a range of organisations in the tech industry, civil society and public sector.

Margot James, Minister for Digital and the Creative Industries, said:

Only through collaborative action will the UK be the safest place to be online. By bringing together a wealth of expertise from a wide range of fields, UKCIS can be an example to the world on how we can work together to face the challenges of the digital revolution in an effective and responsible way.

UKCIS has been established to allow these organisations to collaborate and coordinate a UK-wide approach to online safety.

It will contribute to the Government’s commitment to make the UK the safest place in the world to be online, and will help to inform the development of the forthcoming Online Harms White Paper.

Priority areas of focus will include online harms experienced by children such as cyberbullying and sexual exploitation; radicalisation and extremism; violence against women and girls; hate crime and hate speech; and forms of discrimination against groups protected under the Equality Act, for example on the basis of disability or race.

CEO of Internet Matters Carolyn Bunting said:

We are delighted to sit on the Executive Board of UKCIS, where we are able to represent parents' needs in keeping their children safe online.

Online safety demands a collaborative approach and by bringing industry together we hope we can bring about real change and help everyone benefit from the opportunities the digital world has to offer.

The UKCIS Executive Board is jointly chaired by Margot James, Minister for Digital and the Creative Industries (Department for Digital, Culture, Media and Sport); Victoria Atkins, Minister for Crime, Safeguarding and Vulnerability (Home Office); and Nadhim Zahawi, Minister for Children and Families (Department for Education). It also includes representatives from the Devolved Administrations of Scotland, Wales and Northern Ireland. Board membership will be kept under periodic review, to ensure it represents the full range of online harms that the government seeks to tackle.

As far as I can see, if a porn website verifies your age with personal data, it will probably also require you to tick a consent box with a whole load of small print that nobody ever reads. Now if that small print lets it forward all personal data, coupled with porn viewing data, to the Kremlin’s dirty tricks and blackmail department, then that’s OK with the Government’s age verification law. So for sure some porn viewers are going to get burnt because of what the government has legislated and because of what the BBFC have implemented. So perhaps it is not surprising that the BBFC has asked the government to pick up the tab should the BBFC be sued by people harmed by their decisions. After all, it was the government who set up the unsafe environment, not the BBFC.

Margot James, the Minister of State at the Department for Digital, Culture, Media and Sport, announced in Parliament:

I am today laying a Departmental Minute to advise that the Department for Digital, Culture, Media and Sport (DCMS) has received approval from Her Majesty’s Treasury (HMT) to recognise a new Contingent Liability which will come into effect when age verification powers under Part 3 of the Digital Economy Act 2017 enter force.

The contingent liability will provide indemnity to the British Board of Film Classification (BBFC) against legal proceedings brought against the BBFC in its role as the age verification regulator for online pornography.

As you know, the Digital Economy Act introduces the requirement for commercial providers of online pornography to have robust age verification controls to protect children and young people under 18 from exposure to online pornography. As the designated age verification regulator, the BBFC will have extensive powers to take enforcement action against non-compliant sites. The BBFC can issue civil proceedings, give notice to payment-service providers or ancillary service providers, or direct internet service providers to block access to websites where a provider of online pornography remains non-compliant.

The BBFC expects a high level of voluntary compliance by providers of online pornography. To encourage compliance, the BBFC has engaged with industry, charities and undertaken a public consultation on its regulatory approach. Furthermore, the BBFC will ensure that it takes a proportionate approach to enforcement and will maintain arrangements for an appeals process to be overseen by an independent appeals body. This will help reduce the risk of potential legal action against the BBFC.

However, despite the effective work with industry, charities and the public to promote and encourage compliance, this is a new law and there nevertheless remains a risk that the BBFC will be exposed to legal challenge on the basis of decisions taken as the age verification regulator or on grounds of principle from those opposed to the policy.

As this is a new policy, it is not possible to quantify accurately the value of such risks. The Government estimates a realistic risk range to be between £1m and £10m in the first year, based on the likely number and scale of legal challenges. The BBFC investigated options to procure commercial insurance but failed to do so given difficulties in accurately determining the size of potential risks. The Government therefore will ensure that the BBFC is protected against any legal action brought against the BBFC as a result of carrying out duties as the age verification regulator.

The Contingent Liability is required to be in place for the duration of the period the BBFC remain the age verification regulator. However, we expect the likelihood of the Contingent Liability being called upon to diminish over time as the regime settles in and relevant industries become accustomed to it. If the liability is called upon, provision for any payment will be sought through the normal Supply procedure.

It is usual to allow a period of 14 Sitting Days prior to accepting a Contingent Liability, to provide Members of Parliament an opportunity to raise any objections.

Sky TV has decided to partner with the US media rating service Common Sense Media to introduce a detailed rating system that will help parents make smarter choices about what their children watch on Sky. The new service will launch in the UK in 2019.

Since its founding in 2003, Common Sense has built the largest library of independent age-based reviews for everything kids watch, play, read and learn. The service, which will be available on Sky Q, will include in-depth information on the prevalence of specific types of content. This includes the educational value of the show, positive messages, use of positive role models, bad language, violence, sex and drink and drugs. Each is rated on a scale of one to five depending on how applicable it is to each show.

Jeremy Darroch, Group Chief Executive, Sky, said:

As a parent I know how reassuring it is that the Sky platform offers a safe, highly-regulated, family-friendly environment, but we know we can always do more. Our partnership with Common Sense will help give parents greater peace of mind, helping them make smarter viewing choices for their children.

Later this year Sky Kids Safe Mode will launch on Sky Q, helping parents hand pick and ring-fence the content they want their children to watch and password protect any content they feel is unsuitable.

The announcement does not mention how this will affect Sky’s relationship with the BBFC; presumably this is a bit of a snub to the cinema and video ratings provided by the BBFC.

As an example of Common Sense Media I compared their comments on the Marvel superhero Venom with the more detailed BBFC advice:

MPAA Rated PG-13 for intense sequences of sci-fi violence and action, and for language.

What parents need to know

Parents need to know that Venom is a sci-fi action movie based on an antihero/villain from the Marvel universe. Photo journalist Eddie Brock’s (Tom Hardy) life is disrupted for good when he becomes host to an alien parasite. The alien symbiote is able to take over Brock’s body, giving him superpowers but also a dark alter ego called Venom. As his worried girlfriend, Anne (Michelle Williams), watches, Brock struggles with whether to escape the destructive being taking over his body or to give in to its dangerous power. This movie looks darker than most of the Marvel films; expect intense, graphic violence, strong language, and lots of scares.

Rated 15 for strong threat, horror, violence

VENOM is a US sci-fi action fantasy in which alien organisms are brought back to Earth.

Threat

There are a number of sequences in which people are threatened and attacked by the alien organisms, or by people into whose bodies the aliens have entered.

Horror sequences include the alien organisms entering people’s bodies, causing their limbs to distort and their bones to crack. There is sight of injury detail, including protruding bones.

Violence

Stronger moments of violence include people being impaled by the alien organisms, sometimes with bloody detail, and people being eaten by the aliens. There is also moderate action violence throughout, including heavy punches, kicks and other blows as well as use of tasers.

There is also infrequent strong language (‘f**k’), alongside milder bad language (e.g. ‘pussy’, ‘shit’). There are sequences in which live animals appear to be eaten, but no animals were harmed in the making of the film.

The UK government is preparing to establish a new internet censor that would make tech firms liable for content published on their platforms and have the power to sanction companies that fail to take down illegal material and hate speech within hours.

Under legislation being drafted by the Home Office and the Department for Digital, Culture, Media and Sport (DCMS), due to be announced this winter, a new censorship framework for online social harms would be created.

BuzzFeed News has obtained details of the proposals, which would see the establishment of an internet censor similar to Ofcom.

Home secretary Sajid Javid and culture secretary Jeremy Wright are considering the introduction of a mandatory code of practice for social media platforms and strict new rules such as takedown times forcing websites to remove illegal hate speech within a set timeframe or face penalties. Ministers are also looking at implementing age verification for users of Facebook, Twitter, and Instagram.

The new proposals are still in the development stage and are due to be put out for consultation later this year. The new censor would also develop new regulations controlling non-illegal content and online behaviour. The rules for what constitutes non-illegal content will be the subject of what is likely to be a hotly debated consultation.

BuzzFeed News has also been told ministers are looking at creating a second new censor for online advertising. Its powers would include a crackdown on online advertisements for food and soft drink products that are high in salt, fat, or sugar.

BuzzFeed News understands concerns have been raised in Whitehall that the regulation of non-illegal content will spark opposition from free speech campaigners and MPs. There are also fears internally that some of the measures being considered, including blocking websites that do not adhere to the new regulations, are so draconian that they will generate considerable opposition.

A government spokesperson confirmed to BuzzFeed News that the plans would be unveiled later this year.

The European Court of Human Rights (ECtHR) has found that the UK’s mass surveillance programmes, revealed by NSA whistleblower Edward Snowden, did not meet the quality of law requirement and were incapable of keeping the interference to what is necessary in a democratic society.

The landmark judgment marks the Court’s first ruling on UK mass surveillance programmes revealed by Mr Snowden. The case was started in 2013 by campaign groups Big Brother Watch, English PEN, Open Rights Group and computer science expert Dr Constanze Kurz following Mr Snowden’s revelation of GCHQ mass spying.

Documents provided by Mr Snowden revealed that the UK intelligence agency GCHQ were conducting population-scale interception, capturing the communications of millions of innocent people. The mass spying programmes included TEMPORA, a bulk data store of all internet traffic; KARMA POLICE, a catalogue including a web browsing profile for every visible user on the internet; and BLACK HOLE, a repository of over 1 trillion events including internet histories, email and instant messenger records, search engine queries and social media activity.

The applicants argued that the mass interception programmes infringed UK citizens’ rights to privacy protected by Article 8 of the European Convention on Human Rights as the population-level surveillance was effectively indiscriminate, without basic safeguards and oversight, and lacked a sufficient legal basis in the Regulation of Investigatory Powers Act (RIPA).

In its judgment, the ECtHR acknowledged that bulk interception is by definition untargeted; that there was a lack of oversight of the entire selection process; and that safeguards were not sufficiently robust to provide adequate guarantees against abuse.

In particular, the Court noted concern that the intelligence services can search and examine “related communications data” apparently without restriction — data that identifies senders and recipients of communications, their location, email headers, web browsing information, IP addresses, and more. The Court expressed concern that such unrestricted snooping could be capable of painting an intimate picture of a person through the mapping of social networks, location tracking, Internet browsing tracking, mapping of communication patterns, and insight into who a person interacted with.

The Court acknowledged the importance of applying safeguards to a surveillance regime, stating:

In view of the risk that a system of secret surveillance set up to protect national security may undermine or even destroy democracy under the cloak of defending it, the Court must be satisfied that there are adequate and effective guarantees against abuse.

The Government passed the Investigatory Powers Act (IPA) in November 2016, replacing the contested RIPA powers and controversially putting mass surveillance powers on a statutory footing.

However, today’s judgment that indiscriminate spying breaches rights protected by the ECHR is likely to provoke serious questions as to the lawfulness of bulk powers in the IPA.

Jim Killock, Executive Director of Open Rights Group said:

Viewers of the BBC drama Bodyguard may be shocked to know that the UK actually has the most extreme surveillance powers of any democracy. Since we brought this case in 2013, the UK has actually increased its powers to indiscriminately surveil our communications, whether or not we are suspected of any criminal activity.

In light of today’s judgment, it is even clearer that these powers do not meet the criteria for proportionate surveillance and that the UK Government is continuing to breach our right to privacy.

Silkie Carlo, director of Big Brother Watch said:

This landmark judgment confirming that the UK’s mass spying breached fundamental rights vindicates Mr Snowden’s courageous whistleblowing and the tireless work of Big Brother Watch and others in our pursuit for justice.

Under the guise of counter-terrorism, the UK has adopted the most authoritarian surveillance regime of any Western state, corroding democracy itself and the rights of the British public. This judgment is a vital step towards protecting millions of law-abiding citizens from unjustified intrusion. However, since the new Investigatory Powers Act arguably poses an ever greater threat to civil liberties, our work is far from over.

Antonia Byatt, director of English PEN said:

This judgment confirms that the British government’s surveillance practices have violated not only our right to privacy, but our right to freedom of expression too. Excessive surveillance discourages whistle-blowing and discourages investigative journalism. The government must now take action to guarantee our freedom to write and to read freely online.

Dr Constanze Kurz, computer scientist, internet activist and spokeswoman of the German Chaos Computer Club said:

What is at stake is the future of mass surveillance of European citizens, not only by UK secret services. The lack of accountability is not acceptable when the GCHQ penetrates Europe’s communication data with their mass surveillance techniques. We all have to demand now that our human rights and more respect of the privacy of millions of Europeans will be acknowledged by the UK government and also by all European countries.

The Court has put down a marker that the UK government does not have a free hand with the public’s communications and that in several key respects the UK’s laws and surveillance practices have failed. In particular, there needs to be much greater control over the search terms that the government is using to sift our communications. The pressure of this litigation has already contributed to some reforms in the UK and this judgment will require the UK government to look again at its practices in this most critical of areas.