Right of access to data about oneself is recognised as a fundamental right (Article 8(2) of the Charter of Fundamental Rights of the European Union). Section 7 of the UK’s Data Protection Act 1998 (DPA) gives expression to this, and provides that as a general right individuals are entitled to be told whether someone else is processing their data, and why, and furthermore (in terms) to be given a copy of that data. The European General Data Protection Regulation retains and bolsters this right, and recognises its importance by placing it in the category of provisions non-compliance with which could result in an administrative fine for a data controller of up to €20m or 4% of annual worldwide turnover (whichever is higher).

So subject access is important, and this is reflected in the fact that it is almost certainly the most litigated of the DPA’s provisions (the DPA being, overall, a surprisingly under-litigated piece of legislation). Many data controllers need to commit significant resources to comply with it, and the Information Commissioner’s Office (ICO) produced a statutory code of practice on the subject in 2014.

But it is not an absolute right. The DPA explains that there are exemptions to the right where, for instance, compliance would be likely to prejudice the course of criminal justice, or national security, or, in the case of health and social care records, would be likely to cause serious harm to the data subject or another person. Additionally the DPA recognises that, where complying with a subject access request would involve disclosing information about another individual, the data controller should not comply unless that other person consents, or unless it “is reasonable in all the circumstances to comply with the request without the consent of the other individual” (section 7(4) DPA).

But this important caveat to the right of subject access (the engagement of the parallel rights of third parties) is something which is almost entirely omitted from the government’s own web guidance regarding access to CCTV footage of oneself. It says

The CCTV owner must provide you with a copy of the footage that you can be seen in. They can edit the footage to protect the identities of other people.

The latter sentence is true, and, especially where footage captures third parties, it is often appropriate to take measures to mask their identities. But the first sentence is simply not true. And I think it is concerning that “the best place to find government services and information” (as gov.uk describes itself) is wrong in its description of a fundamental right.

A data controller (let’s ignore the point that a “CCTV owner” might not necessarily be the data controller) does not have an unqualified obligation to provide information in response to a subject access request. As anyone working in data protection knows, the obligation is qualified by a number of exemptions. The page does allude to one of these (at section 29 of the DPA):

They can refuse your request if sharing the footage will put a criminal investigation at risk

What I don’t understand is why the gov.uk page fails to provide better (accurate) information, and why it doesn’t provide a link to the ICO site. I appreciate that the terms and conditions of gov.uk make clear that there is no guarantee that information is accurate, but I think there’s a risk here that data subjects could develop unreasonable expectations of their rights, and that this could lead to unnecessary grievances or disputes with data controllers.

Gov.uk invite comments about content, and I will be taking up this invitation. I hope they will amend the page.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

I’m good at a few things in life, OK at a few more, and pretty terrible at a lot. Into the last category falls car maintenance. Nonetheless, as a safety-conscious person I understand its importance. And so it was that I found myself in a local branch of a major retailer of car parts the other day buying a replacement headlamp bulb, and asking for it to be fitted (by the very helpful Louise – sorry Louise, I won’t be submitting the online customer feedback, for reasons which will probably become clear in this post). I paid for the service, and was then asked

Can I just have your email address to send the receipt?

Er, no.

I’d heard about this practice, but, oddly, this was the first time I’d encountered it. It was immediately obvious to me what was going on, or at least what I assumed was/is going on, but I thought it might be helpful to draw attention to it.

The law (regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (as amended)) outlaws the sending of unsolicited email marketing to individuals, unless the recipient has previously consented to receive the marketing. As much as this law is regularly flouted, it is both clear and strict. It is, however, subject to an important caveat – email marketing can be sent if the sender has obtained the recipient’s email address “in the course of the sale or negotiations for the sale of a product or service to that recipient”.

This is known as the “soft opt-in” and it seems clear to me that the practice of sending e-receipts is tied up with the gathering of email addresses for the purposes of sending marketing using the soft opt-in provisions. As much as we might be told how helpful it is for our own records management to have electronic copies of receipts, there is something in it for retailers, and that something is the perceived right to send electronic marketing.

I should add, though, that soft opt-in is subject to further qualifications – the marketing must be in respect of “similar products and services only”, and, crucially, at the point when the contact details are collected, the intended recipient must be given the chance to say “no” to the marketing. (See the guidance from the Information Commissioner’s Office for further details).
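The conditions above can be sketched as a simple decision function. This is purely an illustrative toy, not legal advice: the function name and parameters are my own invention, “lawful” is reduced to a boolean, and I have included the regulation 22(3)(c) requirement that a simple means of refusal also be offered in every subsequent message, which the guidance covers alongside the refusal-at-collection point.

```python
def email_marketing_lawful(consented: bool,
                           obtained_in_sale_or_negotiation: bool,
                           similar_products_only: bool,
                           refusal_offered_at_collection: bool,
                           refusal_offered_in_each_message: bool) -> bool:
    """Rough sketch of the PECR regulation 22 tests for sending
    email marketing to an individual subscriber (illustrative only)."""
    # Prior consent from the recipient makes the marketing lawful.
    if consented:
        return True
    # Otherwise every limb of the "soft opt-in" must be satisfied.
    return (obtained_in_sale_or_negotiation
            and similar_products_only
            and refusal_offered_at_collection
            and refusal_offered_in_each_message)


# The scenario in this post: details gathered in the course of a sale,
# but no chance to say "no" at the point of collection.
print(email_marketing_lawful(consented=False,
                             obtained_in_sale_or_negotiation=True,
                             similar_products_only=True,
                             refusal_offered_at_collection=False,
                             refusal_offered_in_each_message=True))  # False
```

On this sketch, the e-receipt request I describe below fails the test: everything else could be in place, but without the offer of refusal at the till the soft opt-in is not made out.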

I wasn’t given the chance to say “no”, but I chose not to give my details. If I had given those details, and if I had then received email marketing, it would have been sent unlawfully. I would have known that, but a lot of people wouldn’t, and, importantly, it’s quite difficult to prove (or remember) whether one was given “a simple means of refusing” marketing at the time the sale was made. So it’s a relatively low-risk tactic for marketers.

So my advice is to say no to e-receipts, demand a paper one, and if you do want to retain a record, why not just photograph the receipt when you get home?


On 24 June, as 48% of the UK was holding its head in its hands and wondering what the hell the other 52% had done, the Information Commissioner’s Office (ICO) issued a statement. It said

If the UK is not part of the EU, then upcoming EU reforms to data protection law would not apply directly to the UK. But if the UK wants to trade with the Single Market on equal terms we would have to prove ‘adequacy’ – in other words UK data protection standards would have to be equivalent to the EU’s General Data Protection Regulation framework starting in 2018.

Over the coming weeks we will be discussing with Government the implications of the referendum result and its impact on data protection reform in the UK.

That statement has since been replaced on the ICO’s website with a simpler one:

With so many businesses and services operating across borders, international consistency around data protection laws and rights is crucial both to businesses and organisations and to consumers and citizens. The ICO’s role has always involved working closely with regulators in other countries, and that will continue to be the case.

Having clear laws with safeguards in place is more important than ever given the growing digital economy, and we will be speaking to government to present our view that reform of the UK law remains necessary.

One notes that references to adequacy, and equivalence with the General Data Protection Regulation, have disappeared. And one wonders why – does the ICO now think that a post-Brexit UK would not need to have equivalent standards to the GDPR? If so, that would certainly represent a bold position. In response to a request for comment, an ICO spokesperson informed me that

We noted the debates about different options that emerged following the referendum result and we decided to move to a simpler statement to avoid being too closely associated to any one particular position

I’m grateful to them for this, and it is in itself very interesting. Privacy Laws and Business recently informed their news feed subscribers that the government is keen to hear from stakeholders their views on the future of the UK data protection regime, so maybe everything is up for grabs.

But a fundamental point remains: if the EU (and indeed the CJEU – see Schrems et al) currently has exacting data protection standards for external states to meet to secure trading rights, realistically could the UK adopt a GDPR-lite regime? It strikes me as a huge risk if we did. But then again, voting for Brexit struck me as a huge (and pointless) risk, and look what happened there.

Ultimately, I’m surprised and disappointed the ICO have resiled from their initial clear and sensible statement. I would have preferred that, rather than “noting the debates” about post-Brexit data protection, they actually directed and informed those debates.


[Edited to add: it is well worth reading the comments to this piece, especially the ones from Chris Pounder and Reuben Binns]

I needed a way to break a blogging drought, and something that was flagged up to me by a data protection colleague (thanks Simon!) provides a good opportunity to do so. It suggests that the drafting of the GDPR could lead to an enormous workload for the ICO.

The General Data Protection Regulation (GDPR) which entered into force on 24 May this year, and which will apply across the European Union from 25 May 2018, mandates the completion of Data Protection Impact Assessments (DPIAs) where indicated. Article 35 of the GDPR explains that

Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data

In the UK (and indeed elsewhere) we already have the concept of “Privacy Impact Assessments”, and in many ways all that the GDPR does is embed this area of good practice as a legal obligation. However, it also contains some ancillary obligations, one of which is to consult the supervisory authority, in certain circumstances, prior to processing. And here is where I get a bit confused.

Article 36 provides that

The controller shall consult the supervisory authority prior to processing where a data protection impact assessment under Article 35 indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate the risk [emphasis added]

A close reading of Article 36 results in this: if the data controller conducts a DPIA, and is of the view that, if mitigating factors were not in place, the processing would be high risk, it will have to consult the supervisory authority (in the UK, the Information Commissioner’s Office (ICO)). This is odd: it effectively renders any mitigating measures irrelevant. And it appears directly to contradict what recital 84 says

Where a data-protection impact assessment indicates that processing operations involve a high risk which the controller cannot mitigate by appropriate measures in terms of available technology and costs of implementation, a consultation of the supervisory authority should take place prior to the processing [emphasis added]

So, the recital says the obligation to consult will arise where high risk is involved which the controller can’t mitigate, while the Article says the obligation will arise where high risk is involved notwithstanding any mitigation in place.

Clearly, the Article contains the specific legal obligation (the recital purports to set out the reason for the contents of the enacting terms), so the law will require data controllers in the UK to consult the ICO every time a DPIA identifies an inherently high risk processing activity, even if the data controller has measures in place fully to mitigate and contain the risk.

For example, let us imagine the following processing activity – collection of and storage of customer financial data for the purposes of fulfilling a web transaction. The controller might have robust data security measures in place, but Article 36 requires it to consider “what if those robust measures were not in place? would the processing be high risk?” To which the answer would have to be “yes” – because the customer data would be completely unprotected.

In fact, I would submit, if Article 36 is given its plain meaning, virtually any processing activity involving personal data would, in the absence of mitigating measures, be high risk, and so create a duty to consult the ICO.
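The divergence between the Article and the recital can be made concrete with two toy predicates. The names are my own, and “high risk” is reduced to a boolean for illustration; the point is simply which kind of risk each test looks at.

```python
def must_consult_per_article_36(inherent_risk_high: bool,
                                residual_risk_high: bool) -> bool:
    # Article 36(1): consult where the DPIA indicates high risk
    # "in the absence of measures taken by the controller to mitigate
    # the risk" -- i.e. the test looks at inherent risk, and any
    # mitigation actually in place is irrelevant.
    return inherent_risk_high


def must_consult_per_recital_84(inherent_risk_high: bool,
                                residual_risk_high: bool) -> bool:
    # Recital 84: consult where there is a high risk "which the
    # controller cannot mitigate by appropriate measures" -- i.e. the
    # test looks at the residual risk after mitigation.
    return residual_risk_high


# The web-transaction example from the text: inherently high risk
# (unprotected customer financial data), but fully mitigated by
# robust security measures.
print(must_consult_per_article_36(True, False))  # True  -> must consult
print(must_consult_per_recital_84(True, False))  # False -> no duty arises
```

The same facts produce opposite answers, which is exactly the contradiction described above: on the Article’s wording the controller must consult the ICO; on the recital’s, it need not.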

What this will mean in practice remains to be seen, but unless I am missing something (and I’d be delighted to be corrected if so), the GDPR is setting the ICO and other supervisory authorities up for a massive influx of work. With questions already raised about the ICO’s funding going forward, that is the last thing they are likely to need.


There is some irony in the quite extraordinary news that the Independent Commission on Freedom of Information received 30,000 submissions in response to its public call for written evidence: one of the considerations in the call for evidence was the fact that “reading time” cannot currently be factored in as one of the tasks which determines whether a request exceeds the cost limit under section 12 of the FOI Act.

Lord Burns has now announced that

Given the large volume of evidence that we have received, it will take time to read and consider all of the submissions

Well, yes. The Commission originally planned to report its findings “before the end of the year” (that is, the parliamentary year, which ends on 17 December). It also planned to read all the evidence which was before the Justice Committee when it conducted its post-legislative scrutiny of FOIA in 2012, and there was a fair amount of that. But let us put that to one side, and let us estimate that reading and where necessary taking a note of each of the current 30,000 submissions will take someone ten minutes (as some submissions were 400 pages long, this is perhaps a ridiculously conservative estimate). That equates to 300,000 minutes, or 5,000 hours, or 208 days of one person’s time (assuming they never slept or took a break: if we imagine that they spent eight hours reading every day, it would be 625 days).

I don’t know what sort of administrative support Lord Burns and his fellow Commission members have been given, but, really, to do their job properly one would expect them to read the submissions themselves. There are five of them, so even assuming they shared the reading between them, we might expect they would between them take 125 days (without a break, and with little or no time to undertake their other jobs and responsibilities) to digest the written evidence.
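The back-of-envelope figures above can be checked in a few lines (taking, as stated, the conservative assumption of ten minutes per submission):

```python
submissions = 30_000
minutes_per_submission = 10  # conservative assumption from the text

total_minutes = submissions * minutes_per_submission
total_hours = total_minutes / 60
round_the_clock_days = total_hours / 24   # no sleep, no breaks
eight_hour_days = total_hours / 8         # a full working day of reading
shared_between_five = eight_hour_days / 5  # split across the Commission

print(total_minutes)                  # 300000
print(total_hours)                    # 5000.0
print(round(round_the_clock_days))    # 208
print(eight_hour_days)                # 625.0
print(shared_between_five)            # 125.0
```

Even on the most generous reading schedule, five commissioners sharing the load face 125 eight-hour days each.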

Lord Burns has sensibly conceded that the Commission will not be able to report by the end of the year, and he has announced that two oral evidence sessions will take place in January next year (although who will participate has not been announced, nor whether the sessions will be broadcast, nor even whether they will take place in public).

What is clear though is that someone or ones has a heck of a job ahead of them. I doubt that the Commission, as an advisory non-departmental public body, would be amenable to judicial review, so it is probably not strictly bound by public law duties to take all relevant evidence into account when arriving at its decisions and recommendations, but, nonetheless, a failure so to do would open it up to great, and justified, criticism.

And, one final point, as Ian Clark noticed when submitting his evidence, the web form was predicated on the assumption that those making submissions would only be from an “organisation”. Surely the Commission didn’t assume that the only people with views on the matter were those who received FOI requests? Surely they didn’t forget that, ultimately, FOIA is for the public?


…if there is one matter on which I feel more strongly than another it is that in a democratic community the foundation of good government lies in freedom of information, freedom of thought, and freedom of speech: You can not have a country, which is governed by its people, wisely and well governed, unless those people are permitted access to accurate information, and are permitted the free exchange of their views and their opinions: That is essential to good government: It is quite true that if you grant that freedom there will be abuses: It is quite true that foolish people advocate foolish views: That is one of the many unfortunate corollaries

Although the past is a foreign country, some of its citizens can seem familiar: the quotation above is from Liberal politician Sir Richard Durning Holt, and was made in a parliamentary debate seven months short of a hundred years ago. It contains the first recorded parliamentary use of the term “freedom of information”. It was said as part of a debate about conscientious objectors to the “Great War” (Holt was drawing attention to what he saw as the unfair and counter-productive prosecutions of objectors). He may not have meant “freedom of information” in quite the way we mean it now, but his words resonate, and – at a time when our own Freedom of Information Act 2000 is under threat – remain, as a matter of principle, remarkably relevant.

“Don’t panic” says David Smith to those data controllers who are currently relying on Safe Harbor as a means of ensuring that personal data transferred by them to the United States has adequate protection (in line with the requirements of Article 25 of the European Data Protection Directive, and the eighth principle of schedule one of the UK’s Data Protection Act 1998 (DPA)). He is referring, of course, to the recent decision of the Court of Justice of the European Union in Schrems. Data controllers should, he says, “take stock” and “make their own minds up”:

businesses in the UK don’t have to rely on Commission decisions on adequacy. Although you won’t get the same degree of legal certainty, UK law allows you to rely on your own adequacy assessment. Our guidance tells you how to go about doing this. Much depend [sic] here on the nature of the data that you are transferring and who you are transferring it to but the big question is can you reduce the risks to the personal data, or rather the individuals whose personal data it is, to a level where the data are adequately protected after transfer? The Safe Harbor can still play a role here.

Smith also refers to a recent statement by the Article 29 Working Party – the grouping of representatives of the various European data protection authorities, of which he is a member – and refers to “the substance of the statement being measured, albeit expressed strongly”. What he doesn’t say is how unequivocal it is in saying that

transfers that are still taking place under the Safe Harbour decision after the CJEU judgment are unlawful

And this is particularly interesting because, as I discovered today, the ICO itself appears (still) to be making transfers under Safe Harbor. I reported a nuisance call using its online tool (in doing so I included some sensitive personal data about a family member) and noticed that the tool was operated by SnapSurveys. The ICO’s own website privacy notice says

We collect information volunteered by members of the public about nuisance calls and texts using an online reporting tool hosted by Snap Surveys. This company is a data processor for the ICO and only processes personal information in line with our instructions.

This does not unambiguously say that SnapSurveys are transferring the personal data of those submitting reports to the ICO to the US under Safe Harbor – it is possible that the ICO has set up some bespoke arrangement with its processor, under which they process that specific ICO data within the European Economic Area – but it strongly suggests it.

It is understandable that a certain amount of regulatory leeway and leniency be offered to data controllers who have relied on Safe Harbor until now – to that extent I agree with the light-touch approach of the ICO. But if it is really the case that people’s personal data are actually being transferred by the regulator to the US, three weeks after the striking down of the 2000 European Commission decision that Safe Harbor provided adequate protection, serious issues arise. I will be asking the ICO for confirmation about this, and whether, if it is indeed making these transfers, it has undertaken its own adequacy assessment.


TalkTalk, in response to the recent revelations about the compromising of the data of up to four million of its customers, says rather boldly

Has TalkTalk breached the Data Protection Act?
No, this is a criminal attack. We have notified the ICO and we will work closely with them over the coming weeks and months.

And it got me to wondering how well this rather novel approach could be extended in other legal areas.

The defendant, a Mr Talk Talk, was travelling at a speed of ninety-four miles per hour, and had consumed the equivalent of two bottles of gin. However, as the other driver involved in the collision had failed to renew his motor insurance we find that the defendant was evidently merely the victim of a crime, and my client could not, as a matter of law, have broken speeding and drink driving laws.

Furthermore, although the defendant later viciously kicked an elderly bystander in a motiveless attack, he cannot be guilty of an assault because the pensioner was recently convicted of watching television without a licence.

And finally, although my client picked up a police officer and threw him into a duck pond, the fact that the said officer once forgot to pay for a Milky Way in the staff canteen provides an absolute defence to the charge of obstructing a police officer in the line of duty.

Over a year ago I blogged about a tweet by a member of the Oyston family connected with Blackpool FC:

a fan replies to a news item about the club’s manager, and calls the Oyston family “wankers”. Sam Oyston responds by identifying the seat the fan – presumably a season-ticket holder – occupies, and implies that if he continues to be rude the ticket will be withdrawn

For the reasons in that post I thought this raised interesting, and potentially concerning, data protection issues, and I mentioned that the Information Commissioner’s Office (ICO) had powers to take action. It was one of (perhaps the) most read posts (showing, weirdly, that football is possibly more of interest to most people than data protection itself) and it seemed that some people did intend complaining to the ICO. So, recently, I made an FOI request to the ICO for any information held by them concerning Blackpool FC’s data protection compliance. This was the reply

We have carried out thorough searches of the information we hold and have identified one instance where a member of the public raised concerns with the ICO in September 2014, about the alleged processing of personal data by Blackpool FC.

We concluded that there was insufficient evidence to consider the possibility of a s55 offence under the Data Protection Act 1998 (the DPA), and were unable to make an assessment as the individual had not yet raised their concerns with Blackpool FC direct. We therefore advised the individual to contact the Club and to come back to us if they were still concerned, however we did not hear from them again. As such, no investigation took place, nor was any assessment made of the issues raised.

This suggests the ICO appears wrongly to consider itself unable to undertake section 42 assessments under the Data Protection Act 1998 unless the data subject has complained to the data controller – a stance strongly criticised by Dr David Erdos on this blog, and one which has the potential to put the data subject further in dispute with the data controller (as I can imagine could have happened here, with a family some of whose members are ready to sue to protect their reputation). It also suggests though that maybe people weren’t quite as interested as the page views suggested. Nonetheless, I am posting this brief update, because a few people asked about it.


Some time ago I complained to the Information Commissioner’s Office (ICO) about the innuendo carried in the message that Google serves with search results on most personal names: “Some results may have been removed under data protection law in Europe”. I had already complained to Google UK, and wrote about it here. Google UK denied any responsibility or liability, and referred me to their enormous, distant parents at 1600 Amphitheatre Parkway. I think they were wrong to do so, in light of the judgment of the Court of Justice of the European Union in the Google Spain case (C‑131/12), but I will probably pursue that separately.

However, section 42 of the Data Protection Act 1998 (DPA) allows me to ask the ICO to assess whether it is likely or unlikely that a data controller has complied with its obligations under the DPA. So that’s what I did (pointing out that a search on “Jon Baines” or “Jonathan Baines” threw up the offending message).

In her response the ICO case officer did not address the jurisdiction point which Google had produced, and nor did she actually make a section 42 assessment (in fairness, I had not specifically cited section 42). What she did say was this

As you know, the Court of Justice of the European Union judgement in May 2014 established that Google was a data controller in respect of the processing of personal data to produce search results. It is not in dispute that some of the search results do relate to you. However, it is also clear that some of them will relate to other individuals with the same name. For example, the first result returned on a search on ‘Jonathan Baines’ is ‘LinkedIn’, which says in the snippet that there are 25 professionals named Jonathan Baines, who use LinkedIn.

It is not beyond the realms of possibility that one or more of the other individuals who share your name have had results about them removed. We cannot comment on this. However, we understand that this message appears in an overwhelming majority of cases when searching on any person’s name. This is likely to be regardless of whether any links have actually been removed.

True, I guess. Which is why I’ve reverted with this clarification of my complaint:

If it assists, and to extend my argument and counter your implied question “which Jon Baines are we talking about?”, if you search < “Jon Baines” Information Rights and Wrongs > (where the search term is actually what lies between the < >) you will get a series of results which undoubtedly relate to me, and from which I can be identified. Google is processing my personal data here (that conclusion is unavoidable, given the ruling by the Court of Justice of the European Union in “Google Spain” (Case C‑131/12)). The message “Some results may have been removed under data protection law in Europe” appears as a result of the processing of my personal data, because it does not appear on every search (for instance < prime minister porcine rumours > or < “has the ICO issued the cabinet office an enforcement notice yet” >). As a product of the processing of my personal data, I argue that the message relates to me, and constitutes my personal data. As it carries an unfair innuendo (unfair because it implies I might have asked for removal of search results) I would ask that you assess whether Google have or have not likely complied with their obligation under section 4(4) to comply with the first and fourth data protection principles. (Should you doubt the innuendo point, please look at the list of results on a Twitter search for “Some results may have been removed”.)

Let’s hope this allows the ICO to make the assessment, without my having to consider whether I need to litigate against one of the biggest companies in world history.
