One waits with interest to see whether the [Information Commissioner’s Office (ICO)] will take any enforcement action, but I think it’s important that they consider doing so, because, even though Sir Nick makes nine very sensible recommendations to HSCIC, one could be forgiven – having been given clear assurances previously, by the likes of Tim Kelsey and others – for having reservations as to future governance of our confidential medical data.

Now, with the launch of the first annual report of the Independent Information Governance Oversight Panel (IIGOP), chaired by Dame Fiona Caldicott and established at the request of the Secretary of State to “advise, challenge and report on the state of information governance across the health and care system in England”, we see further evidence of HES data “being compromised, the privacy of patients being compromised”. The report informs us of an incident whereby

New inspection procedures introduced by the HSCIC had uncovered a number of organisations which were sending HES data and failing to follow data dictionary standards. This meant they were inadvertently enabling personal confidential data to enter the data base. Following an alert to the Information Commissioners’ Office this was understood as a large scale problem, although having a low level potential impact, as the affected data fields were unknown to either senders or receivers of HES data. The relevant organisations were contacted to gain their cooperation in closing the breach, without alerting any unfriendly observer to the location of the confidential details. This was important to preserve the general ignorance of the detail of the breach and continue to protect individuals’ privacy. Trusts and others were encouraged to provide named contacts who would then start cleaning up their data flows to the HSCIC. In order to manage any untoward reporting in the media, trade titles were informed and briefed about the importance of restricting their reporting to avoid any risk of leading people towards this confidential data.
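The failure described here – confidential data slipping into HES submissions via fields that did not conform to the data dictionary – is, at root, a missing validation step on the sending side. A minimal sketch of the kind of check that would have caught it; the field names are hypothetical, not the real HES schema:

```python
# Sketch: validate an outbound record against an agreed data dictionary,
# rejecting any non-standard field before submission.
# Field names are hypothetical illustrations, not the real HES data dictionary.

ALLOWED_FIELDS = {"episode_id", "admission_date", "procedure_code", "provider_code"}

def validate_record(record: dict) -> list[str]:
    """Return the names of any fields not permitted by the data dictionary."""
    return sorted(set(record) - ALLOWED_FIELDS)

record = {
    "episode_id": "E123",
    "admission_date": "2013-04-01",
    "procedure_code": "X99",
    "patient_name": "J. Smith",  # confidential data in a non-standard field
}

unexpected = validate_record(record)
if unexpected:
    print("Rejected: non-standard fields", unexpected)
```

Had senders (or the HSCIC on receipt) run even this crude a check, personal confidential data could not have entered the database unnoticed through unrecognised fields.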

Now this to me seems pretty serious: the failure to “follow data dictionary standards” by data controller organisations sending HES data sounds very likely to be a contravention of the data controllers’ obligation, under section 4(4) of the Data Protection Act 1998 (DPA), to comply with the seventh data protection principle, which requires that

appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data

Serious contraventions, of a kind likely to cause substantial damage or substantial distress, can result in the ICO serving a monetary penalty notice, under section 55A of the DPA, to a maximum of £500,000.

So, what does one make of these incidents? It’s hard to avoid the conclusion that they would be held to be “serious”, and if the data in question had been misused, there would have been the potential for substantial damage and substantial distress – public disclosure of hospital record data could have a multitude of pernicious effects – and this much is evidenced by the fact that (successful) attempts had to be made to avoid the errors coming to light, including asking journalists to avoid reporting. But were they contraventions likely to cause these things? IIGOP suggests that they had a “low level potential impact” because the data was hidden within large amounts of non-offensive data, and I think it is probably the case that the incidents would not be held to have been likely to cause substantial damage or substantial distress. (In Niebel, the leading case on monetary penalty notices, Wikeley J in the Upper Tribunal accepted that “likely” in s55A DPA took the same meaning attributed to it by Munby J in R (Lord) v Secretary of State for the Home Department [2003] EWHC 2073 (Admin), namely that “likely” meant something more than “a real risk”, i.e. a significant risk, “even if the risk falls short of being more probable than not”.)

But a monetary penalty notice is not the only action open to the ICO. He has the power to serve enforcement notices, under s40 DPA, to require data controllers to do, or refrain from doing, specified actions, or to take informal action such as requiring the signing of undertakings (to similar effect). Given that we have heard about these incidents from IIGOP, and in an annual report, it seems unlikely that any ICO enforcement action will be forthcoming. Perhaps that’s correct as a matter of law and as a matter of the exercise of discretion, but in my view the ICO has not been vocal enough about the profound issues raised by the amalgamation and sharing of health data, and the concerns raised by incidents of potentially inappropriate or excessive processing. Care.data of course remains on the agenda, and the IIGOP report is both revealing and encouragingly critical of what has taken place so far, but one would not want a situation to emerge where the ICO took a back seat and allowed IIGOP (which lacks regulatory and enforcement powers) to deal with the issue.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

UPDATE: 23.01.15 The ICO has responded [.doc file] to my request for a review of their decision. I drew their attention to the arguments on this page but they don’t even mention them, let alone provide a counter-analysis, in dismissing my complaints (“Having reviewed the matter, I agree with the explanations provided”). I am invited by the ICO to consider taking my own legal action. I understand that the ICO and I might have differing views on a DPA matter, but what I find difficult to accept is the refusal even to enter into a discussion with me about the detailed arguments I’ve made. END UPDATE

In February this year I asked the Information Commissioner’s Office (ICO) to investigate reports that Hospital Episode Statistics (HES) data had apparently been sold to an actuarial society by the NHS Information Centre (NHSIC), the predecessor to the Health and Social Care Information Centre (HSCIC). Specifically I requested, as a data subject can under s42 of the Data Protection Act 1998 (DPA), that the ICO assess whether it was likely or not that the processing of my personal data by NHSIC and others had been in compliance with the DPA.

Nine months later, I was still awaiting the outcome. But a clue to how the assessment would turn out was contained in the text of Sir Nick Partridge’s six month review of various data releases by NHSIC (his original report in June seemed to me to point to multiple potential DPA contraventions). In the review document he says

Six investigations have been separately instigated by the HSCIC or Information Commissioner’s Office (ICO) and shared with both parties as these focussed on whether individuals were at risk of being identified. In the cases it has investigated, the ICO has upheld the HSCIC approach and informed us that it has “seen no evidence to suggest that re-identification has occurred or is reasonably likely to occur.”

And sure enough, after chasing the ICO for the outcome of my nine-month wait, I received this (in oddly formatted text, which rather whiffed of a lot of cutting-and-pasting)

Following the recent issue regarding HSCIC, PA Consulting, and Google we investigated the issue of whether HES data could be considered personal data. This detailed work involved contacting HSCIC, PA Consulting, and Google and included the analysis of the processes for the extraction and disclosure of HES data both generally and in that case in particular. We concluded that we did not consider that the HES dataset constitutes personal data. Furthermore we also investigated whether this information had been linked to other data to produce “personal data” which was subject to the provisions of the Act. We have no evidence that there has been any re-identification either on the part of PA Consulting or Google. We also noted that HSCIC have stated that the HES dataset does not include individual level patient data even at a pseudonymised level. Our view is that the data extracted and provided to PA Consulting did not identify any individuals and there was no reasonable likelihood that re-identification would be possible.

I have added the emphasis to the words “reasonable likelihood” above. They appear in similar terms in the Partridge Review, and they struck me as rather odd. An awful lot of analysis has taken and continues to take place on the subject of when personal data can be “rendered fully anonymous in the sense that it is information from which the data subject is no longer identifiable” (Lord Hope’s dicta in Common Services Agency v Scottish Information Commissioner [2008] UKHL 47). Some of that analysis has been academic, some takes the form of “soft law” guidance, for instance Opinion 05/2014 of the Article 29 Working Party, and the ICO Anonymisation Code of Practice. The former draws on the Data Protection Directive 95/46/EC, and notes that

Recital 26 signifies that to anonymise any data, the data must be stripped of sufficient elements such that the data subject can no longer be identified. More precisely, that data must be processed in such a way that it can no longer be used to identify a natural person by using “all the means likely reasonably to be used”

Anonymisation has also been subject to judicial analysis, notably in the Common Services Agency case, but, even more importantly, in the judgment of Mr Justice Cranston in Department of Health v Information Commissioner ([2011] EWHC 1430). The latter case, involving the question of disclosure of late-term abortion statistics, is by no means an easy judgment to parse (ironically so, given that it makes roughly the same observation about the Common Services Agency case). The judge held that the First-tier Tribunal had been wrong to say that the statistics in question were personal data, but that it had on the evidence been entitled to say that “the possibility of identification by a third party from these statistics was extremely remote”. The fact that the possibility of identification by a third party was extremely remote meant that “the requested statistics were fully anonymised” (¶55). I draw from this that, for personal data to be anonymised in statistical format, the possibility of identification of individuals by a third party must be extremely remote. The ICO’s Anonymisation Code, however, says of the case:

The High Court in the Department of Health case above stated that the risk of identification must be greater than remote and reasonably likely for information to be classed as personal data under the DPA [emphasis added]

But this seems to me to be an impermissible description of the case: the High Court did not state what the ICO says it stated, and the phrases “greater than remote” and “reasonably likely” do not appear in the judgment. And that phrase “reasonably likely” is one that, as I say, makes its way into the Partridge Review, and into the ICO’s assessment of the lawfulness of the HES data “sale”.

I begin to wonder if the ICO has taken the phrase from recital 26 of the Directive, which talks about the need to consider “all the means likely reasonably to be used” to identify an individual, and transformed it into a position from which, if identification is not reasonably likely, it will accept that data are anonymised. This cannot be right: there is a world of difference between a test which considers whether the possibility of identification is “extremely remote” and one which considers whether it is “reasonably likely”.

I do not have a specific right to a review of the section 42 assessment decision that the processing of my personal data was likely to have been in compliance with NHSIC’s obligations under the DPA, but I have asked for one. I am aware of course that others complained (après moi, le déluge), notably, in March, FIPR, MedConfidential and Big Brother Watch. I suspect they will also be pursuing this.

In October this year I attended an event at which the ICO’s Iain Bourne spoke. Iain was a key figure in the drawing up of the ICO’s Anonymisation Code, and I took the rather cheeky opportunity to ask about the HES investigations. He said that his initial view was that NHSIC had been performing good anonymisation practice. This reassured me at the time, but now, after considering this question of whether the Anonymisation Code (and the ICO) adopts the wrong test on the risks of identification, I am less reassured. Maybe “reasonably likely that an individual can be identified” is an appropriate test for determining when data is no longer anonymised, and becomes personal data, but it does not seem to me that the authorities support it.

Postscript Back in August of this year I alerted the ICO to the fact that a local authority had published open data sets which enabled individuals to be identified (for instance, social care and housing clients). More than four months later the data is still up (despite the ICO saying they would raise the issue with the council): is this perhaps because the council has argued that the risk of identification is not “reasonably likely”?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

However, with the publication on 17 June of Sir Nick Partridge’s Review of Data Releases by the NHS Information Centre one questions the basis for Tim’s assertions. Sir Nick commissioned PwC to analyse a total of 3059 data releases between 2005 and 2013 (when the NHS Information Centre (NHSIC) ceased to exist, and was replaced by the Health and Social Care Information Centre (HSCIC)). The summary report to the Review says that

It disappoints me to report that the review has discovered lapses in the strict arrangements that were supposed to be in place to ensure that people’s personal data would never be used improperly

and it reveals a series of concerning and serious failures of data governance, including

lack of detailed records between 1 April 2005 and 31 March 2009

two cases of data that was apparently released without a proper record remaining of which organisation received the data

[no] evidence that Northgate [the NHSIC contractor responsible for releases] got permission from the NHS IC before making releases as it was supposed to do

PwC could not find records to confirm full compliance in about 10% of the sample

Sir Nick observes that

the system did not have the checks and balances needed to ensure that the appropriate authority was always in place before data was released. In many cases the decision making process was unclear and the records of decisions are incomplete.

and crucially

It also seems clear that the responsibilities of becoming a data controller, something that happens as soon as an organisation receives data under a data sharing agreement, were not always clear to those who received data. The importance of data controllers understanding their responsibilities remains vital to the protection of people’s confidentiality

(This resonates with my concern, in my request to the ICO to assess the transfer of data from HES to the actuarial society, about what the legal basis was for the latter’s processing).

Notably, Sir Nick dispenses with the idea that data such as HES was anonymised:

The data provided to these other organisations under data sharing agreements is not anonymised. Although names and addresses are normally removed, it is possible that the identity of individuals may be deduced if the data is linked to other data

And if it was not anonymised, then the Data Protection Act 1998 (DPA) is engaged.

All of this indicates a failure to take appropriate technical and organisational measures against unauthorised or unlawful processing of personal data, which the perspicacious among you will identify as one of the key statutory obligations placed on data controllers by the seventh data protection principle in the DPA.

Sir Nick may say

It is a matter of fact that no individual ever complained that their confidentiality had been breached as a result of data being shared or lost by the NHS IC

but simply because no complaint was made (at the time – complaints certainly have been made since concerns started to be raised) does not mean that the seventh principle was not contravened, in a serious way. And a serious contravention of the DPA of a kind likely to cause substantial damage or substantial distress can potentially lead to the ICO serving a monetary penalty notice (MPN) to a maximum of £500,000 (at least for contraventions after April 2010, when the ICO’s powers commenced).

The NHSIC is no more (although as Sir Nick says, HSCIC “inherited many of the NHS IC’s staff and procedures”). But that has not stopped the ICO serving MPNs on successor organisations in circumstances where their predecessors committed the contravention. One waits with interest to see whether the ICO will take any enforcement action, but I think it’s important that they consider doing so, because, even though Sir Nick makes nine very sensible recommendations to HSCIC, one could be forgiven – having been given clear assurances previously, by the likes of Tim Kelsey and others – for having reservations as to future governance of our confidential medical data. I would suggest it is imperative that HSCIC know that their processing of personal data is now subject to close oversight by all relevant regulatory bodies.

I thought I was rather flogging the care.data horse on this blog, so, in the spirit of persistence, I thought why not go and do it somewhere else? The Society of Computers and Law kindly asked me to write a broadly “anti” piece, while asking Martin Hoskins to do a broadly “pro” one. They are here:

The ICO appear to think that GPs who opt patients out of care.data without informing them would be breaching the Data Protection Act. They say it would be unfair processing

In February of this year GP Dr Gordon Gancz was threatened with termination of his contract, because he had indicated he would not allow his patients’ records to be uploaded to the national health database which was planned to be created under the care.data initiative. He was informed that if he didn’t remove information on his website, and if he went on to add “opt-out codes” to patients’ electronic records, he would be in breach of the NHS (GMS contract) Regulations 2004. Although this threatened action was later withdrawn, and care.data put on hold for six months, Dr Gancz might have been further concerned to hear that in the opinion of the Information Commissioner’s Office (ICO) he would also have been in breach of the Data Protection Act 1998 (DPA).

A few weeks ago fellow information rights blogger Tim Turner (who has given me permission to use the material) asked NHS England about the basis for Health Services Minister Dan Poulter’s statement in Parliament that

NHS England and the Health and Social Care Information Centre will work with the British Medical Association, the Royal College of General Practitioners, the Information Commissioner’s Office and with the Care Quality Commission to review and work with GP practices that have a high proportion of objections [to care.data] on a case-by-case basis

Tim wanted to know what role the ICO would play. NHS England replied saying, effectively, that they didn’t know, but they did disclose some minutes of a meeting held with the ICO in December 2013. Those minutes indicate that

The ICO had received a number of enquiries regarding bulk objections from practices. Their view was that adding objection codes would constitute processing of data in terms of the Data Protection Act. If objection codes had been added without writing to inform their patients then the ICO’s view was that this would be unfair processing and technically a breach of the Act so action could be taken by the ICO

One must stress that this is not necessarily a complete or accurate representation of the ICO’s views. However, what appears to be being said here is that, if GPs took the decision to “opt out” their patients from care.data, without writing to inform them, this would be an act of “processing” according to the definition at section 1(1) of the DPA, and would not be compliant with the GPs’ obligations under the first DPA principle to process personal data fairly.

On a very strict reading of the DPA this may be technically correct – for processing of personal data to be fair data subjects must be informed of the purposes for which the data are being processed, and, strictly, adding a code which would prevent an upload (which would otherwise happen automatically) would be processing of personal data. And, of course, the “fairness” requirement is absent from the proposed care.data upload, because Parliament, in its wisdom, decided to give the NHS the legal power to override it. But “fairness” requires a broad brush, and the ICO’s interpretation here would have the distinctly odd effect of rendering unlawful a decision to maintain the status quo whereby patients’ GP data does not leave the confidential confines of their surgery. It also would have the effect of supporting NHS England’s apparent view that GPs who took such action would be liable to sanctions.

In fairness (geddit???!!) to the ICO, if a patient was opted out who wanted to be included in the care.data upload, then I agree that this would be in breach of the first principle, but it would be very easily rectified, because, as we know, it will be simple to opt-in to care.data from a previous position of “opt-out”, but the converse doesn’t apply – once your data is uploaded it is uploaded in perpetuity (see my last bullet point here).

A number of GPs (and of course, others) have expressed great concern at what care.data means for the confidential relationship between doctor and patient, which is fundamental for the delivery of health care. In light of those concerns, and in the absence of clarity about the secondary uses of patient data under care.data, would it really be “unfair” to patients if GPs didn’t allow the data to be collected? Is that (outwith DPA) fair to GPs?

The Sunday Times reports that a billion patient records have been sold to a marketing consultancy. Is it time for an independent review of these highly questionable data sharing practices?

In 2012, at the behest of the then Secretary of State for Health, Andrew Lansley (driver of the Health and Social Care Act 2012), Dame Fiona Caldicott chaired a review of information governance in the NHS. Her report, which focused on the issue of sharing of information, was published in April 2013. At the time a statement in it, referring to the Information Commissioner’s Office (ICO) stood out to me, and it stands out even more now, but for different reasons. It says

The ICO told the Review Panel that no civil monetary penalties have been served for a breach of the Data Protection Act due to formal data sharing between data controllers in any organisation for any purpose

At the time, I thought “Well duh” – of course the ICO is not going to take enforcement action where there has been a formal data sharing agreement, because, clearly, the parties entering into such an agreement are going to make sure they do so lawfully, and with regard to the ICO guidance on data sharing – lawful and proportionate data sharing is, er, lawful, so the ICO wouldn’t be able to take action.

But now, with the frequent and worrying stories emerging of apparent data sharing arrangements between the NHS Information Centre (NHSIC), and its successor, the Health and Social Care Information Centre (HSCIC), I start to think the ICO’s comments are remarkable for what they might reveal about them looking in the wrong direction, when they should have been paying more attention to the lawfulness of huge scale data sharing arrangements between the NHS and private bodies. And now, The Sunday Times reports that

A BILLION NHS records containing details of patients’ hospital admissions and operations have been sold to a marketing consultancy working for some of the world’s biggest drug companies

I think it is time for a wholesale review, properly funded, by the ICO as independent regulator, of these “formal data sharing” arrangements. They appear to have a questionable legal basis, based to a large extent on questionable assumptions and assurances that pseudonymisation equates to anonymisation (which anyone who looks into the matter will realise is nonsense).
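The point that pseudonymisation is not anonymisation is easy to demonstrate. A toy sketch, with entirely invented data: a “pseudonymised” record that retains quasi-identifiers (date of birth, postcode district) can be linked by anyone holding an auxiliary dataset containing those same attributes, and the identity recovered.

```python
# Toy demonstration (all data invented): linking a pseudonymised record
# back to an identity via quasi-identifiers left in the dataset.

pseudonymised = [
    {"pseudo_id": "a9f3", "dob": "1954-07-02", "postcode_district": "LS8", "diagnosis": "I21"},
    {"pseudo_id": "c71b", "dob": "1988-11-30", "postcode_district": "SW1", "diagnosis": "J45"},
]

# Auxiliary data an observer might hold (electoral roll, social media, a press report...)
auxiliary = [
    {"name": "A. Example", "dob": "1954-07-02", "postcode_district": "LS8"},
]

def link(pseudo_rows, aux_rows):
    """Join the two datasets on the shared quasi-identifiers."""
    matches = []
    for p in pseudo_rows:
        for a in aux_rows:
            if (p["dob"], p["postcode_district"]) == (a["dob"], a["postcode_district"]):
                matches.append((a["name"], p["diagnosis"]))
    return matches

print(link(pseudonymised, auxiliary))  # the "anonymous" diagnosis now has a name attached
```

Stripping names and addresses defeats this only if the remaining attributes cannot be matched against anything else; with a dataset as rich as HES, that is a strong assumption indeed.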

And I think the review should also consider how and why these arrangements appear to have deliberately been taking place behind the backs of the patients whose data has been “shared”.

David Evans is Senior Policy Officer at the Information Commissioner’s Office (ICO). In an interview with “The Information Daily.com” uploaded on 12 March, he spoke about data sharing in general, and specifically about care.data (elsewhere on this blog, passim). There’s a video of his interview, which has a backdrop with adverts for “Boilerhouse Health” and “HCI Daily”, both of which appear to be communications companies offering services to the health sector. David says

care.data…the overall project is very good because it’s all about making better use of information in the health service…what care.data appear to have done is failed to get that message across

Oddly, this view, that if only the people behind care.data had communicated its benefits better it would have sailed through, is very similar to that expressed by Tim Kelsey, NHS National Director for Patients and Information and head cheerleader for care.data. Tim said, for instance, after the announcement of a (further) six-month delay in implementation

We have been told very clearly that patients need more time to learn about the benefits of sharing information and their right to object to their information being shared

Both David and Tim are right that there has been a failure of communication, but I think it is completely wrong to see it merely as a failure to communicate the benefits. Any project involving the wholesale upload of confidential medical records, to be processed and disclosed, at various levels of deidentification, to third parties, is going to involve risk, and will necessitate explanation and mitigation of that risk. What the public have so far had communicated to them is plenty about the benefits, but very little about the risks, and the organisational and technical measures being taken by the various bodies involved to mitigate or contain that risk. Tim Gough has argued eloquently for a comprehensive and independent Privacy Impact Assessment to be undertaken, while criticising the one that was published in January:

To be fair, NHS England did publish a PIA in January 2014, which does appear a little late in the day for a project of this kind. It also glosses over information which is extremely important to address in full detail. Leaving it out makes it look like something is being hidden

As far as I am aware there has been no official response to this (other than a tweet from Geraint Lewis referring us to our well-thumbed copies of the ICO’s nearly-superseded PIA Handbook).

To an extent I can understand Tim Kelsey feeling he and his colleagues need to do more to communicate the benefits of care.data – after all, it’s their job to deliver it. But I do have real concerns that a senior officer at the ICO thinks that public concerns can be allayed through yet more plugging of the benefits, with none of the detailed reassurances and legal and technical justifications whose absence has been so strongly noted.

Breaches of the DPA are not always about data security. I’m not sure NHS England have grasped this. Worse, I’m not sure the ICO understands public concern about what is happening with confidential medical information. They both need to listen.

Proponents of the care.data initiative have been keen to reassure us of the safeguards in place for any GP records uploaded to the Health and Social Care Information Centre (HSCIC) by saying that similar data from hospitals (Hospital Episode Statistics, or HES) has been uploaded safely for about two decades. Thus, Tim Kelsey, National Director for Patients and Information in the National Health Service, said on twitter recently that there had been

No data breach in SUS*/HES ever

I’ve been tempted to point out that this is a bit like a thief arguing that he’s been stealing from your pockets for twenty years, so why complain when you catch him stealing from your wallet? However, whether Tim’s claim is true or not partly depends on how you define a “breach”, and I suspect he is thinking of some sort of inadvertent serious loss of data, in breach of the seventh (data security) principle of the Data Protection Act 1998 (DPA). Whether there have been any of those is one issue, and, in the absence of transparency about how HES processing has been audited, I don’t know how he is so sure (an FOI request for audit information is currently stalled, while HSCIC consider whether commercial interests are or are likely to be prejudiced by disclosure). But data protection is not all about data security, and the DPA can be “breached” in other ways. As I mentioned last week, I have asked the Information Commissioner’s Office to assess the lawfulness of the processing surrounding the apparent disclosure of a huge HES dataset to the Institute and Faculty of Actuaries, whose Society prepared a report based on it (with HSCIC’s logo on it, which rather tends to undermine their blaming the incident on their NHSIC predecessors). My feeling is that this has nothing, or very little, to do with data security – I am sure the systems used were robust and secure – but a lot to do with some of the other DPA principles, primarily the first (processing must be fair and lawful and have an appropriate Schedule 2 and Schedule 3 condition) and the second (“Personal data shall be obtained only for one or more specified and lawful purposes”).

Since the story about the actuarial report, at least three other possible “breaches” have come to light. They are listed in this Register article, but it is the first that has probably caused the most concern. It appears that the entire HES dataset, pseudonymised (not, note, anonymised) of around one terabyte, was uploaded to Google storage, and processed using Big Query. An apparently rather unconcerned statement from HSCIC (maybe they’ll blame their predecessors again, if necessary) said

The NHS Information Centre (NHS IC) signed an agreement to share pseudonymised Hospital Episodes Statistics data with PA Consulting in November 2011…PA Consulting used a product called Google BigQuery to manipulate the datasets provided and the NHS IC was aware of this. The NHS IC had written confirmation from PA Consulting prior to the agreement being signed that no Google staff would be able to access the data; access continued to be restricted to the individuals named in the data sharing agreement

So that’s OK then? Well, not necessarily. Google’s servers (and, remember, “cloud” really means “someone else’s computer”) are dotted around the world, although mostly in the US, and when you upload data to the cloud, one of the problems (or benefits) is you don’t have, or don’t tend to think you have, a real say in where it is hosted. By a certain argument, this even makes the cloud provider, in DPA terms, a data controller, because it is partly determining “the manner in which any personal data are, or are to be, processed”. If the hosting is outside the European Economic Area the eighth DPA principle comes into play:

Personal data shall not be transferred to a country or territory outside the European Economic Area unless that country or territory ensures an adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data
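The eighth-principle point reduces to a simple policy check: before data is replicated to a storage region, the region’s host country should be tested against the EEA (or an adequacy finding). A schematic sketch only; the region-to-country mapping and the EEA set here are illustrative and deliberately incomplete, not any real provider’s list:

```python
# Schematic eighth-principle check: flag any storage region outside the EEA.
# Both lookup tables are illustrative and incomplete, for the example only.

EEA = {"GB", "IE", "BE", "DE", "FR", "NL"}  # abbreviated for the sketch

REGION_COUNTRY = {
    "europe-west1": "BE",  # hypothetical region-to-country mapping
    "us-central1": "US",
}

def transfer_allowed(region: str) -> bool:
    """True only if the region's host country falls inside the (abbreviated) EEA."""
    return REGION_COUNTRY.get(region) in EEA
```

Unknown regions fail closed, which is the only defensible default: if the data controller cannot say where the data is hosted, it cannot claim compliance with the eighth principle.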

I don’t know if PA Consulting’s upload of HES data to the cloud was in compliance with their and NHSIC’s/HSCIC’s DPA obligations, but, then again, I’m not the regulator of the DPA. So, in addition to last week’s request for assessment, I’ve asked the ICO to assess this processing as well:

Hi again

I don’t yet have any reference number, but please note my previous email for reference. News has now emerged that the entire HES database may have been uploaded to some form of Google cloud storage. Would you also please assess this for compliance with the DPA? I am particularly concerned to know whether it was in compliance with the first, seventh and eighth data protection principles. This piece refers to the alleged upload to Google servers http://t.co/zWF2QprsTN

best wishes,
Jon
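The pseudonymised/anonymised distinction flagged above is worth making concrete. Here is a minimal sketch, in Python, with entirely invented records and field names (not real HES fields): a direct identifier such as the NHS number is replaced by a keyed hash, but indirectly identifying fields like date of birth and postcode survive intact – which is precisely why pseudonymised data can still be personal data.

```python
import hmac
import hashlib

# Entirely invented example records -- not real HES fields or real data.
records = [
    {"nhs_number": "9434765919", "dob": "1961-04-12",
     "postcode": "LS1 4AP", "episode": "hip replacement"},
    {"nhs_number": "9434765870", "dob": "1975-09-30",
     "postcode": "SW1A 2AA", "episode": "appendectomy"},
]

SECRET_KEY = b"held-by-the-data-controller"  # hypothetical pseudonymisation key

def pseudonymise(record):
    """Replace the direct identifier with a keyed hash (a pseudonym).

    The result is pseudonymised, not anonymised: anyone holding the key
    can reverse the mapping by re-hashing candidate NHS numbers, and the
    dob/postcode pair may identify the patient all by itself.
    """
    out = dict(record)
    out["pseudonym"] = hmac.new(
        SECRET_KEY, record["nhs_number"].encode(), hashlib.sha256
    ).hexdigest()[:16]
    del out["nhs_number"]
    return out

pseudo = [pseudonymise(r) for r in records]
for r in pseudo:
    print(r["pseudonym"], r["dob"], r["postcode"], r["episode"])
```

The point of the sketch is simply that this sort of “de-identification” removes only the field that was easiest to remove; the quasi-identifiers that make linkage possible are left untouched.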

However, I’m now genuinely concerned by a statement from the ICO, in response to the news that they are to be given compulsory powers of audit of NHS bodies. They say (in the context of the GP data proposed to be uploaded under the care.data initiative)

The concerns around care.data come from this idea that the health service isn’t particularly good at looking after personal information

I’m not sure if they’re alluding to their own concerns, or the public’s, but I think the statement really misunderstands the public’s worries about care.data, and the use of medical data in general. From many, many discussions with people, and from reading more about this subject than is healthy, it seems to me that people object to their confidential medical information being made available to commercial organisations, for the latter’s potential profit, and that this concern stems from the possibility that such processing will lead to them being identified, and adversely affected as a result. If the ICO doesn’t understand this, then I really think they need to start listening. And that, of course, also goes for NHS England.

The Telegraph, meanwhile, reports on proposed new legal safeguards:

Jeremy Hunt will unveil new laws to ensure that medical records can only be released when there is a “clear health benefit” rather than for “purely commercial” use by insurers and other companies.

Ministers will also bolster criminal sanctions for organisations which breach data protection laws by disclosing people’s personal data. Under a “one strike and you’re out” approach, they will be permanently banned from accessing NHS data

One needs to be aware that this is just a newspaper report, and as far as I know it hasn’t been confirmed by the minister or anyone else in the government, but if it is accurate, I fear it shows further contempt for public concerns about the risks to the confidentiality of their medical records.

The first of the reported amendments sounds like statutory backing for the current assurances that patient data will only be made available to third parties for purposes that will benefit the health and social care system (see FAQ 39 of the Guide for GP Practices). It also sounds like a very difficult piece of legislation to draft, and it will be very interesting to see what the proposed amendment actually says. Will it allow secondary use for commercial purposes, as long as the primary use is for a “clear health benefit”? And, crucially, how on earth will it be regulated and enforced? Will properly resourced regulators be allowed to audit third parties’ use of data? I certainly hope so.

The second amendment implies that the Data Protection Act 1998 (DPA) will also be amended. This also sounds like a difficult provision to draft: the Telegraph says

Those that have committed even one prior offence involving patient data will be barred from accessing NHS medical records indefinitely as part of a “one strike and you’re out” approach

But what do we mean by “offence”? The Telegraph falls into the common error of thinking that the Information Commissioner’s Office’s (ICO’s) powers to serve monetary penalty notices (MPNs) of up to £500,000 are criminal justice powers; they are not – MPNs are civil notices, and the money paid is not a “fine” but a penalty. The only relevant current criminal offence in the DPA is that of (in terms) deliberately or recklessly obtaining or disclosing personal data without the authority of the data controller. This is an either-way offence, which means it currently carries a maximum sanction of a £5,000 fine in a magistrates’ court, or an unlimited fine in the Crown Court (though it is very rare for cases to be tried in the latter). Prosecutions under this section (55) are generally brought against individuals, because the offence involves obtaining or disclosing the data without the authority of the data controller; it is unlikely that a company would commit a section 55 offence. More likely is that a company would seriously contravene the DPA in a manner which would lead to a (civil) MPN, or to more informal ICO enforcement action. More likely still is simply that the ICO would make a finding of “unlikely to have complied” with the DPA, under section 42 – a finding which carries little weight. Is prior civil or informal action, or a section 42 “unlikely to have complied” assessment, going to count for the “one strike and you’re out” approach? And even if so, what is to stop miscreant individuals or companies functioning through proxies or agents, or even simply lying to get access to the data?

Noteworthy by its absence in the Telegraph reports of the proposed amendments was any reference to the one change to data protection law which actually might have a deterrent effect on those who illegally obtain or disclose personal data – the possibility of being sent to prison. As I and others have written before, all that is needed to achieve this is for the government to commence section 77 of the Criminal Justice and Immigration Act 2008, which would create the power to alter the penalty (including introducing a custodial sentence) for a section 55 DPA offence. However, the government has long been lobbied by certain sections of the press industry not to do so, because of apparent fears that it would give the state the power to imprison investigative journalists (despite the fact that section 78 of the same Act – also uncommenced – would create a new defence for journalistic, literary or artistic purposes). The Information Commissioner has repeatedly called for the law to be changed so that there is a real sanction for serious criminal data protection offences, but to no avail.

Chris Pounder has argued that the custodial sentence provisions (discussion of which was kicked into the long grass which grew up in the aftermath of the Leveson inquiry) might never be introduced. Despite the calls for such strong penalties for misuse of medical data, from influential voices such as Ben Goldacre, the proposals for change outlined by the Telegraph seem to support Dr Pounder’s view.

One of the main criticisms of the disastrous public relations and communications regarding the care.data initiative is that people’s acute concerns about the security of their medical records have been dismissed with vague or misleading reassurances. With the announcement of these vague and probably ineffectual proposed legal sanctions, what a damned shame that that looks to be continuing.

I’ve asked the ICO to assess whether the sale of millions of health records to insurance companies, so that they could “refine” their premiums, was compliant with the law.

I’m about to disclose some sensitive personal data: I have been to hospital a few times over recent years…along with 47 million other people, whose records from these visits, according to reports in the media, were sold to an actuarial society for insurance premium purposes. The Telegraph reports

a report by a major UK insurance society discloses that it was able to obtain 13 years of hospital data – covering 47 million patients – in order to help companies “refine” their premiums.

As a result they recommended an increase in the costs of policies for thousands of customers last year. The report by the Staple Inn Actuarial Society – a major organisation for UK insurers – details how it was able to use NHS data covering all hospital in-patient stays between 1997 and 2010 to track the medical histories of patients, identified by date of birth and postcode.

I don’t know if this use of my sensitive personal data (if it was indeed my personal data) was in compliance with the Data Protection Act 1998 (DPA) – although sadly I suspect that it was. But section 42 of the DPA allows a data subject to request that the Information Commissioner make an assessment as to whether it is likely or unlikely that processing has been or is being carried out in compliance with the provisions of the DPA. So that’s what I’ve done:

Hi

As a data subject with a number of hospital episodes over recent years I am disturbed to hear that the Hospital Episode Statistics (HES) of potentially 47 million patients were disclosed to Staple Inn Actuarial Society (SIAS), apparently for the purposes of helping insurance companies “refine” their premiums. I became aware of this through reports in the media (e.g. http://www.telegraph.co.uk/health/healthnews/10656893/Hospital-records-of-all-NHS-patients-sold-to-insurers.html). I am asking the ICO, pursuant to my right under section 42 of the Data Protection Act 1998, to assess whether various parts of this process were in compliance with the relevant data controllers’ obligations under the DPA:

1) I was not aware, until relatively recently, that HES data were provided to the HSCIC – was this disclosure by hospitals compliant with their DPA obligations?

2) Was the general processing (e.g. retention, manipulation, anonymisation, pseudonymisation) of this personal data compliant with HSCIC’s or, to the extent that HSCIC is a data processor to NHS England’s data controller, NHS England’s DPA obligations?

3) Was the disclosure of what appears to have been sensitive personal data (I note the broad definition of “personal data”, and your own guidance on anonymisation) to SIAS compliant with HSCIC’s (or NHS England’s) DPA obligations?

4) Was SIAS’s subsequent processing of this sensitive personal data compliant with its DPA obligations?

You will appreciate that I do not have access to some information, so it may be that when I refer to HSCIC or NHS England or SIAS I should refer to predecessor organisations.

Please let me know if you need any further information to make this assessment.

with best wishes, Jon Baines

We’ve been told on a number of occasions recently that we shouldn’t be worried about our GP records being uploaded to HSCIC under the care.data initiative, because our hospital records have been used in this way for so long. Clare Gerada, former Chair of the Council of the Royal College of General Practitioners, wrote in the BMJ that

for 25 years, hospital data have been handled securely with a suite of legal safeguards to protect confidentiality—the exact same safeguards that will continue to be applied when primary care data are added

Well, it seems to me that those legal safeguards might have failed to prevent (indeed, might have actively permitted) a breach involving 47 million records. I’m very interested to know what the Information Commissioner’s assessment will be.

UPDATE: 24 February 2014

An ICO spokesperson later said:

“We’re aware of this story, and will be gathering more information – specifically around whether the information had been anonymised – before deciding what action to take.”

The HSCIC, for its part, stated:

The HSCIC believes greater scrutiny should have been applied by our predecessor body prior to an instance where data was shared with an actuarial society

UPDATE: 27 February 2014

GP and Clinical Lecturer Anne Marie Cunningham has an excellent post on what types of data were apparently disclosed by NHSIC (or HSCIC), and subsequently processed by, or on behalf of, SIAS. I would recommend reading the comments as well. It does seem to me that we may still be talking about pseudonymised personal data, which would mean that the relevant data controllers still had obligations under the DPA, and the ICO would have jurisdiction to investigate, and, if necessary, take regulatory action.
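If that is right, the re-identification risk the Telegraph report alluded to – patients “identified by date of birth and postcode” – is easy to illustrate. A minimal sketch, again in Python and again with wholly invented data: an outsider who knows one person’s date of birth and postcode (both often publicly discoverable) can pick that person’s episode out of a pseudonymised release by a simple join on those quasi-identifiers.

```python
# Invented pseudonymised release: direct identifiers stripped, but the
# dob/postcode quasi-identifiers retained (as reportedly in the HES extract).
released = [
    {"pseudonym": "a1f3", "dob": "1961-04-12",
     "postcode": "LS1 4AP", "episode": "hip replacement"},
    {"pseudonym": "b7c9", "dob": "1975-09-30",
     "postcode": "SW1A 2AA", "episode": "appendectomy"},
    {"pseudonym": "d201", "dob": "1988-01-05",
     "postcode": "LS1 4AP", "episode": "fracture"},
]

# What an outsider might already know about a target (from an electoral
# roll, social media, etc.) -- all values here are made up.
known = {"dob": "1961-04-12", "postcode": "LS1 4AP"}

def link(release, quasi):
    """Return every released record matching the known quasi-identifiers."""
    return [r for r in release
            if r["dob"] == quasi["dob"] and r["postcode"] == quasi["postcode"]]

matches = link(released, known)
for m in matches:
    print(m["pseudonym"], m["episode"])  # the target's hospital episode
```

When the dob/postcode pair is unique within the release – very likely at full-postcode granularity – the single match reveals the target’s episode history, pseudonym notwithstanding, which is why such data can remain personal data in DPA terms.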

The ICO has kindly acknowledged receipt of my request for assessment, saying it has been passed to their health sector team for “further detailed consideration”.

UPDATE: 24 May 2014

Er, there is no real update. There was a slight hiccup when the ICO told me it was not making an assessment because “[it] is already aware of this issue and is investigating them accordingly. Given that we do not necessarily require individual complaints to take consider taking further action your case is closed” [sic]. After I queried the legal basis for failing to make a section 42 assessment as requested, the position was “clarified”:

…we will make an assessment in relation to this case, however we are unable to do so at this present time…This is because the office is currently investigating whether, as alleged in the media, actual personal data has been shared by the HSCIC to various other organisations including Staple Inn, PA consulting and Google

I don’t criticise the ICO for taking its time to investigate: the case involves a complicated assessment of whether the data disclosed was personal data. In a piece I wrote recently for the Society for Computers and Law I described the question of whether data is anonymous or not as a “profound debate”, and it is also highly complex. But what this delay in assessing just one aspect of health data disclosure does show is that the arbitrary six-month delay to the implementation of care.data was never going to be sufficient to deal with all the issues, and to assure the public, and medical practitioners, sufficiently to enable it to proceed. A vote on 23 May by the BMA’s Local Medical Committees conference emphatically illustrates this.