
The Court of Justice of the EU decided in Case C-210/16 Wirtschaftsakademie that Facebook and the administrator of a fan page created on Facebook are joint controllers under EU data protection law. The decision sent a mini shockwave through organizations that use Facebook Pages, just one week after the GDPR entered into force. What exactly does it mean that they are joint controllers, and what exactly do they have to do to be compliant? The judgment leaves these questions largely unanswered, but it offers some clues for finding answers.

Being a joint controller means that page administrators share responsibility with Facebook for complying with EU data protection law for the processing of personal data occurring through their Facebook Page. As the Court highlighted, they bear this responsibility even if they have no access at all to the personal data collected through cookies placed on the devices of visitors to the Facebook page, but only to the aggregated results of the data collection.

The judgment created a great deal of confusion. What has not yet been sufficiently emphasized in the reactions to the Wirtschaftsakademie judgment is that this shared responsibility is not equal: it depends on the stage of the processing the joint controller is involved in and on the actual control it has over the processing. This is, in any case, a better position to be in than that of a “controller” on whose behalf Facebook processes personal data, or a “co-controller” with Facebook, either of which would have meant full legal liability for complying with data protection obligations for the personal data processed through the page. It is, however, a worse position than being a third party or a recipient that is not involved in any way in establishing the purposes and means of the processing, which would have meant no legal responsibility for the data being processed through the page. Technically, those were the other options the Court probably weighed before taking the “joint controllership” path.

It is important to note that the Court did not specify at all who bears which responsibilities – not even with regard to providing notice. The failure of both Facebook and the page administrator to inform visitors about cookies being placed on their devices was the reason invoked by the DPA in the main national proceedings, but the Court remained silent on who is responsible for this obligation.

This summary looks at what the Court found, explains why it reached its conclusion, and tries to carve out some of the practical consequences of the judgment (also in relation to the GDPR).

This first part of the commentary on the judgment will only cover the findings related to “joint controllership”. The findings related to the competence of the German DPA will be analyzed in a second part. While the judgment interprets Directive 95/46, most of the findings will remain relevant under the GDPR as well, to the extent they interpret identical or very similar provisions of the two laws.

Facts of the Case

Wirtschaftsakademie is an organization that offers educational services and has a Facebook fan page. The Court noted that administrators of fan pages can obtain anonymous statistical information, made available to them free of charge. “That information is collected by means of evidence files (‘cookies’), each containing a unique user code, which are active for two years and are stored by Facebook on the hard disk of the computer or on other media of visitors to fan pages” (#15). The user code “is collected and processed when the fan pages are open” (#15).
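Purely to illustrate the mechanism the Court describes – a persistent cookie carrying a unique user code that stays on the visitor’s device for two years – here is a minimal Python sketch. The cookie name, the code format and the attributes are my assumptions for illustration, not Facebook’s actual implementation:

```python
from http.cookies import SimpleCookie
import secrets

TWO_YEARS = 2 * 365 * 24 * 60 * 60  # cookie lifetime in seconds ("active for two years", #15)

def build_visitor_cookie() -> SimpleCookie:
    # Hypothetical persistent cookie carrying a unique user code per visitor.
    cookie = SimpleCookie()
    cookie["user_code"] = secrets.token_hex(16)  # unique code identifying the device
    cookie["user_code"]["max-age"] = TWO_YEARS   # persists on the device for two years
    cookie["user_code"]["path"] = "/"
    return cookie

# The resulting Set-Cookie header would accompany any page embedding the plugin,
# so the code is "collected and processed when the fan pages are open" (#15).
print(build_visitor_cookie().output())
```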

The DPA of Schleswig-Holstein ordered Wirtschaftsakademie to close the fan page if it was not brought into compliance, on the ground that “neither Wirtschaftsakademie, nor Facebook, informed visitors to the Fan Page that Facebook, by means of cookies, collected personal data concerning them and then processed the data” (#16).

Wirtschaftsakademie challenged the DPA’s decision, arguing that “it was not responsible under data protection law for the processing of the data by Facebook or the cookies which Facebook installed” (#16). After the DPA lost in the lower instances, it appealed those rulings to the Federal Administrative Court, arguing that the main data protection breach by Wirtschaftsakademie was the fact that it commissioned “an inappropriate supplier” because the supplier “did not comply with data protection law” (#22).

The Federal Administrative Court sent several questions for a preliminary ruling to the CJEU, aiming to clarify whether Wirtschaftsakademie indeed had any legal responsibility for the cookies placed by Facebook through its Fan Page and whether the Schleswig-Holstein DPA had competence to enforce German data protection law against Facebook, considering that Facebook’s main establishment in the EU is in Ireland and its German presence is linked only to marketing (#24).

“High level of protection” and “effective and complete protection”

The Court starts its analysis by referring again to the aim of the Directive to “ensure a high level of protection of fundamental rights and freedoms, and in particular their right to privacy in respect to processing of personal data” (#26) – and it is to be expected that all analyses under the GDPR would start from the same point. This means that all interpretation of the general data protection law regime will be done in favor of protecting the fundamental rights of data subjects.

Based on the findings in Google Spain, the Court restates that “effective and complete protection of the persons concerned” requires a “broad definition of controller” (#28). Effective and complete protection is another criterion that the Court often takes into account when interpreting data protection law in favor of the individual and his or her rights.

In fact, one of the Court’s afterthoughts, after establishing that the administrator is a joint controller, was that “the recognition of joint responsibility of the operator of the social network and the administrator of a fan page hosted on that network in relation to the processing of the personal data of visitors to that page contributes to ensuring more complete protection of the rights of persons visiting a fan page” (#42).

The referring Court did not even consider the possibility that the administrator is a controller

Having set the stage in this way, the Court goes on to analyze the definition of “controller”. To be noted, though, that the referring Court never asked whether the administrator of the fan page is a controller or a joint controller, but asked whether it has any legal responsibility for failing to choose a compliant “operator of its information offering” while being an “entity that does not control the data processing within the meaning of Article 2(d) of Directive 95/46” (#24, question 1).

It seems that the referring Court did not even contemplate that the fan page administrator might have any control over the data, but was wondering whether only “controllers” have legal responsibility to comply with data protection law under Directive 95/46, or whether other entities somehow involved in the processing could also bear some responsibility.

However, the Court does not exclude the possibility that the administrator may be a controller. First of all, it establishes that processing of personal data is taking place, as described at #15, and that the processing has at least one controller.

Facebook is “primarily” establishing means and purposes of the processing

The Court recalls the definition of “controller” in Article 2(d) of the Directive and highlights that “the concept does not necessarily refer to a single entity and may concern several actors taking part in that processing, with each of them then being subject to the applicable data protection provisions” (#29). The distribution of responsibilities in the last part of this finding is introduced by the Court without any such reference appearing in Article 2(d)[1].

This is important, because the next finding of the Court is that, in the present case, “Facebook Ireland must be regarded as primarily determining the purposes and means of processing the personal data of users of Facebook and persons visiting the fan pages hosted on Facebook” (#30). Reading this paragraph together with #29 suggests that Facebook will bear the bigger share of the obligations in a joint controllership situation with fan page administrators.

This idea is underlined by the following paragraph, which refers to identifying the “extent” to which a fan page administrator “contributes… to determining, jointly with Facebook Ireland and Facebook Inc., the purposes and means of processing” (#31). To answer this question, the Court lays out its arguments in three layers:

1) It describes the processing of personal data at issue, mapping the data flows – pointing to the personal data being processed, data subjects and all entities involved:

The data processing at issue (placing of cookies on the Fan Page visitors’ device) is “essentially carried out by Facebook” (#33);

Facebook “receives, registers and processes” the information stored in the placed cookies not only when a visitor visits the Fan Page, but also when he or she visits services provided by other Facebook family companies and by “other companies that use the Facebook services” (#33);

Facebook partners and “even third parties” may use cookies to provide services to Facebook or the businesses that advertise on Facebook (#33);

The creation of a fan page “involves the definition of parameters by the administrator, depending inter alia on the target audience … , which has an influence on the processing of personal data for the purpose of producing statistics based on visits to the fan page” (#36);

The administrator can request the “processing of demographic data relating to its target audience, including trends in terms of age, sex, relationship and occupation”, lifestyle, location, online behavior, which tell the administrator where to make special offers and better target the information it offers (#37);

The audience statistics compiled by Facebook are transmitted to the administrator “only in anonymized form” (#38);

The production of the anonymous statistics “is based on the prior collection, by means of cookies installed by Facebook …, and the processing of personal data of (the fan page) visitors for such statistical purposes” (#38);

2) It identifies the purposes of this processing:

There are two purposes of the processing:

“to enable Facebook to improve its system of advertising transmitted via its network” and

“to enable the fan page administrator to obtain statistics produced by Facebook from the visits of the page”, which is useful for “managing the promotion of its activity and making it aware of the profiles of the visitors who like its fan page or use its applications, so that it can offer them more relevant content” (#34);

3) It establishes a connection between the two entities that define the two purposes of processing:

Creating a fan page “gives Facebook the opportunity to place cookies on the computer or other device of a person visiting its fan page, whether or not that person has a Facebook account” (#35);

The administrator may “define the criteria in accordance with which the statistics are to be drawn up and even designate the categories of persons whose personal data is to be made use of by Facebook”, “with the help of filters made available by Facebook” (#36);

Therefore, the administrator “contributes to the processing of the personal data of visitors to its page” (#36);

One key point: not all joint controllers must have access to the personal data being processed

In what is the most impactful finding of this judgment, the Court uses one of the old general principles of interpreting and applying the law, ubi lex non distinguit, nec nos distinguere debemus, and states that “Directive 95/46 does not, where several operators are jointly responsible for the same processing, require each of them to have access to the personal data concerned” (#38). Therefore, the fact that administrators have access only to anonymized data has no impact on the existence of their legal responsibility as joint controllers, since what matters is establishing the purposes and means of the processing and the fact that at least one of the entities involved has access to and processes personal data. The fact that they only have access to anonymized data should nonetheless matter when establishing the degree of responsibility.
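To make the anonymized-aggregates point concrete, here is a minimal, hypothetical Python sketch of how per-visitor records (personal data held only by the platform) can be reduced to the kind of aggregated statistics a page administrator receives; all field names and values are invented:

```python
from collections import Counter

# Hypothetical raw visit log held only by the platform: each record carries a
# unique user code, i.e. personal data within the meaning of the Directive.
raw_visits = [
    {"user_code": "a1f3", "age_band": "25-34", "city": "Kiel"},
    {"user_code": "b7c9", "age_band": "25-34", "city": "Hamburg"},
    {"user_code": "d2e8", "age_band": "35-44", "city": "Kiel"},
]

def page_insights(events):
    # Aggregate per-user events into the anonymized statistics the administrator sees.
    return {
        "total_visits": len(events),
        "by_age_band": dict(Counter(e["age_band"] for e in events)),
        "by_city": dict(Counter(e["city"] for e in events)),
    }

print(page_insights(raw_visits))  # no user codes survive the aggregation
```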

Hence, after describing the involvement of fan page administrators in the processing at issue – in particular their role in defining parameters for the processing depending on their target audience and in determining the purposes of the processing – the Court finds that “the administrator must be categorized, in the present case, as a controller responsible for that processing within the European Union, jointly with Facebook Ireland” (#39).

Enhanced responsibility for non-users visiting the page

The Court also made the point that fan pages can be visited by non-users of Facebook. The implication is that, were it not for the existence of the specific fan page they accessed while looking for information related to its administrator, Facebook would not be able to place cookies on their devices and process personal data about them for its own purposes and for the purposes of the fan page. “In that case, the fan page administrator’s responsibility for the processing of the personal data of those persons appears to be even greater, as the mere consultation of the home page by visitors automatically starts the processing of their personal data” (#42).

Jointly responsible, not equally responsible

Finally, after establishing that there is joint controllership and joint responsibility, the Court makes the very important point that the responsibility is not equal and it depends on the degree of involvement of the joint controller in the processing activity:

“The existence of joint responsibility does not necessarily imply equal responsibility of the various operators involved in the processing of personal data. On the contrary, those operators may be involved at different stages of that processing of personal data and to different degrees, so that the level of responsibility of each of them must be assessed with regard to all the relevant circumstances of the particular case” (#43).

Comments and conclusions

In the present case, the Court found early in the judgment that Facebook “primarily” establishes the means and purposes of the processing. This means that it is primarily responsible for compliance with data protection obligations. At the same time, the administrator of the fan page has responsibility to comply with some data protection provisions, as joint controller. The Court did not clarify, however, what exactly the administrator of the fan page must do in order to be compliant.

For instance, the Court does not analyze how the administrator complies or fails to comply with the Directive in this case – therefore, assuming that the judgment requires administrators to provide data protection notice is wrong. The lack of notice was a finding of the DPA in the initial proceedings. Moreover, the DPA ordered Wirtschaftsakademie to close its Facebook page because it found that neither Facebook nor the page administrator had informed visitors about the cookies being placed on their devices (#16).

The CJEU merely establishes that the administrator is a joint controller and that it shares responsibility for compliance with Facebook depending on the degree of their involvement in the processing.

The only clear message from the Court with regard to the extent of the administrator’s legal responsibility as joint controller is that it has enhanced responsibility towards visitors of the fan page who are not Facebook users. This being said, it is very likely that informing data subjects is one of the GDPR obligations that could fall on the shoulders of fan page administrators if Facebook does not step up and provide notice, since they can edit the interface with visitors to a certain extent.

Another message that is not so clear, but can be extracted from the judgment, is that the degree of responsibility of the joint controllers “must be assessed with regard to all the relevant circumstances of the particular case” (#43). This could mean that if the two joint controllers were to enter into a joint controllership agreement (as the GDPR now requires), the Courts and DPAs may be called upon to look at the reality of the processing in order to determine the responsibilities each of them bears, so as to avoid a situation where the joint controller primarily responsible for establishing means and purposes contractually distributes obligations to the other joint controller that the latter could not possibly comply with.

As for the relevance of these findings under the GDPR, the entire “joint controllership” part of the judgment is very likely to remain relevant, considering that the language the Court interpreted from Directive 95/46 is very similar to the language used in the GDPR (see Article 2(d) of the Directive and Article 4(7) GDPR). However, the GDPR does add a level of complexity to the situation of joint controllers, in Article 26. The Court could, in time, add to this jurisprudence an analysis of the extent to which the joint controllership agreement required by Article 26 is relevant to establishing the level of responsibility of a joint controller.

Given that the GDPR requires joint controllers to determine in a transparent manner their respective responsibilities for compliance through an arrangement, one consequence of the judgment is that such an arrangement should be concluded between Facebook and fan page administrators (Article 26(1) GDPR). The essence of the arrangement must then be made available to visitors of fan pages (Article 26(2) GDPR).

However, there is one obligation under the GDPR that, when read together with the findings of the Court, results in a conundrum. Article 26(3) GDPR provides that the data subject may exercise his or her rights “in respect of and against each of the controllers”, regardless of how the responsibility is shared contractually between them. In the case at hand, the Court acknowledges that the administrator only has access to anonymized data. This means that even if data subjects were to make, for example, a request for access to or erasure of data to the administrator, it would not be in a position to satisfy such requests. A possibility is that any request made to a joint controller that does not have access to the data will be forwarded by the latter to the joint controller that does have access (what is important is that the data subject has a point of contact and, in the end, someone against whom they can claim their rights). This is yet another reason why a written agreement establishing the responsibility of each joint controller is useful. Practice will solve the conundrum, ultimately, with DPAs and national Courts likely playing their part.
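One possible shape of such a forwarding arrangement, sketched in Python purely as an illustration (the controller roles and the has_data_access flag are my assumptions, not taken from the judgment or from Article 26):

```python
# Hypothetical joint controllership arrangement: the party without access to
# the underlying data forwards data subject requests to the party with access.
CONTROLLERS = [
    {"name": "fan page administrator", "has_data_access": False},
    {"name": "platform operator", "has_data_access": True},
]

def route_subject_request(request_type: str) -> str:
    # Article 26(3) lets the data subject address either joint controller;
    # internally, only the controller holding the data can act on the request.
    for controller in CONTROLLERS:
        if controller["has_data_access"]:
            return f"'{request_type}' request forwarded to the {controller['name']}"
    raise LookupError("no joint controller with access to the underlying data")

print(route_subject_request("erasure"))
```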

[1] “(d) ‘controller’ shall mean the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data; where the purposes and means of processing are determined by national or Community laws or regulations, the controller or the specific criteria for his nomination may be designated by national or Community law;”

The nature of the digital economy is such that it will force the creation of multi-competent supervisory authorities sooner rather than later. What if the European Data Protection Board were to become, in the next 10 to 15 years, an EU Digital Regulator, looking at matters concerning data protection, consumer protection and competition law, with “personal data” as the common thread? This is the vision Giovanni Buttarelli, the European Data Protection Supervisor, laid out last week in a conversation we had at the IAPP Data Protection Congress in Brussels.

The conversation was a one-hour session in front of an over-crowded room in The Arc, a cozy amphitheater-like venue conducive to bold ideas being expressed in a stimulating exchange.

To begin with, I reminded the Supervisor that at the very beginning of his mandate, in early 2015, he published the 5-year strategy of the EDPS. At that time the GDPR had not yet been adopted and the Internet of Things was taking off. Big Data had been a big thing for a while, and questions about the feasibility and effectiveness of a legal regime centered on each data item that can be traced back to an individual were popping up. The Supervisor wrote in his Strategy that the benefits brought by new technologies should not come at the expense of the fundamental rights of individuals and their dignity in the digital society.

“Big data will need equally big data protection”, he wrote then, thus suggesting that the answer to Big Data is not less data protection, but enhanced data protection.

I asked the Supervisor if he thinks that the GDPR is the “big data protection” he was expecting or whether we need something more than what the GDPR provides for. And the answer was that “the GDPR is only one piece of the puzzle”. Another piece of the puzzle will be the ePrivacy reform, and another one will be the reform of the regulation that provides data protection rules for the EU institutions and that creates the legal basis for the functioning of the EDPS. I also understood from our exchange that a big part of the puzzle will be effective enforcement of these rules.

The curious fate of the European Data Protection Board

One centerpiece of enforcement is the future European Data Protection Board, which is currently being set up in Brussels so as to be functional on 25 May 2018, when the GDPR becomes applicable. The European Data Protection Board will be a unique EU body: it will have a European nature, being funded by the EU budget, but it will be composed of commissioners from national data protection authorities, who will adopt its decisions, and it will rely for its day-to-day activity on a European Secretariat. The Secretariat of the Board will be provided by dedicated staff of the European Data Protection Supervisor.

The Supervisor told the audience that he has either already hired or plans to hire a total of “17 geeks” to add to his staff, most of whom will be part of the European Data Protection Board Secretariat. The EDPB will be functional from Day 1 and, apparently, there are plans for some sort of inauguration of the EDPB, celebrated at midnight as 24 May turns into 25 May next year.

These are my thoughts here: the nature of the EDPB is as unique as the nature of the EU (those of you who studied EU Law certainly remember from law school days how we were told that the EU is a sui generis type of economic and political organisation). In fact, the EDPB may very well serve as a test model for ensuring supervision and enforcement in other EU policy areas. The European Commission could test the waters to see whether such a mixed national/European enforcement mechanism is feasible.

There is a lot of pressure on effective enforcement when it comes to the GDPR. We dwelled on enforcement, and one question that inevitably appeared was about the trend taking shape in Europe of competition authorities and consumer protection authorities engaging in investigations together with, or in parallel with, data protection authorities (see here, here and here).

“It’s time for a big change, and time for the EU to have a global approach”, the Supervisor said – a change that will require some legislative action. “I’m not saying we will need a European FTC (the US Federal Trade Commission), but we will need a Digital EU Regulator”, he added. This Digital Regulator would have the power to also look into competition and consumer protection issues raised by the processing of personal data, in addition to data protection issues. Acknowledging that these days there is legislative fatigue in Brussels surrounding privacy and data protection, the Supervisor said he will not bring this idea to the attention of the EU legislator right now. But he certainly plans to do so, maybe even as soon as next year. The Supervisor thinks that the EDPB could morph into this kind of Digital Regulator sometime in the future.

Enhanced global enforcement initiatives

Another question that had to be asked on enforcement was whether we should expect more concentrated and coordinated action of privacy commissioners on a global scale, in GPEN-like structures. The Supervisor revealed that the privacy commissioners that meet for the annual International Conference are “trying to complete an exercise about our future”. They are currently analyzing the idea of creating an entity with legal personality that will look into global enforcement cases.

Ethics comes on top of legal compliance

Another topic the conversation turned to was “ethics”. The EDPS has been at the forefront of bringing the ethics approach into privacy and data protection law debates, by creating the Ethics Advisory Group at the beginning of 2016. I asked the Supervisor whether there is a danger that, by bringing such a volatile concept into the realm of data protection, companies would see an opportunity to circumvent strict compliance and rely on self-assessments that their uses of data are ethical.

“Ethics comes on top of data protection law implementation”, the Supervisor explained. As I understand it, ethics is brought into the data protection realm only after a controller or processor is already compliant with the law: when they face a choice between equally lawful decisions, they should rely on ethics to make the right one.

We did discuss other things during this session, including the 2018 International Conference of Privacy Commissioners that will take place in Brussels, and the Supervisor received some interesting questions from the audience at the end, including about the Privacy Shield. But a blog post can only be so long.

Note: The Supervisor’s quotes in this blog post are short because, as the moderator, I did my best to follow the discussion and steer it rather than take notes. So the quotes come from the brief notes I managed to take during the conversation.

The Spanish Data Protection Authority announced today that it fined Facebook 1.2 million euros for several breaches of the Spanish Data Protection Law. Here’s a brief note in English from Politico.eu and the full press release of the Spanish DPA (in ES).

To my knowledge, this is the biggest fine issued by a Data Protection Authority in Europe for breaches of data protection law (as always, please correct me in the comments below and I will make the changes. UPDATE: It’s worth noting that the Italian Garante, in an investigation conducted in conjunction with the Guardia di Finanza – a specialised body investigating financial crime – issued in February this year a fine totalling 5.8 million euros to a company that was transferring money from Italy to China on behalf of persons without their knowledge, which also meant that it was processing personal data without consent. The total was reached by adding up the fines for unlawfully processing the data of every person affected).

According to the press release (please note that all quotes are unofficial translations, made by me, and must not be relied on for legal advice. UPDATE: An official press release is now available in English):

Personal data on political views, religious beliefs, sex, personal preferences or location are collected directly, via the mere interaction of the data subject with Facebook services or with third-party webpages, without clearly informing the user about the use and the purposes of collecting this data.

Facebook does not obtain unequivocal, specific and informed consent from users to process their data, because it does not properly inform data subjects.

Each of the two serious breaches was fined 300,000 EUR, and the very serious breach was fined 600,000 EUR.

The very serious breach was that “the social network processes special categories of data for marketing purposes, among others, without obtaining explicit consent of users, as requested by the data protection law”.

“The investigation allowed to prove that Facebook does not inform users in an exhaustive and clear manner about the data that they are going to collect and the processing operations they are going to engage in with that data, limiting themselves to only giving some examples. In particular, the social network collects other data derived from the interaction carried out by users, both on the platform itself and on third-party websites, without them being able to clearly perceive the data that Facebook collects about them, or the purposes for which the data is collected”, according to the press release.

The DPA also took into account that “users are not informed on how their data are processed through the use of cookies – some of them used exclusively for marketing purposes and some of them used for a purpose that the company categorised as ‘secret’ – when they are accessing web pages that are not of the company but that contain the ‘Like’ button”. The DPA also mentions the situation of users who are not registered with the social platform but at some point visit one of the platform’s pages: their data is also retained by the social network.

The DPA also found that “the privacy policy contains general formulations that are not clear, and it obliges the user to access a multitude of links to be able to read it”. On the one hand, the DPA notes, a Facebook user with an average knowledge of how new technology works is not able to grasp the full extent of the collection of data, how it is subsequently used, or why it is used. On the other hand, non-users are not at all able to be aware of how their data is used.

Finally, the DPA also referred to the fact that it was able to prove that Facebook does not delete the data it collects on the basis of users’ online browsing habits, retaining it and reusing it in association with the same user. “Concerning data retention, when a user deletes their account and asks for deletion of the data, Facebook retains and processes the data for another 17 months through a cookie. This is why the DPA considers that the personal data of users are not completely deleted either when they stop being necessary for the purposes for which they were collected, or when the user explicitly requires their deletion.”

This decision shows, yet again, how important transparency towards the data subject is! As you will also see soon in my commentary on the Barbulescu v Romania judgment of the ECtHR Grand Chamber from last week, correctly and fully informing the data subject is key to data protection compliance.

AG Kokott delivered her Opinion on 20 July in Case C-434/16 Nowak v Data Protection Commissioner, concluding that “a handwritten examination script capable of being ascribed to an examination candidate, including any corrections made by examiners that it may contain, constitutes personal data within the meaning of Article 2(a) of Directive 95/46/EC” (Note: all highlights in this post are mine).
This is a really exciting Opinion because it provides insight into:

how the same ‘data item’ can be personal data of two distinct data subjects (examiners and examinees),

what constitutes a “filing system” of personal data processed otherwise than by automated means.

But also because it technically (even if not literally) invites the Court to change its case-law on the definition of personal data, and specifically the finding that information consisting in a legal assessment of facts related to an individual does not qualify as personal data (see C-141/12 and C-372/12 YS and Others).

The proceedings were initially brought in front of the Irish Courts by Mr Nowak, who, after failing an exam organised by a professional association of accountants (CAI) four times, asked for access to see his exam sheet on the basis of the right to access his own personal data. Mr Nowak submitted a request to access all his personal data held by CAI and received 17 items, none of which was the exam sheet. He then submitted a complaint to the Irish Data Protection Commissioner, who decided not to investigate it, arguing that an exam sheet is not personal data. The decision not to investigate on this ground was challenged in front of a Court. Once the case reached the Irish Supreme Court, it was referred to the Court of Justice of the EU to clarify whether an exam sheet falls under the definition of “personal data” (§9 to §14).

Analysis relevant both for Directive 95/46 and for the GDPR

Yet again, AG Kokott refers to the GDPR in her Conclusions, clarifying that “although the Data Protection Directive will shortly be repealed by the General Data Protection Regulation, which is not yet applicable, the latter will not affect the concept of personal data. Therefore, this request for a preliminary ruling is also of importance for the future application of the EU’s data protection legislation” (m.h.).

The nature of an exam paper is “strictly personal and individual”

First, the AG observes that “the scope of the Data Protection Directive is very wide and the personal data covered by the Directive is varied” (§18).

The Irish DPC argued that an exam script is not personal data because “examination exercises are normally formulated in abstract terms or relate to hypothetical situations”, which means that “answers to them are not liable to contain any information relating to an identified or identifiable individual” (§19).

This view was not followed by the AG, who explained that it is incongruent with the purpose of an exam. “In every case“, she wrote, “the aim of an examination — as opposed, for example, to a representative survey — is not to obtain information that is independent of an individual. Rather, it is intended to identify and record the performance of a particular individual, i.e. the examination candidate” (§24; m.h.). Therefore, “every examination aims to determine the strictly personal and individual performance of an examination candidate. There is a good reason why the unjustified use in examinations of work that is not one’s own is severely punished as attempted deception” (§24; m.h.).

What about exam papers identified by codes?

In a clear indication that pseudonymized data are personal data, the AG further noted that an exam script is personal data also in those cases where instead of bearing the examination candidate’s name, the script has an identification number or bar code: “Under Article 2(a) of the Data Protection Directive, it is sufficient for the existence of personal information that the data subject may at least be indirectly identified. Thus, at least where the examination candidate asks for the script from the organisation that held the examination, that organisation can identify him by means of the identification number” (§28).
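A minimal, hypothetical Python sketch of the point made at §28: a script labelled only with a code remains indirectly identifying, because the examining body holds the key linking codes to candidates (the code values and the storage layout are invented for illustration):

```python
# What the examiners see: scripts labelled only with an identification code.
scripts = {"C-1042": "handwritten answers ..."}

# The key linking codes to candidates, held separately by the examining body.
candidate_key = {"C-1042": "P. Nowak"}

def reidentify(script_code: str) -> str:
    # The organisation that held the examination can identify the candidate
    # by means of the identification number (§28).
    return candidate_key[script_code]

print(reidentify("C-1042"))  # pseudonymized data is therefore still personal data
```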

Characteristics of handwriting, personal data themselves

The AG accepted Mr Nowak’s argument that answers to an exam that are handwritten “contain additional information about the examination candidate, namely about his handwriting” (§29). Therefore, the characteristics of the handwriting are personal data themselves. The AG explains that “a script that is handwritten is thus, in practice, a handwriting sample that could at least potentially be used at a later date as evidence to determine whether another text was also written in the examination candidate’s writing. It may thus provide indications of the identity of the author of the script” (§29). According to the AG, it is not relevant whether such a handwriting sample is a suitable means of identifying the writer beyond doubt: “Many other items of personal data are equally incapable, in isolation, of allowing the identification of individuals beyond doubt” (§30).

Classifying information as ‘personal data’ is a stand-alone exercise (it does not depend on whether rights can be exercised)

The Irish DPC argued that one of the reasons why exam scripts are not personal data in this case is that the “purpose” of the right of access and the right to rectification of personal data precludes them from being “personal data” (§31). The DPC pointed to Recital 41 of Directive 95/46, which specifies that any person must be able to exercise the right of access to data relating to him which is being processed, in order to verify in particular the accuracy of the data and the lawfulness of the processing. “The examination candidate will seek the correction of incorrect examination answers”, the argument goes (§31).

AG Kokott rebuts this argument by acknowledging that the classification of information as personal data “cannot be dependent on whether there are specific provisions about access to this information” or on eventual problems with rectification of data (§34). “If those factors were regarded as determinative, certain personal data could be excluded from the entire protective system of the Data Protection Directive, even though the rules applicable in their place do not ensure equivalent protection but fragmentary protection at best” (§34).

Even if the classification of information as “personal data” depended in any way on the purpose of the right of access, the AG makes it clear that this purpose is not strictly linked to rectification, blocking or erasure: “data subjects generally have a legitimate interest in finding out what information about them is processed by the controller” (§39). This finding is backed up by the use of “in particular” in Recital 41 of the Directive (§39).

The purpose of processing and… the passage of time, both relevant for obtaining access, rectification

After clarifying that it is irrelevant what an individual wants to do with their data once accessed (see also the summary below on the ‘abuse of rights’), AG Kokott explains that a legitimate interest in correcting exam-script-related data is conceivable.

She starts from the premise that “the accuracy and completeness of personal data pursuant to Article 6(1)(d) must be judged by reference to the purpose for which the data was collected and processed” (§35). The AG further identifies the purpose of an exam script as determining “the knowledge and skills of the examination candidate at the time of the examination, which is revealed precisely by his examination performance and particularly by the errors in the examination” (§35). “The existence of errors in the solution does not therefore mean that the personal data incorporated in the script is inaccurate”, the AG concludes (§35).

Rectification could be achieved if, for instance, “the script of another examination candidate had been ascribed to the data subject, which could be shown by means of, inter alia, the handwriting, or if parts of the script had been lost” (§36).

The AG also found that the legitimate interest of the individual to have access to their own data is strengthened by the passage of time, to the extent that their recollection of the contents of their answer is likely to be considerably weaker a few years after the exam. This makes it possible that “a genuine need for information, for whatever reasons, will be reflected in a possible request for access. In addition, there is greater uncertainty with the passing of time — in particular, once any time limits for complaints and checks have expired — about whether the script is still being retained. In such circumstances the examination candidate must at least be able to find out whether his script is still being retained” (§41).

Is Mr Nowak abusing his right of access under data protection law?

AG Kokott recalls the CJEU’s case-law on “abuse of rights” and the double test required by the Court to identify whether there has been any abuse of rights in a particular case (C-423/15 Kratzer and the case-law cited there at §38 to §40), which can be summed up as (§44):

i) has the purpose of the EU legislation in question been misused?

ii) is the essential aim of the transaction to obtain an undue advantage?

The DPC submitted during the procedure that if exam scripts were considered personal data, “a misuse of the aim of the Data Protection Directive would arise in so far as a right of access under data protection legislation would allow circumvention of the rules governing the examination procedure and objections to examination decisions” (§45).

The AG considers that “any alleged circumvention of the procedure for the examination and objections to the examination results via the right of access laid down by data protection legislation would have to be dealt with using the provisions of the Data Protection Directive”, and she specifically refers to the restrictions to the right of access laid down in Article 13 of the Directive with the aim “to protect certain interests specified therein” (§46). She also points out that if restricting access to exam scripts cannot be circumscribed to those exceptions, then “it must be recognised that the legislature has given precedence to the data protection requirements which are anchored in fundamental rights over any other interests affected in a specific instance” (§47).

The AG also looks at the exceptions to the right of access under the GDPR and finds that it is more nuanced than the Directive in this regard. “First, under Article 15(4) of the regulation, the right to obtain a copy of personal data is not to adversely affect the rights and freedoms of others. Second, Article 23 of the regulation sets out the grounds for a restriction of data protection guarantees in slightly broader terms than Article 13 of the Directive, since, in particular, protection of other important objectives of general public interest of the Union or of a Member State pursuant to Article 23(1)(e) of the regulation may justify restrictions” (§48).

However, it seems that she does not regard this slight broadening of the scope of exemptions in the GDPR as justifying the idea of an abuse of rights in this particular case.

The AG also argues that “on the other hand, the mere existence of other national legislation that also deals with access to examination scripts is not sufficient to allow the assumption that the purpose of the Directive is being misused” (§49). She concludes that even if such misuse would be conceivable, the second limb of the “abuse of rights” test would not be satisfied: “it is still not apparent where the undue advantage lies if an examination candidate were to obtain access to his script via his right of access. In particular, no abuse can be identified in the fact that someone obtains information via the right of access which he could not otherwise have obtained” (§50).

The examiner’s corrections on the exam script are the examinee’s personal data and the examiner’s own personal data at the same time

The AG looks into whether any corrections made by the examiner on the examination script are also personal data with respect to the examination candidate (a question raised by some of the parties), even though she considers that the answer will not impact the result of the main proceedings (§52, §53).

It is apparent that the facts of this case resemble those of YS and Others, where the Court refused to extend the right of access to the draft legal analysis of an asylum application, on the ground that doing so would not serve the purpose of the Data Protection Directive but would establish a right of access to administrative documents. The Court argued in YS that such an analysis “is not information relating to the applicant for a residence permit, but at most information about the assessment and application by the competent authority of the law to the applicant’s situation” (§59; see YS and Others, §40). The AG considers that the cases are similar only “at first glance”. But she does not convincingly differentiate between the two cases in the arguments that follow.

However, she is convincing when explaining why the examiner’s corrections are “personal data”. AG Kokott explains that the purpose of the comments made by examiners on an exam script is “the evaluation of the examination performance and thus they relate indirectly to the examination candidate” (§61). It does not matter that the examiners don’t know the identity of the examination candidate who produced the script, as long as the candidate can be easily identified by the organisation holding the examination (§60 and §61).

The AG further adds that “comments on an examination script are typically inseparable from the script itself … because they would not have any informative value without it” (§62). And it is “precisely because of that close link between the examination script and any corrections made on it”, that “the latter also are personal data of the examination candidate pursuant to Article 2(a) of the Data Protection Directive” (§63).

In an important statement, the AG considers that “the possibility of circumventing the examination complaint procedure is not, by contrast, a reason for excluding the application of data protection legislation” (§64). “The fact that there may, at the same time, be additional legislation governing access to certain information is not capable of superseding data protection legislation. At most it would be admissible for the individuals concerned to be directed to the simultaneously existing rights of information, provided that these could be effectively claimed” (§64).

Finally, the AG points out “for the sake of completeness” that “corrections made by the examiner are, at the same time, his personal data”. AG Kokott sees the potential conflict between the right of the candidate to access their personal data and the right of the examiners to protect their personal data and underlines that the examiner’s rights “are an appropriate basis in principle for justifying restrictions to the right of access pursuant to Article 13(1)(g) of the Data Protection Directive if they outweigh the legitimate interests of the examination candidate” (§65).

The AG considers that “the definitive resolution to this potential conflict of interests is likely to be the destruction of the corrected script once it is no longer possible to carry out a subsequent check of the examination procedure because of the lapse of time” (§65).

An exam script forms part of a filing system

One last consideration made by AG Kokott is whether processing of an exam script would possibly fall outside the scope of Directive 95/46, considering that it does not seem to be processed using automated means (§66, §67).

The AG points out that the Directive also applies to personal data processed otherwise than by automated means as long as they form part of a “filing system”, even if this “filing system” is not electronically saved (§69).

“This concept covers any structured set of personal data which is accessible according to specific criteria. A physical set of examination scripts in paper form ordered alphabetically or according to other criteria meets those requirements” (§69), concludes the AG.
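As a toy illustration of that definition, here is a hypothetical Python sketch of an alphabetically ordered paper archive treated as a “filing system”: records are retrievable according to a specific criterion, the candidate’s name (all names and labels are invented):

```python
import bisect

# A set of paper scripts ordered alphabetically by candidate name: a structured
# set of personal data accessible according to a specific criterion (§69).
archive = sorted([
    ("Adams", "script, autumn sitting"),
    ("Murphy", "script, autumn sitting"),
    ("Nowak", "script, autumn sitting"),
])

def retrieve(name: str):
    # Retrieval by the ordering criterion, like leafing through sorted folders.
    i = bisect.bisect_left(archive, (name,))
    if i < len(archive) and archive[i][0] == name:
        return archive[i][1]
    return None

print(retrieve("Nowak"))
```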

Conclusion. What will the Court say?

The Conclusions of AG Kokott in Nowak contain a thorough analysis, which brings several dimensions to the data protection debate that have been rarely considered by Courts – the self-standing importance of the right of access to one’s own data (beyond any ‘utilitarianism’ of needing it to obtain something else), the relevance of passage of time for the effectiveness of data protection rights, the limits of the critique that data protection rights may be used to achieve other purposes than data protection per se, the complexity of one data item being personal data of two different individuals (and the competing interests of those two individuals).

The Court will probably closely follow the Conclusions of the AG for most of the points she raised.

The only contentious point will be the classification of an examiner’s corrections as personal data of the examined candidate, because following the AG will mean that the Court would reverse its case-law from YS and Others.

If we apply the criteria developed by AG Kokott in this Opinion, it is quite clear that the analysis concerning YS and their request for asylum is personal data: the legal analysis is closely linked to the facts concerning YS and the other asylum applicants, and the fact that there may be additional legislation governing access to certain information (administrative procedures in the case of YS) is not capable of superseding data protection legislation. Moreover, if we add the argument that access to one’s own personal data is valuable in itself and does not need to serve any other purpose, reversing this case-law is even more likely.

The only arguable difference between this case and YS and Others is that, unlike what the AG found in §62 (“comments on an examination script are typically inseparable from the script itself… because they would not have any informative value without it”), it is conceivable that a legal analysis in general may have value by itself. However, a legal analysis of particular facts is devoid of value when applied to different individual facts. In this sense, a legal analysis can also be considered inseparable from the particular facts it assesses. What would be relevant in classifying it as personal data would then remain the identifiability of the person that the particularities refer to…

I was never convinced by the argumentation of the Court (or AG Sharpston, for that matter) in YS and Others, and I would welcome either a reversal of this case-law (which would be compatible with what I expected the outcome of YS to be) or a more convincing argumentation as to why such an analysis/assessment of an identified person’s specific situation is not personal data. However, I am not getting my hopes up. As AG Kokott observed, the issue in the main proceedings can be solved without getting into this particular detail. In any case, I will be looking forward to this judgment.

The draft Report prepared by MEP Marju Lauristin for the LIBE Committee containing amendments to the ePrivacy Regulation was published last week on the website of the European Parliament.

The MEP announced she will be presenting the Report to her colleagues in the LIBE Committee on 21 June. The draft Report will need to be adopted first by the LIBE Committee and at a later stage by the Plenary of the European Parliament. The Parliament will then sit in the trilogue together with the European Commission and the Council (once the latter also adopts an amended text), finding the compromise among the three versions of the text.

Overall, the proposed amendments strengthen privacy protections for individuals. The big debate of whether there should be an additional exemption to confidentiality of communications based on the legitimate interest of service providers and other parties to have access to electronic communications data was solved in the sense that no such exemption was proposed (following calls in this sense by the Article 29 Working Party, the European Data Protection Supervisor and a team of independent academics). The draft report also contains strong wording to support end-to-end encryption, as well as support for Do-Not-Track technology and a new definition of the principle of confidentiality of communications in the age of the Internet of Things.

Without pretending this is a comprehensive analysis, here are 20 points that caught my eye after a first reading of the amendments (added text is bolded and italicised):

1) Clarity regarding what legitimate grounds for processing prevail if both the GDPR and the ePrivacy Reg could apply to a processing operation: those of the ePrivacy Reg. (“Processing of electronic communications data by providers of electronic communications services should only be permitted in accordance with, and on a legal ground specifically provided for under, this Regulation” – Recital 5). The amendment to Recital 5 further clarifies the relationship between the GDPR and the ePrivacy Reg, specifying that the ePrivacy Reg “aims to provide additional and complementary safeguards taking into account the need for additional protection as regards the confidentiality of communications”.

2) The regulation should be applicable not only to information “related to” the terminal equipment of end-users, but also to information “processed by” it. (“…and to information related to or processed by the terminal equipment of end-users” – Article 2; see also the text proposed for Article 3(1)(c)). This clarifies the material scope of the Regulation, leaving less room for interpretation of what “information related to” means.

3) The link to the definitions of the Electronic Communications Code is removed. References to those definitions are replaced by self-standing definitions for “electronic communications network”, “electronic communications service”, “interpersonal communications service”, “number-based interpersonal communications service”, “number-independent interpersonal communications service” and “end-user”. For instance, the new definition proposed for “electronic communications service” is “a service provided via electronic communications networks, whether for remuneration or not, which encompasses one or more of the following: an ‘internet access service’ as defined in Article 2(2) of Regulation (EU) 2015/2120; an interpersonal communications service; a service consisting wholly or mainly in the conveyance of the signals, such as a transmission service used for the provision of a machine-to-machine service and for broadcasting, but excludes information conveyed as part of a broadcasting service to the public over an electronic communications network or service except to the extent that the information can be related to the identifiable subscriber or user receiving the information” (Amendment 49; my underline).

4) Limitation of the personal scope of key provisions of the Regulation to natural persons. The draft report proposes two definitions to delineate the personal scope of the Regulation – “end-users” and “users”. While an “end-user” is defined as “a legal entity or a natural person using or requesting a publicly available electronic communications service”, a “user” is defined as “any natural person using a publicly available electronic communications service (…)”. Key provisions of the Regulation are applicable only to users, especially the proposed principle of confidentiality of communications (see Amendments 58 and 59). This proposal may unnecessarily limit the scope of application of the right to respect for private life, which, as opposed to the right to the protection of personal data, is theoretically recognised (the CJEU has not yet explicitly stated this) as also protecting the privacy and confidentiality of communications of legal persons (through correspondence with Article 8 ECHR and the way it has been interpreted by the European Court of Human Rights; for an analysis, see p. 17 and following HERE). The current ePrivacy Directive protects the confidentiality of communications of natural and legal persons equally.

5) Enhanced definition of “electronic communications metadata”, to also include “data broadcasted or emitted by the terminal equipment to identify users’ communications and/or the terminal equipment or its location and enable it to connect to a network or to another device“.

6) Enhanced definition of “direct marketing”, to also include advertising in video format, in addition to the written and oral formats, and advertising served or presented to persons, not only “sent”. Could this mean that the definition of direct marketing will cover street advertising panels reacting to passers-by? Possibly.

7) Extension of the principle of confidentiality of communications to machine-to-machine communications. A new paragraph is added to Article 5 (Amendment 59): “Confidentiality of electronic communications shall also include terminal equipment and machine-to-machine communications when related to a user”.

8) “Permitted” processing of electronic communications data is replaced by “lawful” processing. This change of wording de-emphasises the character of “exemptions to a principle” that the permitted processing had relative to the general principle of confidentiality. This may have consequences when Courts interpret the law.

9) While the proposed wording for the existing lawful grounds for processing is stricter (processing is allowed “only if”; necessity is replaced with “technically strict necessity”), additional grounds for processing are added (see Amendments 64 to 66, to Article 6; see also Amendments 77, 79 and 80, to Article 8).

10) A “household exemption” is introduced, similar to the one provided for by the GDPR, enhanced with a “work purposes exemption”: “For the provision of a service explicitly requested by a user of an electronic communications service for their purely individual or individual work related usage (…)”. In such circumstances, electronic communications data may be processed “without the consent of all users”, but “only where such requested processing produces effects solely in relation to the user who requested the service” and “does not adversely affect the fundamental rights of another user or users”. This exemption raises some questions, the first of which is: does anyone use an electronic communications service for purposes other than “purely individual” or “work related” purposes? If you think so, leave a comment with examples. Another question is what “without the consent of all users” means (see Amendment 71, to Article 6).

11) An exception for tracking employees is included in the proposal. The collection of information from a user’s terminal equipment (for instance, via cookies) would be permitted “if it is necessary in the context of employment relationships”, but only to the extent that the employee is using equipment made available by the employer and that this monitoring “is strictly necessary for the functioning of the equipment by the employee” (see Amendment 82). It remains to be seen what “functioning of the equipment by the employee” means. This exemption seems to have the same effect as the one in Article 8(1)(a), which allows such collection of information if “it is strictly technically necessary for the sole purpose of carrying out the transmission of an electronic communication over an electronic communications network”. On the other hand, it should be kept in mind that the ePrivacy rules are not intended to apply to closed groups of end-users, such as corporate intranet networks, access to which is limited to members of an organisation (see Recital 13 and Amendment 11).

12) Consent for collecting information from terminal equipment “shall not be mandatory to access the service”. This means, for instance, that even if users do not consent to the placement of cookies tracking their activity online, they should still be allowed to access the service they are requesting. While this could be seen as a mere consequence of consent having to be “freely given”, enshrining the wording in a legal provision leaves no room for interpretation (see Amendment 78, to Article 8). Moreover, this safeguard is strengthened by a rule introduced as a separate paragraph of Article 8, according to which “No user shall be denied access to any information society service or functionality, regardless of whether this service is remunerated or not, on grounds that he or she has not given his or her consent under Article 8(1)(b) to the processing of personal information and/or the use of storage capabilities of his or her terminal equipment that is not necessary for the provision of that service or functionality.” (see Amendment 83). Such wording would probably put to rest concerns that personal data could be treated as “counter-performance” (equivalent to money) for services.

13) All further use of electronic communications data collected under ePrivacy rules is prohibited. A new paragraph inserted in Article 6 simply states that “Neither providers of electronic communications services, nor any other party, shall further process electronic communications data collected on the basis of this Regulation” (see Amendment 72).

14) Wi-Fi tracking and similar practices involving the collection of information emitted by terminal equipment would only be possible with the informed consent of the user or if the data are anonymised and the risks are adequately mitigated (a third exception is, of course, when such data are accessed for the purpose of establishing a connection). This is a significant change compared to the Commission’s text, which allowed such tracking in principle, provided the user is informed and given the possibility to opt out (“stop or minimise the collection”) (see Amendments 85 and 86). The draft report also proposes a new paragraph to Article 8 containing measures to mitigate risks, including collecting data only for the purpose of “statistical counting”, anonymisation or deletion of data “immediately after the purpose is fulfilled”, and effective opt-out possibilities.
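By way of illustration only, here is a minimal sketch (in Python) of the kind of anonymised “statistical counting” the draft report appears to have in mind: unique devices are counted through a salted one-way hash of the MAC address, and both the hashes and the salt are discarded as soon as the counting window closes. The class name, the hashing scheme and the salt rotation are my own assumptions, not taken from the text of the report.

```python
import hashlib
import secrets

class FootfallCounter:
    """Counts unique devices per window without retaining raw MAC addresses."""

    def __init__(self):
        self._salt = secrets.token_bytes(16)  # ephemeral salt, never persisted
        self._seen: set[str] = set()

    def observe(self, mac_address: str) -> None:
        # One-way, salted hash: the raw identifier itself is never stored.
        digest = hashlib.sha256(self._salt + mac_address.encode()).hexdigest()
        self._seen.add(digest)

    def close_window(self) -> int:
        """Return the count and delete everything once the purpose is fulfilled."""
        count = len(self._seen)
        self._seen.clear()
        self._salt = secrets.token_bytes(16)  # new salt: past hashes become unlinkable
        return count
```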

15) Significantly stronger obligations for privacy by default are proposed, with a clear preference for a Do-Not-Track mechanism. Article 10 is enhanced so that all software placed on the market must “by default, offer privacy protective settings to prevent other parties from storing information on the terminal equipment of a user and from processing information already stored on that equipment” (see Amendment 95). Opt-outs shall be available upon installation (Amendment 96). What is remarkable is a new obligation that “the settings shall include a signal which is sent to the other parties to inform them about the user’s privacy settings. These settings shall be binding on, and enforceable against, any other party” (see Amendment 99). The rapporteur explains at the end of the Report that “the settings should allow for granulation of consent by the user, taking into account the functionality of cookies and tracking techniques and DNTs should send signals to the other parties informing them of the user’s privacy settings. Compliance with these settings should be legally binding and enforceable against all other parties”.
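To make the signal mechanism more concrete, here is a minimal sketch of how a website could honour such a browser setting, using the long-existing DNT HTTP header (where “DNT: 1” expresses the user’s objection to tracking) purely as an illustration; the draft report does not prescribe any particular header or protocol.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # "DNT: 1" signals that the user has opted out of tracking.
        tracking_allowed = self.headers.get("DNT") != "1"
        body = b"analytics enabled" if tracking_allowed else b"analytics disabled"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```

Under the proposed Amendment 99, respecting such a signal would no longer be a courtesy but a binding, enforceable obligation.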

16) A national Do Not Call register is proposed for opting out of unsolicited voice-to-voice marketing calls (see Amendment 111).

17) End-to-end encryption is proposed as a default security measure for ensuring the confidentiality of communications. Additionally, strong wording is included to prevent Member States from introducing measures amounting to backdoors. Under the title “integrity of the communications and information about security risks”, Article 17 is amended to include a new paragraph that states: “The providers of electronic communications services shall ensure that there is sufficient protection in place against unauthorised access or alterations to the electronic communications data, and that the confidentiality and safety of the transmission are also guaranteed by the nature of the means of transmission used or by state-of-the-art end-to-end encryption of the electronic communications data. Furthermore, when encryption of electronic communications data is used, decryption, reverse engineering or monitoring of such communications shall be prohibited. Member States shall not impose any obligations on electronic communications service providers that would result in the weakening of the security and encryption of their networks and services” (my highlight) (see Amendment 116).
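For readers less familiar with the concept, here is a minimal sketch of the property at stake, assuming the PyNaCl library (Python bindings to libsodium); nothing in the amendment prescribes a particular scheme, so this is only an illustration. Because only the endpoints hold the private keys, a provider relaying the ciphertext cannot read the communication.

```python
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"confidential communication")

# Only Bob, holding his private key, can decrypt; the relaying
# service provider sees only the ciphertext.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"confidential communication"
```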

18) The possibility of class actions for infringement of the ePrivacy Regulation is introduced. End-users would have “the right to mandate a not-for-profit body, organisation or association” to lodge complaints or to seek judicial remedies on their behalf (see Amendments 125 and 126).

19) Infringements of the obligations covered by Article 8 (cookies, Wi-Fi tracking) would be sanctioned with the first tier of fines (the highest one: up to EUR 20 million or 4% of global annual turnover), which is not the case in the Commission’s proposal (see Amendment 131).

As a bonus, here’s the 20th highlight:

20) Echoing the debate over data analytics using psychographic measurements to influence elections, the report amends an important recital (Recital 20) to refer to the fact that information on terminal equipment may reveal “very sensitive data”, including “details of the behaviour, psychological features, emotional condition and political and social preferences of an individual”. Among other reasons, this justifies the principle that any interference with the user’s terminal equipment should be allowed only with the user’s consent and for specific and transparent purposes (see Amendment 20; also, watch this video from last week’s Digital Assembly in Malta, where at around minute 6 the Rapporteur talks about this and points out that “without privacy there will be no democracy”).

The Court of Justice of the EU received questions for a preliminary ruling from Finland regarding the practice of a religious group (Jehovah’s Witnesses) to gather and record data after door-to-door visits, without informing the individuals concerned about this practice. The questions referred in Case C-25/17 Tietosuojavaltuutettu v Jehovah’s Witnesses concern the interpretation of several key points of Directive 95/46:

Exceptions from the application of the Directive, and particularly the second indent of Article 3(2), which excludes processing “by a natural person in the course of a purely personal or household activity” from the material scope of the Directive. The referring court wants the CJEU to clarify whether this exception applies to the gathering of data and the writing of observations in a paper file in connection with the door-to-door activity by members of the religious group (Question 1).

The concept of “filing system” as defined in Article 2(c) of the Directive. The question referred by the national court is whether, taken as a whole, the manual collection of personal data (name, address and other information and characteristics of a person) carried out in connection with door-to-door evangelical work constitutes a filing system, thus being subject to the application of the Directive (Question 2).

The concept of “controller” under Article 2(d) of the Directive. In particular, the referring court wants the CJEU to clarify whether in this situation the controller is considered to be the religious community as a whole, “even though the religious community claims that only the individual members carrying out evangelical work have access to the data collected” (Questions 3 and 4).

Without knowing the details of the case, and based only on the information available in the questions referred by the national Court, here is my bet on how the CJEU will reply:

The definition of “purely household activity” does not extend to the door-to-door evangelical work of a religious community; this exemption is to be interpreted strictly (“must be narrowly construed”; “must apply only in so far as is strictly necessary”), according to the CJEU in C-212/13 Rynes (§28 and §29). The CJEU also explained that this exception applies “only where it is carried out in the purely personal or household setting of the person processing the data” (§31), which is not the case for representatives of a religious community gathering information during evangelical work.

The records the evangelical workers keep should be considered as constituting a “filing system”. This concept is defined as “any structured set of personal data which are accessible according to specific criteria, whether centralized, decentralized or dispersed on a functional or geographical basis”. According to Recital 15 of the Directive, data in a filing system is “structured according to specific criteria relating to individuals, so as to permit easy access to the personal data in question”. If the religious community were to claim that its records are not structured according to specific criteria (e.g. ZIP codes; members of the community/non-members; individuals interested in the community/individuals not interested) and that they do not allow easy access to the personal data in question, then the purpose of keeping a detailed record would not be achieved. In other words, having an unstructured file is incongruent with the purpose of the activity. While it is true that Member States have been given a margin of appreciation to lay down different criteria for determining the constituents of a structured set of personal data and the different criteria governing access to such a set, those criteria must be compatible with the definition in the Directive. Moreover, applying the definition “loosely” would amount to a limitation of the protection of personal data, and such limitations must apply “only in so far as is strictly necessary” (Rynes §28, DRI §52).

The controller of this processing operation should be considered to be the religious community, as this entity establishes the purposes of the processing activity (the records are probably meant to facilitate the evangelical work of the community; there is no reference in the questions to the declared purpose of this activity, but it is only logical that such records are kept to facilitate the evangelical work) and the means of this activity (“by dividing up the areas in which the activity is carried out among members involved in evangelical work, supervising the work of those members and maintaining a list of individuals who do not wish to receive visits from evangelists”, according to the referring court).

Since this new case provided an opportunity to discuss processing of personal data done by a religious community, there are a couple of additional points to be made.

First of all, according to Recital 35 of the Directive, “processing of personal data by official authorities for achieving aims, laid down in constitutional law or international public law, of officially recognized religious associations is carried out on important grounds of public interest”. This means that religious associations do not need to rely on consent or on their legitimate interest as the lawful ground for processing. However, relying on public interest as the lawful ground for processing does not exempt them from all the other obligations under data protection law. For instance, they still have to comply with the data quality principles, they still have to inform data subjects about the details of the processing activity and they still have to reply to requests for access, correction and erasure.

Second, some of the data gathered in such circumstances is sensitive data, as it refers to “religious beliefs” (Article 8 of the Directive, Article 9 of the GDPR). This means that the data should be processed with additional care and strengthened safeguards.

In case you are wondering whether the GDPR specifically addresses processing of data by religious communities and churches, Recital 35 of the Directive was transplanted to the GDPR, in Recital 55. In addition, the GDPR enshrines a specific provision that covers “existing data protection rules of churches and religious associations”: Article 91. This provision allows Member States that have specific legislation (“comprehensive rules”) dedicated to churches and religious communities in place at the time of entry into force of the GDPR to continue to apply those rules, but only if “they are brought into line with this Regulation”. In addition, according to the second paragraph, processing of personal data by churches and religious associations that apply comprehensive national rules pursuant to the first paragraph “shall be subject to the supervision of an independent supervisory authority, which may be specific”. Again, the condition for this is that the specific supervisory authority must fulfil the requirements laid down for independent supervisory authorities in the GDPR.

***

Note: Thanks to Dr. Mihaela Mazilu-Babel for pointing out this new case.

Mapping all processing activities (the proposed step goes far beyond data mapping, as it refers to the processing operations themselves, not only to the data being processed; it also covers cataloguing the purposes of the processing operations and identifying all sub-contractors relevant to them);

Prioritising the compliance actions to be taken, using the Register as a starting point and structuring the actions on the basis of the risks the processing operations pose to the rights and freedoms of the individuals whose data are processed. Such actions could be, for instance, making sure that only the personal data necessary to achieve the envisaged purposes are processed, or revising/updating the notice given to the individuals whose data are processed (Articles 12, 13 and 14 of the Regulation);

Managing the risks, which means conducting DPIAs for all envisaged processing operations that may result in a high risk to the rights of individuals. The CNIL mentions that the DPIA should be carried out before collecting personal data and before putting the processing operation in place, and that it should contain a description of the processing operation and its purposes; an assessment of the necessity and proportionality of the proposed processing operation; and an estimation of the risks posed to the rights and freedoms of the data subjects, together with the measures proposed to address these risks in order to ensure compliance with the GDPR.

Organising internal procedures that ensure continuous data protection compliance, taking into account all possible events in the lifecycle of a processing operation. The procedures could cover handling complaints, ensuring data protection by design, preparing for possible data breaches and creating a training programme for employees.

Finally, and quite importantly, Documenting compliance. “The actions taken and documents drafted for each step should be reviewed and updated periodically in order to ensure continuous data protection”, according to the CNIL. The French DPA provides a list of documents that should be part of the “GDPR compliance file”, such as the Register of processing operations and the contracts with processors.
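Purely as an illustration of what one entry in such a Register could capture, here is a minimal sketch in Python. The field names loosely mirror the items listed in Article 30 GDPR; the structure and the sample values are my own assumptions, not the CNIL’s template.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingRecord:
    name: str                    # e.g. "Newsletter"
    purposes: list[str]          # why the data is processed
    data_categories: list[str]   # what data is processed
    data_subjects: list[str]     # whose data is processed
    recipients: list[str]        # including processors/sub-contractors
    retention: str               # how long the data is kept
    security_measures: list[str] = field(default_factory=list)

record = ProcessingRecord(
    name="Newsletter",
    purposes=["direct marketing"],
    data_categories=["email address"],
    data_subjects=["subscribers"],
    recipients=["email service provider (processor)"],
    retention="until unsubscription",
    security_measures=["encryption at rest", "access control"],
)
```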

While this guidance is certainly helpful, it should be borne in mind that the only EU-wide official guidance is that adopted by the Article 29 Working Party. For the moment, the Working Party has published three Guidelines on the application of the GDPR: on the role of the DPO, on the right to data portability and on identifying the lead supervisory authority. The Group is expected to adopt guidance on Data Protection Impact Assessments during its next plenary.

If you are interested in other guidance issued by individual DPAs, here are some links:

The ICO, on consent under the GDPR (the draft is under consultation until 31 March, with the aim for the final draft to be published in May);

The recent judgment of the CJEU in Case C-398/15 Manni (9 March 2017) brings a couple of significant points to the EU data protection case-law:

Clarifies that an individual seeking to limit access to his/her personal data published in a Companies Register does not have the right to obtain erasure of that data, not even after his/her company has ceased to exist;

Clarifies that that individual nevertheless has the right to object to the processing of that data, based on his/her particular circumstances and on justified grounds;

Clarifies the link between the purpose of the processing activity and the data retention period, and underlines how important the purpose of the processing activity is when analysing whether a data subject can obtain erasure or blocking of data;

Provides insight into the balancing exercise between the interests of third parties in having access to data published in the Companies Register and the rights of the individual to obtain erasure of the data and to object to its processing.

This commentary will highlight all points enumerated above.

1. Facts of the case

Mr Manni had requested his regional Chamber of Commerce to erase his personal data from the public Companies Register after he found out that he was losing clients who performed background checks on him through a private company specialised in finding information in that register. This happened because Mr Manni had been the administrator of a company that had been declared bankrupt more than ten years before the facts in the main proceedings. In fact, the former company itself had been struck off the register (§23 to §29).

2. The question in Manni

The question that the CJEU had to answer in Manni was whether the obligation of Member States to keep public Companies Registers[1] and the requirement that personal data must be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected[2] must be interpreted as meaning that individuals must be allowed to “request the authority responsible for maintaining the Companies Register to limit, after a certain period has elapsed from the dissolution of the company concerned and on the basis of a case-by-case assessment, access to personal data concerning them and entered in that register” (§30).

3. Applicability of Directive 95/46

First, the CJEU clarified that its analysis does not concern the processing of data by the specialised rating company, and only refers to the obligations of the public authority keeping the companies register (§31). Second, the CJEU ascertained that the provisions of the DPD are applicable in this case:

the identification data of Mr Manni recorded in the Register is personal data[3] – “the fact that information was provided as part of a professional activity does not mean that it cannot be characterized as personal data” (§34);

the authority keeping the register is a “controller”[4] that carries out “processing of personal data”[5] by “transcribing and keeping that information in the register and communicating it, where appropriate, on request to third parties” (§35).

4. The role of the data quality principles and the legitimate grounds for processing in ensuring a high level of protection of fundamental rights

Further, the CJEU recalls its case-law stating that the DPD “seeks to ensure a high level of protection of the fundamental rights and freedoms of natural persons” (§37) and that the provisions of the DPD “must necessarily be interpreted in the light of the fundamental rights guaranteed by the Charter”, especially Article 7 (respect for private life) and Article 8 (protection of personal data) (§39). The Court recalls the content of Articles 7 and 8 and specifically lays out that the requirements under Article 8 of the Charter “are implemented inter alia in Articles 6, 7, 12, 14 and 28 of Directive 95/46” (§40).

The Court highlights the significance of the data quality principles and the legitimate grounds for processing under the DPD in the context of ensuring a high level of protection of fundamental rights:

“[S]ubject to the exceptions permitted under Article 13 of that directive, all processing of personal data must comply, first, with the principles relating to data quality set out in Article 6 of the directive and, secondly, with one of the criteria for making data processing legitimate listed in Article 7 of the directive” (§41 and case-law cited).

The Court applies this test in reverse order, which is, indeed, more logical: a processing activity should first be found legitimate under one of the lawful grounds for processing, and only after ascertaining that this is the case should the question of compliance with the data quality principles arise.

The CJEU finds that in the case at hand the processing activity is legitimized by three lawful grounds (§42, §43):

compliance with a legal obligation [Article 7(c)];

the exercise of official authority or the performance of a task carried out in the public interest [Article 7(e)] and

the realization of a legitimate interest pursued by the controller or by the third parties to whom the data are disclosed [Article 7(f)].

5. The link between the data retention principle, the right to erasure and the right to object

Article 6(1)(e) of the DPD requires that personal data be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or for which they are further processed. This means that controllers should retain personal data only for as long as it serves the purpose for which it was processed, and should then anonymise, erase or otherwise make that data unavailable. If the controller does not comply with this obligation, the data subject has two possible avenues to stop the processing: asking for erasure of the data, or objecting to the processing on the basis of his or her particular situation and justified grounds. (A hypothetical sketch of how a controller could operationalise this retention requirement follows at the end of this section.)

CJEU explains that “in the event of failure to comply with the condition laid down in Article 6(1)(e)” of the DPD, “Member States guarantee the person concerned, pursuant to Article 12(b) thereof, the right to obtain from the controller, as appropriate, the erasure or blocking of the data concerned” (§46 and C-131/12 Google/Spain §70).

In addition, the Court explains, Member States also must “grant the data subject the right, inter alia in the cases referred to in Article 7(e) and (f) of that directive, to object at any time on compelling legitimate grounds relating to his particular situation to the processing of data relating to him, save where otherwise provided by national legislation”, pursuant to Article 14(a) DPD (§47).

The CJEU further explains that “the balancing to be carried out under subparagraph (a) of the first paragraph of Article 14 … enables account to be taken in a more specific manner of all the circumstances surrounding the data subject’s particular situation. Where there is a justified objection, the processing instigated by the controller may no longer involve those data” (§47).
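As announced above, here is a hypothetical sketch of how a controller could operationalise the Article 6(1)(e) retention requirement: each record carries the purpose for which it was collected, and a periodic job erases whatever has outlived the retention period attached to that purpose. The table and column names, and the retention periods, are invented for the example; timestamps are assumed to be stored as ISO-8601 UTC strings.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Illustrative retention periods per purpose of processing.
RETENTION = {
    "newsletter": timedelta(days=365),
    "support_ticket": timedelta(days=90),
}

def purge_expired(conn: sqlite3.Connection) -> int:
    """Delete personal data kept longer than its purpose requires."""
    deleted = 0
    now = datetime.now(timezone.utc)
    for purpose, max_age in RETENTION.items():
        cutoff = (now - max_age).isoformat()
        cur = conn.execute(
            "DELETE FROM personal_data WHERE purpose = ? AND collected_at < ?",
            (purpose, cutoff),
        )
        deleted += cur.rowcount
    conn.commit()
    return deleted
```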

6. The pivotal role of the purpose of the processing activity in granting the right to erasure and the right to object

After establishing these general rules, the Court decides that in order to establish whether data subjects have the “right to apply to the authority responsible for keeping the register to erase or block the personal data entered in that register after a certain period of time, or to restrict access to it, it is first necessary to ascertain the purpose of that registration” (§48).

The pivotal role of the purpose of the processing operation should not come as a surprise, given the fact that the data retention principle is tightly linked to accomplishing the purpose of the processing operation.

In this case, the Court looked closely at Directive 68/151 and explained at length that the purpose of the disclosure provided for by it is “to protect in particular the interests of third parties in relation to joint stock companies and limited liability companies, since the only safeguards they offer to third parties are their assets” (§49) and “to guarantee legal certainty in relation to dealings between companies and third parties in view of the intensification of trade between Member States” (§50). CJEU also referred to primary EU law, and specifically to Article 54(3)(g) EEC, one of the legal bases of the directive, which “refers to the need to protect the interests of third parties generally, without distinguishing or excluding any categories falling within the ambit of that term” (§51).

The Court further noted that Directive 68/151 makes no express provision regarding the necessity of keeping personal data in the Companies Register “also after the activity has ceased and the company concerned has been dissolved” (§52). However, the Court notes that “it is common ground that even after the dissolution of a company, rights and legal relations relating to it continue to exist” (§53) and “questions requiring such data may arise for many years after a company has ceased to exist” (§54).

Finally, CJEU declared:

“in view of the range of possible scenarios … it seems impossible, at present, to identify a single time limit, as from the dissolution of a company, at the end of which the inclusion of such data in the register and their disclosure would no longer be necessary” (§55).

7. Conclusion A: there is no right to erasure

The Court concluded that “in those circumstances” the data retention principle in Article 6(1)(e) DPD and the right to erasure in Article 12(b) DPD do not guarantee for the data subjects referred to in Directive 68/151 a right to obtain “as a matter of principle, after a certain period of time from the dissolution of the company concerned, the erasure of personal data concerning them” (§56).

After already reaching this conclusion, the Court also explained that this interpretation of the provisions in question does not result in “disproportionate interference with the fundamental rights of the persons concerned, and particularly their right to respect for private life and their right to protection of personal data as guaranteed by Articles 7 and 8 of the Charter” (§57).

To this end, the Court took into account:

that Directive 68/151 requires “disclosure only for a limited number of personal data items” (§58) and

that “it appears justified that natural persons who choose to participate in trade through such a company are required to disclose the data relating to their identity and functions within that company, especially since they are aware of that requirement when they decide to engage in such activity” (§59).

8. Conclusion B: but there is a right to object

After acknowledging that, in principle, the need to protect the interests of third parties in relation to joint-stock companies and limited liability companies and to ensure legal certainty, fair trading and thus the proper functioning of the internal market takes precedence over the right of the data subject to object under Article 14 DPD, the Court points out that

“it cannot be excluded, however, that there may be specific situations in which the overriding and legitimate reasons relating to the specific case of the person concerned justify exceptionally that access to personal data entered in the register is limited, upon expiry of a sufficiently long period after the dissolution of the company in question, to third parties who can demonstrate a specific interest in their consultation” (§60).

While the Court leaves it to the national courts to assess each case “having regard to all the relevant circumstances and taking into account the time elapsed since the dissolution of the company concerned”, it also points out that, in the case of Mr Manni, “the mere fact that, allegedly, the properties of a tourist complex built … do not sell because of the fact that potential purchasers of those properties have access to that data in the company register, cannot be regarded as constituting such a reason, in particular in view of the legitimate interest of those purchasers in having that information” (§63).

9. Post Scriptum

The Court took a very pragmatic approach in dealing with the case of Mr Manni. The principles of interpretation it laid down are solid: such an analysis indeed requires looking at the legitimate grounds for processing and the relevant data quality principle. The strong emphasis the Court placed on the significance of the purpose of the processing activity is welcome, just like the additional guidance on the balancing exercise between the rights and interests in question. In addition, the separate assessment of the right to obtain erasure and of the right to object is very helpful with a view to the future, when the GDPR and its strengthened data subject rights will become applicable.

The aspect of the judgment that leaves some room for improvement is the analysis of the proportionality of the interference that the virtually unlimited publication of personal data in the Companies Register creates with Articles 7 and 8 of the Charter. The Court does tackle this, but only lightly, and it brings two arguments only after having already declared that the interference is not disproportionate. Moreover, the Court does not distinguish between interferences with Article 7 and interferences with Article 8.

Finally, I was happy to see that the predicted outcome of the case, as announced in the pdpEcho commentary on the Opinion of the Advocate General Bot, proved to be mainly correct: “the Court will follow the AG’s Opinion to a large extent. However, it may be more focused on the fundamental rights aspect of balancing the two Directives and it may actually analyse the content of the right to erasure and its exceptions. The outcome, however, is likely to be the same.”

Suggested citation: G. Zanfir-Fortuna, “CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object”, pdpEcho.com, 13 March 2017.

The Conseil d’Etat announced today that it referred several questions to the Court of Justice of the EU concerning the interpretation of the right to be forgotten, pursuant to Directive 95/46 and following the CJEU’s landmark decision in the Google v Spain case.

The questions were raised within proceedings involving the application of four individuals to the Conseil d’Etat to have decisions issued by the CNIL (French DPA) quashed. These decisions rejected their requests for injunctions against Google to have certain Google Search results delisted.

According to the press release of the Conseil d’Etat, “these requests were aimed at removing links relating to various pieces of information: a video that explicitly revealed the nature of the relationship that an applicant was deemed to have entertained with a person holding a public office; a press article relating to the suicide committed by a member of the Church of Scientology, mentioning that one of the applicants was the public relations manager of that Church; various articles relating to criminal proceedings concerning an applicant; and articles relating the conviction of another applicant for having sexually aggressed minors”.

The Conseil d’Etat further explained that, in order to rule on these claims, it has deemed it necessary to answer a number of questions “raising serious issues with regard to the interpretation of European law in the light of the European Court of Justice’s judgment in its Google Spain case.

Such issues are in relation with the obligations applying to the operator of a search engine with regard to web pages that contain sensitive data, when collecting and processing such information is illegal or very narrowly framed by legislation, on the grounds of its content relating to sexual orientations, political, religious or philosophical opinions, criminal offences, convictions or safety measures. On that point, the cases brought before the Conseil d’Etat raise questions in close connection with the obligations that lie on the operator of a search engine, when such information is embedded in a press article or when the content that relates to it is false or incomplete”.

A new case received by the General Court of the CJEU was published in the Official Journal of the EU in February, Case T-881/16 HJ v EMA.

A British citizen seeks to engage the non-contractual liability of the European Medicines Agency for breaching data protection law. The applicant claims that “the documents in his personal file, which were made public and accessible to any member of staff of the European Medicines Agency for a period of time, were not processed fairly and lawfully but were processed for purposes other than those for which they were collected without that change in purpose having been expressly authorised by the applicant”.

Further, the applicant claims that “the dissemination of that sensitive data consequently called into question the applicant’s integrity, causing him real and certain non-material harm”.

The applicant asks the Court to “order the defendant to pay the applicant the symbolic sum of EUR 1 by way of compensation for the non-material harm suffered”.

Although the published summary does not mention the applicable law, it is clear that Regulation 45/2001, the data protection regulation applicable to EU institutions and bodies (EMA is an EU body), is relevant in this case. The rules of Regulation 45/2001 are fairly similar to those of Directive 95/46.
