Category Archives: EU Data News

When are they needed? How are they done?

Next year, under the new GDPR data protection legislation, Privacy Impact Assessments will become known as Data Protection Impact Assessments, and will be mandatory rather than merely recommended.

The ICO currently describes PIAs as “a tool which can help organisations identify the most effective way to comply with their data protection obligations and meet individuals’ expectations of privacy.”

While the soon-to-be-rechristened DPIAs will be legally required, data controllers should continue to embrace them as opportunities to avert heavy fines, reputational damage and the other risks associated with data breaches from an early stage in any planned operation.

When will a DPIA be legally required?

Organisations will be required to carry out a DPIA when data processing is “likely to result in a high risk to the rights and freedoms of individuals.” This may apply to an existing project, or before a planned project involving data processing that carries a risk to the rights of individuals as provided by the Data Protection Act. DPIAs can also range in scope, depending on the organisation and the scale of its project.

DPIAs will therefore be required when an organisation is planning an operation that could affect anyone’s right to privacy: broadly speaking, anyone’s right ‘to be left alone.’ DPIAs are primarily designed to allow organisations to avoid breaching an individual’s freedom to “control, edit, manage or delete information about themselves and to decide how and to what extent such information is communicated to others.” If there is a risk of any such breach, a DPIA must be carried out.

Listed below are examples of projects, varying in scale, in which the current PIA is advised – and it is safe to assume all of these examples will necessitate a DPIA after the GDPR comes into force:

A new IT system for storing and accessing personal data.

A new use of technology such as an app.

A data sharing initiative where two or more organisations (even if they are part of the same group company) seek to pool or link sets of personal data.

A proposal to identify people in a particular group or demographic and initiate a course of action.

Processing large quantities of sensitive personal data.

Using existing data for a new and unexpected or more intrusive purpose.

A new surveillance system (especially one which monitors members of the public) or the application of new technology to an existing system (for example, adding automatic number plate recognition capabilities to existing CCTV).

A new database which consolidates information held by separate parts of an organisation.

Legislation, policy or strategies which will impact on privacy through the collection or use of information, or through surveillance or other monitoring.

How is a DPIA carried out?

There are seven main steps that comprise a DPIA:

Identify the need for a DPIA

This mainly involves answering ‘screening questions’ at an early stage in a project’s development to identify the potential impacts on individuals’ privacy. Project management should then begin to think about how to address these issues, in consultation with stakeholders.
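As a loose illustration, the screening stage can be thought of as a short yes/no questionnaire. The questions below only paraphrase the kind of screening questions the ICO publishes, and the code is a hypothetical sketch rather than any official tool:

```python
# Hypothetical sketch of DPIA screening: any "yes" answer suggests a DPIA
# is needed. Question wording is illustrative, not the ICO's exact text.

SCREENING_QUESTIONS = [
    "Will the project involve the collection of new information about individuals?",
    "Will information about individuals be disclosed to people who have not "
    "previously had routine access to it?",
    "Is information about individuals to be used for a purpose it is not "
    "currently used for, or in a new way?",
]

def dpia_needed(answers):
    """Return True if any screening question is answered 'yes' (True)."""
    return any(answers.get(question, False) for question in SCREENING_QUESTIONS)

# Example: one question answered 'yes' is enough to trigger a DPIA.
answers = {SCREENING_QUESTIONS[2]: True}
print(dpia_needed(answers))  # True
```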

Describe the information flows

Explain how information will be obtained, used and retained. This part of the process can identify the potential for – and help to avoid – ‘function creep’: when data ends up being processed or used unintentionally, or unforeseeably.

Identify the privacy and related risks

Compile a record of the risks to individuals, in terms of possible intrusions of data privacy, as well as corporate risks to the organisation, such as regulatory action, reputational damage and loss of public trust. This involves a compliance check against the Data Protection Act and the GDPR.

Identify and evaluate the privacy solutions

With the record of risks ready, devise a number of solutions to eliminate or minimise these risks, and evaluate the costs and benefits of each approach. Consider the overall impact of each privacy solution.

Sign off and record the DPIA outcomes

Obtain appropriate sign-offs and acknowledgements throughout the organisation. A report based on the findings and conclusions of the prior steps of the DPIA should be published and accessible for consultation throughout the project.

Integrate the outcomes into the project plan

Ensure that the DPIA’s outcomes are integrated into the overall project plan. The DPIA should be treated as an integral component throughout the development and execution of the project.

Consult with internal and external stakeholders as needed throughout the process

This is not a ‘step’ as such, but an ongoing commitment to stakeholders: to be transparent about the process of carrying out the DPIA, and to be open to consultation and to the expertise and knowledge of the organisation’s various stakeholders – from colleagues to customers. The ICO explains, “data protection risks are more likely to remain unmitigated on projects which have not involved discussions with the people building a system or carrying out procedures.”
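The risk work at the heart of steps 3 to 5 can be sketched as a simple risk register. Everything below – the scoring scales, the sign-off threshold and the example risks – is invented for illustration and is not an ICO methodology:

```python
# Illustrative sketch only: a DPIA risk register in which each risk is
# scored by likelihood x impact, and high scores are flagged for sign-off.
# Scales (1-5) and threshold (15) are invented assumptions.

from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (minimal) .. 5 (severe)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

def needs_sign_off(risk: Risk, threshold: int = 15) -> bool:
    """Flag risks whose score meets the threshold for senior sign-off."""
    return risk.score >= threshold

register = [
    Risk("Pooled customer data visible across group companies", 4, 4),
    Risk("Retention period exceeds the stated purpose", 2, 3),
]
for risk in register:
    print(risk.description, risk.score, needs_sign_off(risk))
```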

DPIAs – what are the benefits?

There are clear benefits to DPIAs for the organisations that conduct them:

cost benefits from adopting a Privacy by Design approach: knowing the risks before starting work allows issues to be fixed early, reducing development costs and delays to the schedule

risk mitigation in relation to fines and loss of sales caused by lack of customer and/or shareholder confidence

reputational benefits and trust building from being seen to consider and embed privacy issues into a programme’s design from the outset

For more information about DPIAs and how Data Compliant can help, please email dc@datacompliant.co.uk.

The Information Commissioner’s Office (ICO) has updated its ‘Code of Practice on Subject Access Requests’ chiefly in response to several Court of Appeal decisions made earlier this year related to SARs. Under the Data Protection Act 1998, individuals (‘data subjects’) may request access to their personal information held by a ‘data controller.’

These requests for information are called SARs, and can range from a request for specific or limited information to a request for the entirety of the information held, including why it is held and to whom it may have been disclosed. The scope of a data controller’s obligations will therefore vary from case to case, and will be particularly burdensome for large organisations. Currently, data controllers may charge a fee of up to £10 for processing a SAR, and must provide the requester with the relevant information within 40 calendar days. When the GDPR comes into force next year, data controllers will normally not be entitled to charge a fee, irrespective of the inconvenience, and will be expected to provide the information within a shorter timeframe of one calendar month.
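The deadline arithmetic above can be sketched with a small helper. This is a simplified illustration, not legal advice: the GDPR deadline here uses naive month arithmetic (the Regulation’s rule is one calendar month, often quoted as 30 days), and month-end edge cases would need more care in practice:

```python
# Hypothetical deadline helper: 40 calendar days under the DPA 1998 versus
# one calendar month under the GDPR. Simplified for illustration.

from datetime import date, timedelta

def dpa_deadline(received: date) -> date:
    """DPA 1998: respond within 40 calendar days of receipt."""
    return received + timedelta(days=40)

def gdpr_deadline(received: date) -> date:
    """GDPR: respond within one calendar month (naive month arithmetic;
    fails for days late in the month landing on a shorter month)."""
    month = received.month % 12 + 1
    year = received.year + (1 if received.month == 12 else 0)
    return received.replace(year=year, month=month)

received = date(2017, 7, 1)
print(dpa_deadline(received))   # 2017-08-10
print(gdpr_deadline(received))  # 2017-08-01
```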

However, the ICO has revised its guidance on dealing with SARs to prepare controllers for data compliance in light of the Court of Appeal’s judgements on a string of cases in which SARs were made alongside ongoing or threatened litigation – cases which, in the opinion of numerous legal commentators, highlight the potential for widespread abuse of SARs to pursue grievances outside the purview of data protection law.

The three key changes to the ICO’s Code

Scope for assessing ‘disproportionate effort’

The DPA includes an exemption from having to respond to SARs if this would involve ‘disproportionate effort’ for the data controller. Whereas the Code previously indicated that refusing to provide information on the grounds of difficulty is unacceptable, it now, more leniently, states: “there is scope for assessing whether, in the circumstances of a particular case, supplying a copy of the requested information in permanent form would result in so much work or expense as to outweigh the requester’s right of access to their personal data.” The ICO expects controllers to weigh the benefits to the data subject of the SAR against the difficulties of complying with the request, and to assess whether the scope of the request is reasonable.

Dialogue between controller and requester

The ICO now advises controllers to enter into dialogue with data subjects following a SAR. This may allow the requester to specify which information they require, thereby refining the request and making the process more manageable and less likely to result in disproportionate effort. The Code explains that the ICO will take into account both the controller’s and the subject’s willingness to participate in this dialogue if it receives a complaint about the handling of a SAR.

Information management systems and redaction of third-party data

The ICO now expects controllers to have information management systems in which personal information, including archived or backed-up data, can be located quickly in anticipation of a SAR. Moreover, the information management system should allow for the redaction of third-party data. This is important, since certain SARs may be declined if the information requested would in some way result in the disclosure of personal information about another living person.
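As a rough illustration of third-party redaction, the sketch below blanks out known names other than the requester’s before a record is released. The names and the matching strategy are invented; a production system would need far more robust identification of third-party data than exact string matching:

```python
# Illustrative sketch only: redact known third-party names from a record
# before releasing it in response to a SAR. All names are invented.

import re

def redact_third_parties(text, requester, known_names):
    """Replace every known name other than the requester's with [REDACTED]."""
    for name in known_names:
        if name != requester:
            text = re.sub(re.escape(name), "[REDACTED]", text)
    return text

record = "Alice Smith complained about Bob Jones on 3 May."
print(redact_third_parties(record, "Alice Smith", ["Alice Smith", "Bob Jones"]))
# Alice Smith complained about [REDACTED] on 3 May.
```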

Corrupted Ukrainian accountancy software ‘MEDoc’ is suspected to be the medium of a cyberattack on companies ranging from the British ad agency WPP to a Cadbury’s factory in Tasmania, with many European and American firms reporting disruption to services. Banks in Ukraine, the Russian oil giant Rosneft, the shipping giant Maersk, a Rotterdam port operator, the Dutch global parcel service TNT and the US law firm DLA Piper were among those left unable to process orders or facing general computer shutdowns.

Described by Microsoft as “a recent dangerous trend,” this attack comes just six weeks after the WannaCry attack that primarily affected NHS hospitals. Both attacks appear to make use of a Windows vulnerability called ‘EternalBlue,’ thought to have been discovered by the NSA and leaked online – although the NSA has not confirmed this. The NSA’s possible use of this vulnerability, which has served to create a model for cyberattacks by political and criminal hackers, has been described by security experts as “a nightmare scenario.”

A BBC report suggests that, given 80% of all instances of this malware were in Ukraine, and that the email address provided for the ‘ransom’ was closed down quickly, the attack could be politically motivated and aimed at Ukraine or those who do business in Ukraine. Recent announcements suggest it could be about data rather than money.

The malware appears to have been channelled through MEDoc’s automatic update system, according to security experts including Marcus Hutchins, the malware expert credited with ending the WannaCry attack. The update process would originally have begun legitimately, but at some point the update system released the malware into numerous companies’ computer systems.

In a blog post published at the end of last week, Google confirmed that, by the end of the year, it will stop scanning Gmail users’ emails to accrue data for personalised adverts. This will bring the consumer version of Gmail in line with the business edition.

Google had advertised its Gmail service by offering 1GB of ‘free’ webmail storage. However, it transpired that Google was funding this offer by running these scans.

This recent change in tactic has been met with a ‘qualified’ welcome by privacy campaigners. Dr Gus Hosein, executive director of Privacy International, the British charity that has been campaigning for regulators to intervene since it discovered the scans, stated:

When they first came up with the dangerous idea of monetising the content of our communications, Privacy International warned Google against setting the precedent of breaking the confidentiality of messages for the sake of additional income. […] Of course they can now take this decision after they have consolidated their position in the marketplace as the aggregator of nearly all the data on internet usage, aside from the other giant, Facebook.

Google faced a fairly substantial backlash when these scans were discovered, notably from Microsoft, whose series of critical ‘Gmail man’ adverts depicted a man searching through people’s messages.

However, digital rights watchdog Big Brother Watch celebrated Google’s move, describing it as “absolutely a step in the right direction, let’s hope it encourages others to follow suit.”

UK Conservative Party under investigation for breaching data protection and election law

A Channel 4 News undercover investigation has provoked ‘serious allegations’ of data protection and election offences against the Conservative Party.

The investigation uncovered the party’s use of a market research firm based in Neath, South Wales, to make thousands of cold calls to voters in marginal seats ahead of the election this month. Call centre staff followed a ‘market research’ script, but under scrutiny this script appears to canvass for specific local Conservative candidates – in a severe breach of election law.

Despite the Information Commissioner Elizabeth Denham’s written warnings to all major parties before the election began, reminding them of data protection law and the illegality of such calls, the Conservatives operated a fake market research company. This constitutes a breach separate from election law, and obliges the Information Commissioner’s Office to investigate.

The ICO’s statement on 23rd June reads,

The investigation has uncovered what appear to be underhand and potentially unlawful practices at the centre, in calls made on behalf of the Conservative Party. These allegations include:

Misleading calls claiming to be from an ‘independent market research company’ which does not apparently exist

MyHome Installations Ltd fined £50,000 for nuisance calls

Facing somewhat less public scrutiny and condemnation than the Conservative Party, Maidstone domestic security firm MyHome Installations has been issued a £50,000 fine by the ICO for making nuisance calls.

The people who received these calls had explicitly opted out of telephone marketing by registering their numbers with the Telephone Preference Service (TPS), the “UK’s official opt-out of telephone marketing.”

The ICO received 169 complaints from members of the public who’d received unwanted calls about electrical surveys and home security from MyHome Installations Ltd.

The ICO, the independent authority responsible for investigating breaches of data protection law, has fined Morrisons, the fourth largest supermarket chain in the UK, £10,500 for sending 130,671 unsolicited marketing emails to its customers.

These customers had explicitly opted out of receiving marketing emails related to their Morrisons ‘More’ loyalty card when they signed up to the scheme. In October and November 2016, Morrisons used the email addresses associated with these loyalty cards to promote various deals. This contravenes laws on the misuse of personal information, which stipulate that individuals must give consent to receive personal ‘direct’ marketing via email.

‘Service emails’ versus ‘Marketing emails’

While the emails’ subject heading was ‘Your Account Details,’ the customers were told that by changing the marketing preferences on their loyalty card account they could receive money-off coupons, extra More Points and the company’s latest news.

The subject heading might suggest to the recipient that they are ‘service emails,’ which are defined under the Data Protection Act 1998 (DPA) as any email an organisation has a legal obligation to send, or an email without which an individual would be disadvantaged (for instance, a reminder for a booked train departure). But there is a fine line between a service email and a marketing email: if an email contains any brand promotion or advertising content whatsoever, it is deemed the latter under the DPA. Emails that ask for clarification on marketing preferences are still marketing emails and a misuse of personal contact data.

Morrisons explained to the ICO that the recipients of these emails had opted in to marketing related to online groceries but opted out of marketing related to their loyalty cards, so the emails had ostensibly been sent to clarify marketing preferences while also carrying promotional content. However, Morrisons could not provide evidence that these customers had consented to receiving this type of email, and the company was duly fined – although in cases such as this, it is often the loss from reputational damage that businesses fear more.

Fines and reputational damage

This comes just three months after the ICO confirmed fines – for almost identical breaches of PECR – of £13,000 and £70,000 for Honda and Exeter-based airline Flybe respectively. Whereas Honda could not prove that 289,790 customers had given consent to direct e-marketing, Flybe disregarded 3.3 million addressees’ explicit wishes to not receive marketing emails.

Even a fine of £70,000 – which can currently be subject to a 20% early payment discount – for sending existing customers emails with some roundabout promotional content will seem charitable when the General Data Protection Regulation (GDPR) supersedes the PECR and DPA regime in 2018. Under the new regulation, misuse of data, including illegal marketing, risks a fine of up to €20 million or 4% of annual global turnover, whichever is higher.
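The arithmetic behind those figures works out as follows; the functions are just a worked illustration of the numbers quoted in the text:

```python
# Worked illustration of the figures in the text: the GDPR's higher tier of
# fines is the greater of 20 million euros or 4% of annual global turnover,
# while the current regime allows a 20% early payment discount on ICO fines.

def max_gdpr_fine(annual_turnover_eur):
    """Upper limit of a higher-tier GDPR fine, in euros."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

def early_payment(fine, discount=0.20):
    """Fine after the current 20% early payment discount."""
    return fine * (1 - discount)

print(max_gdpr_fine(300_000_000))    # 20000000 (4% is only 12m, so the 20m floor applies)
print(max_gdpr_fine(2_000_000_000))  # 80000000.0
print(early_payment(70_000))         # 56000.0 (the Flybe fine, paid early)
```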

The ICO has acknowledged Honda’s belief that its emails were a means of helping the firm remain compliant with data protection law, and the authority “recognises that companies will be reviewing how they obtain customer consent for marketing to comply with stronger data protection legislation coming into force in May 2018.”

These three cases are forewarnings of an imminent rise in the stakes for marketing that fails to comply with data protection law. The GDPR, an EU regulation with which British businesses must comply irrespective of Brexit, not only massively increases the monetary penalty for non-compliance, but also demands greater accountability to individuals with regard to the use and storage of their personal data.

The regulator’s recent actions show that companies will not be able to cut legal corners under the assumption of ambiguity between general service emails and implicitly promotional emails. And with the GDPR coming into force next year, adherence to data protection regulations is something marketing departments will need to find the time and resources to prepare for.

As part of a series of measures aimed at “making our country safer and more united,” a new Data Protection Bill has been announced in the Queen’s Speech.

The Bill, which follows up proposals in the Conservative manifesto ahead of the election in June, is designed to make the UK’s data protection framework “suitable for our new digital age, allowing citizens to better control their data.”

The intentions behind the Bill are to:

Give people more rights over the use and storage of their personal information. Social media platforms will be required to delete data gathered about people prior to them turning 18. The ‘right to be forgotten’ is enshrined in the Bill’s requirement of organisations to delete an individual’s data on request or when there are “no longer legitimate grounds for retaining it.”

Implement the EU’s General Data Protection Regulation, and the new Directive which applies to law enforcement data processing. This meets the UK’s obligations to international law enforcement during its time as an EU member state and provides the UK with a system to share data internationally after Brexit is finalised.

Update the powers and sanctions available to the Information Commissioner.

Strengthen the UK’s competitive position in technological innovation and digital markets by providing a safe framework for data sharing and a robust personal data protection regime.

Ensure that police and judicial authorities can continue to exchange information “with international partners in the fight against terrorism and other serious crimes.”

Ultimately, the Bill seeks to modernise the UK’s data protection regime and to secure British citizens’ ability to control the processing and application of their personal information. The Queen’s Speech expressed the Government’s concern not only with law enforcement but also with the digital economy: over 70% of all trade in services is enabled by data flows, making data protection critical to international trade, and in 2015 the digital sector contributed £118 billion to the economy and employed over 1.4 million people across the UK.

Insider threat is being spoken about more and more, due to its unfortunate rise in frequency. It’s in the news far more than it ever used to be.

Do you remember the Morrisons auditor who released a spreadsheet detailing the (very) personal details of just shy of 100,000 members of staff? He ended up being jailed for eight years, but I heard a saying recently: it’s not a digital footprint you leave, it’s more of a digital tattoo. Even two years after the incident, Morrisons is still suffering the effects.

Now, obviously, that was what you would call a malicious breach. It does unfortunately happen, but there are ways to protect your company against it. Firstly, we at Data Compliant believe that if you have detailed joiner processes in place (i.e. thorough screening, references and criminal checks where appropriate), ongoing appraisals with staff and good leaver processes, you can minimise your risk.

Other ways insider breaches occur – and much more likely, in my opinion – are negligence, carelessness and genuine accidents. Did you know that over 50% of data breaches are caused by staff error? This may be because staff do not follow company procedures correctly and open up pathways for hackers. Or your staff may be tricked into handing over information that they shouldn’t.

Your staff could be your company’s weakest point when it comes to protecting its personal and confidential data. But you can take simple steps to minimise this risk by training your staff in data protection.

Online training has some big advantages for businesses: it’s a quick, efficient and relatively inexpensive way of training large numbers of employees while “taking them out of the business” for the least possible time.

The risk of a breach isn’t just your business’s reputation, or even a hefty fine from the ICO, but, as mentioned before, potentially a criminal conviction. Now that is a lot to risk.

At last, agreement has been reached on the EU – US Privacy Shield, which now replaces the Safe Harbor agreement. Safe Harbor was ruled invalid in 2015 by the EU Court of Justice, which held that the voluntary scheme did not provide sufficient safeguards for personal data.

The new agreement is intended to protect the privacy of EU citizens when their personal information is processed in the US.

Companies will be able to sign up to the EU – US Privacy Shield from August 1st once they have implemented any necessary changes to comply with the strict compliance obligations.

The EU – US Privacy Shield is based on a system of self-certification by which US organisations commit to a set of privacy principles entitled the EU – US Privacy Shield Framework Principles.

The new framework was unveiled in February and has been under review since then. Back in June, the European Data Protection Supervisor, Giovanni Buttarelli, advised that it ‘needed significant improvements’ because it was not ‘robust enough,’ and that the Commission should negotiate improvements to the Privacy Shield in three main areas:

limiting exemptions to its provisions;

improving its redress and oversight mechanisms;

integrating all the main EU data protection principles.

For the Privacy Shield to be an effective improvement on Safe Harbor, it must provide adequate protection against indiscriminate surveillance, as well as transparency obligations and data protection rights for people in the EU.

In Brussels on July 12th, Věra Jourová, Commissioner for Justice, Consumers and Gender Equality, said: “The EU – US Privacy Shield is a robust new system to protect the personal data of Europeans and ensure legal certainty for businesses. It brings stronger data protection standards that are better enforced, safeguards on government access, and easier redress for individuals in case of complaints.”

In summary the EU-US Privacy Shield is based on the following principles:

Strong obligations on companies handling data, and robust enforcement

Clear safeguards and transparency obligations on US government access

Effective protection of individual rights

Annual joint review mechanism

Easier and cheaper redress possibilities in case of complaints, directly or with the help of the local Data Protection Authority

The Privacy Shield agreement applies to both data controllers and processors (agents), and specifies that processors must be contractually bound to act only on instructions from the EU controller and assist the latter in responding to individuals exercising their rights under the Principles.

Whilst the UK remains a member of the EU (which it will be for at least the next two years), UK-based companies that process data in the US will be able to use the Privacy Shield where appropriate.