Effective January 1, 2020, the California Consumer Privacy Act (CCPA) will impose burdensome GDPR-like transparency and individual rights requirements on almost every company that handles “personal information” regarding California residents, regardless of where the business is based. The Act will impact information regarding not only consumers, but also employees and business contacts.

Join us for a webinar on June 4, 2019, at 3:00 p.m. EST, when Elliot Golding and Ivan Rothman will provide an overview of the CCPA and discuss the act’s:

Scope and applicability (e.g., what companies, data and processes will be impacted)

Key requirements (e.g., privacy statements and individual rights)

Contextual comparisons to existing US law and GDPR

Suggested steps to build a CCPA compliance program efficiently and effectively

Attendees will have the opportunity to ask questions during the program, with a full Q&A session to follow.

If you would like to attend, or have colleagues who would, please register any interested parties.

No More Games! The CNIL Publishes its 2018 and 2019 Activity Report (May 15, 2019)
The CNIL blows the whistle for the end of the transition period. For the first time, the CNIL’s 2019 investigation program is not specific to an industry and potentially impacts controllers and processors throughout all sectors of business. Going forward, the CNIL will also be more thorough and less lenient.

2019 Program

Investigation program

The CNIL’s yearly investigation programs account for approximately one quarter of its investigations. This year’s program will focus on three major areas:

The complaints it receives (whether collective or individual). Complaints concerning the exercise of data subject rights represented about 73.8% of all complaints received in 2018.

The sharing of responsibilities between processors and subcontractors, which is a cross-sector topic.

The data of children (including what data is collected, e.g., photos, biometric data and CCTV in schools, as well as parental consent for children under 15).

As in previous years, the CNIL will also:

Investigate complaints and the reports sent to the CNIL.

Follow up on past procedures.

Gather information from various news sources.

Finally, the CNIL will continue the cooperation initiated in 2018 with other national supervisory authorities, including joint investigation operations.

Sanction policy

The CNIL treated 2018 as a transition period, allowing data controllers to understand and progressively assimilate the requirements of the GDPR.

From 2019 onwards, the CNIL will investigate compliance more thoroughly (including impact assessments, data portability, and the maintenance of registers of processing activities and data breaches) and, where it finds gaps, draw all the appropriate consequences. It will nevertheless continue to assess the most appropriate sanction on a case-by-case basis, depending on the gravity of the breaches, the good faith of the organization and its cooperation. In January 2019, the CNIL issued a €50 million fine to a major tech company for alleged GDPR violations.

2018 Report

Complaints and data breach notifications

In 2018, the CNIL registered 1,170 data breach notifications. It received a record 11,077 complaints, a 32.5% increase compared to 2017. About 20% of these complaints fell under the GDPR cooperation procedure with other supervisory authorities.

These complaints primarily related to the publication of data on the internet (including 373 requests for delisting). Individuals requested in large numbers that their data (names, contact details, comments, photographs, videos, accounts, etc.) be deleted from the internet. These kinds of complaints reveal how difficult it can be for individuals to manage their digital lives and, in particular, their online reputations.

Investigations

In 2018, the CNIL carried out over 300 investigations, consisting of on-site and online inspections, document requests and hearings.

The following is a breakdown of the sources that triggered these investigations.

Formal notice to remedy

In most cases, the notices issued by the CNIL resulted in the organizations remedying the identified compliance gap(s). Formal notices to remedy are not considered sanctions per se, as they are issued before an actual “fair trial” procedure. Forty-nine formal notices to remedy were adopted in 2018, of which 13 were made public. Two sectors in particular were targeted.

Insurance (5), for the use of insurance data for marketing purposes without a legal basis

Have You Paid Your Data Protection Fee?

The ICO has issued penalty notices to over 100 organisations for failing to pay their data protection fee. Failing to pay the fee due to an innocent mistake may not be accepted as a viable excuse, as demonstrated by the recent judgment in Farrow & Ball Limited v The Information Commissioner (Dismissed) [2019] UKFTT 2018_0269 (GRC).

Under the Data Protection (Charges and Information) Regulations 2018, UK organisations are required to pay the ICO an annual data protection fee unless they are exempt. The fee payable depends on the tier of the organisation, and ranges from £40 to £2,900.
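For illustration only, the tiered fee structure described above can be sketched as a simple lookup. The £40 (tier 1) and £2,900 (tier 3) figures come from this post; the £60 tier 2 amount, and the function itself, are assumptions for the sketch rather than anything published by the ICO in this form.

```python
# Hypothetical sketch of the ICO's three-tier annual data protection fee.
# Tier 1 (£40) and tier 3 (£2,900) figures appear in the post above; the
# tier 2 amount (£60) is an assumption based on the ICO's published schedule.
ICO_FEE_BY_TIER = {1: 40, 2: 60, 3: 2900}

def fee_for_tier(tier: int) -> int:
    """Return the annual data protection fee (in GBP) for a given fee tier."""
    try:
        return ICO_FEE_BY_TIER[tier]
    except KeyError:
        raise ValueError(f"unknown fee tier: {tier}") from None

print(fee_for_tier(3))  # the tier 3 fee Farrow & Ball failed to pay
```

In practice, of course, an organisation’s tier depends on its size, turnover and whether it is a public authority, so the tier itself must be determined before any lookup like this applies.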

In December 2018, we published an update explaining that the ICO had sent over nine hundred letters of intent to organisations that had not paid their fee, and that it intended to issue penalty notices and fines where necessary.

The ICO issued 103 such notices in 2018, including a £4,000 penalty notice against paint and wallpaper company Farrow & Ball Limited for its failure to pay the £2,900 data protection fee required of a tier 3 organisation.

Farrow & Ball appealed the penalty notice to the First-tier Tribunal (Information Rights) in March 2019 on the basis that its failure to pay was an innocent mistake, because:

the ICO’s reminder was sent when the relevant individual was on holiday;

the reminder was not identified as important internally;

Farrow & Ball contacted the ICO promptly once the failure was identified and paid the fee immediately; and

Farrow & Ball has put procedures in place to ensure it will not happen again.

However, in its first ruling on an appeal against such a notice, the Tribunal held that Farrow & Ball did not have a reasonable excuse for non-compliance: a “reasonable controller” would have systems in place to comply with the 2018 Regulations, and no particular difficulty or misfortune explained Farrow & Ball’s departure from that standard. There was no evidence of financial hardship or any other reason for the ICO’s discretion to be exercised differently. The penalty notice was therefore upheld.

The outcome of this appeal serves as a reminder of the importance of complying with the 2018 Regulations. Subject to any further appeal by Farrow & Ball, it is likely that a similar approach will be taken by the Tribunal in any appeals brought in future.

The ICO has also recently published a blog post explaining why businesses that process personal data need to pay the data protection fee. Paul Arnold, ICO Deputy Chief Executive, suggests that paying the fee is not only a legal requirement but also makes good business sense and affects an organisation’s reputation: it tells your customers that you care about and protect their information, and offers reassurance to other organisations doing business with you.

Organisations can pay the data protection fee on the ICO’s website, which takes about 15 minutes to complete. If you are not sure whether you need to pay a fee to the ICO, you can use the ICO’s self-assessment tool to find out.

The Un-healthiness of the Australian Health Sector’s Data Security (May 3, 2019)
More than twelve months after the commencement of the Australian Notifiable Data Breach Scheme,[1] statistics published by the Office of the Australian Information Commissioner (OAIC) have begun to reveal trends among the 812 notifiable data breaches recorded in Australia between 22 February and 31 December 2018. One key trend is the clear susceptibility of the health care industry, which suffered one fifth of all data breaches recorded in Australia throughout 2018, the highest share of any industry.

There is a cruel sense of irony that the services we turn to when we are vulnerable are themselves vulnerable, suffering data breaches that may harm us financially, psychologically or, in extreme circumstances, physically. The figures are stark, with 163 notifiable data breaches suffered by health sector businesses subject to the federal Privacy Act 1988 (Cth), a figure that does not include the country’s major hospitals, which operate under State jurisdiction. On top of these figures, the Australian Digital Health Agency, the agency responsible for administering the controversial ‘My Health Record’ system,[2] reported that a further 42 data breaches affected Australian My Health Records throughout 2018, which are also excluded from the statistics recorded in the OAIC’s reports.

For businesses in the health sector, and those advising on cyber security, the question inevitably arising from these figures is: why? Are these statistics merely the result of statistical variation over a limited period, or are there industry-specific factors that contribute to the prevalence of data breaches? This question cannot be answered definitively, but there are statistical anomalies within the health sector’s data breach figures that provide further insight. In the period between 1 April 2018 and 31 December 2018, there were 83 notifiable data breaches in the health sector caused by human error, comprising 56% of the total breaches in the sector throughout that period.[3] This figure is alarmingly high. In contrast, the proportion of data breaches caused by human error across all other industries is a mere 30%.[4]

The OAIC’s quarterly statistics reports delve into further detail on the context of these breaches, assigning each human error data breach to a general category describing the circumstances of its occurrence, and identify the most common circumstances in which such breaches arise.

There are various hypotheses regarding why these data breaches occur more frequently in the health sector than in other industries. Some propose that the industry comprises a smaller proportion of ‘digital natives’ than other industries, owing to its generally older workforce. Other potential explanations are that embedded virtues of trust and compassion in the health industry may leave employees more susceptible to fraud or less aware of risks. Additionally, high-pressure working conditions may also play a part. Regardless of the potential reasons behind these trends, the health sector must improve its internal data security standards or risk continuing to suffer data breaches at a rate greater than any other industry.

Promisingly, the statistics and trends discussed above indicate that there is scope for improvement via relatively simple avenues. The human errors that cause the majority of data breaches usually involve a simple lack of attention to detail, such as confirming correct address recipients and ensuring security of physical files. Businesses can go a significant way towards addressing the industry’s shortcomings through greater awareness and personnel training.

To avoid becoming another statistic, healthcare providers must be cognisant of the unique risks associated with their industry and take simple steps to reduce the risk of a data breach.

[1] For further information regarding the operation of the Notifiable Data Breach Scheme in Australia generally please refer to our earlier client alert.

[2] Established under the My Health Records Act 2012 (Cth), the My Health Record system is an online system that compiles participants’ health records over time and allows approved health service providers to access those records when treating patients, providing greater patient flexibility in the health industry.

[3] Please note that industry-by-industry figures are unavailable for the first quarter of 2018.

[4] Being 182 human error data breaches out of a total of 601 in all other industries, including finance, professional services and education.

On Wednesday, April 24, 2019, the new Czech data protection legislation was published in the Collection of Laws and became effective. The Czech Republic thereby remedied its legislative deficiency, having been one of the last EU Member States without data protection adaptation legislation. (An overview of the current state of GDPR implementation in the Member States can be found here.)

The Czech national implementation consists of two acts, reflecting both the General Data Protection Regulation (GDPR) and the data protection Directive (EU) 2016/680 for police and criminal justice authorities. The first is the Act on Personal Data Processing (PDPA),[1] replacing the previous data protection legislation from 2000. The second is an accompanying act[2] amending 39 other related acts, including the Criminal Procedure Code, the Anti-Money Laundering Act and the Freedom of Information Act.

Besides specifying or extending some provisions already established by the GDPR, the Czech legislation makes only limited use of the derogation options the GDPR leaves to national discretion. Below is an overview of the most significant provisions:

Exemption from the obligation to assess compatibility of further processing with the initial purpose of data collection [Art. 6 (4) GDPR] – The exemption from the compatibility assessment applies where further data processing is necessary for securing a ‘protected interest’, such as national defense and security interests, prevention and detection of criminal offenses, regulated ethical rules, protection of the rights and freedoms of persons or enforcement of private claims. In addition, where a ‘protected interest’ is at stake, certain data subject rights and the data breach notification obligation may be limited.

Age limit for the child’s consent in relation to information society services [Art. 8 GDPR] – Minors aged 15 or over may consent to the processing of their personal data for the purpose of providing information society services (e.g., social networks, online games or email services) without the consent of a legal guardian.

Possibility to fulfil the information obligation in a manner allowing remote access [Art. 13 and 14 GDPR] – Where data processing is based on a legal obligation or carried out in the public interest, the controller may inform data subjects by merely publishing the information in a manner allowing remote access (without informing each data subject separately).

Notification of personal data changes by register adjustment [Art. 19 GDPR] – Another simplification allows the controller to communicate any rectification or erasure of personal data, or restriction of processing, to recipients by updating the data in a register, provided the updated register is regularly made available to the recipients.

Limitation of the obligation to carry out a data protection impact assessment (DPIA) [Art. 35 GDPR] – Where the data processing is mandated directly by law, the controller is not required to carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. In addition, the Czech data protection supervisory authority (Úřad pro ochranu osobních údajů (ÚOOÚ); the Office for Personal Data Protection) has published a list of processing operations subject to the DPIA requirement.[3]

Processing for scientific or historical research or statistical purposes [Art. 89 GDPR] and processing for journalistic purposes or the purposes of academic, artistic or literary expression [Art. 85 GDPR] – The PDPA provides some exemptions and derogations for processing for these purposes, such as limitations on the information obligation and on the right to object.

Exemption from imposing administrative fines and penalties on public authorities and bodies [Art. 83 – 84 GDPR] – One of the most controversial provisions of the new act stipulates that the data protection authority shall refrain from imposing administrative fines on public authorities and bodies. Unlike other data controllers, public authorities and bodies will not be fined for infringements of the GDPR or the Czech data protection legislation, despite the fact that some of them process massive amounts of personal data, including sensitive data, and have repeatedly breached data protection rules in the past.

Further, the PDPA again designates the Office for Personal Data Protection (ÚOOÚ) as the supervisory authority for data protection matters, newly determines its structure and entrusts it with various powers, including the right to conduct inspections and on-site audits, impose fines, etc.

The newly adopted acts bring an end to the uncertainty in the area of data protection in the Czech Republic caused by the lack of implementing legislation. With the first data breach cases coming, we will see how the data protection authority handles its re-established role and whether it continues its current approach, guided by the principle of remediation rather than immediate repression: instead of initiating sanction proceedings (in the case of minor controllers and/or isolated, minor offenses), it has informed the controller and called on it to remedy the defective processing.

[2] Act No. 111/2019 Coll., amending certain acts in connection with the adoption of the Act on Personal Data Processing; available in Czech at https://aplikace.mvcr.cz/sbirka-zakonu/ViewFile.aspx?type=c&id=38632

The European Data Protection Board (EDPB) has published draft guidelines on the “processing of personal data under the contractual legal basis in the context of the provision of online services to data subjects”. These guidelines are currently open to consultation.

Scope of the Guidelines: Agreements for Online Services

The guidelines relate to a specific category of agreements, namely those under which data subjects are provided “online services” or access to platforms that do not require a direct payment from users but are instead financed by targeted advertising.

Choosing the Relevant Legal Basis

In relation to such services, the most obvious legal bases are consent, legitimate interests and contract, the last of which is the subject matter of the guidelines.

Article 6(1)(b) GDPR provides a lawful basis for the processing of personal data to the extent that “processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract”.

This legal basis has the advantage of not giving rise to a data subject’s right to withdraw consent or to object to the processing. It does, however, trigger the right to portability.

The purpose of the guidelines is to set out the boundaries of the legal basis with a view to fighting against any temptation to make the contract a “catch-all” legal basis for a very extensive set of processing activities.

The Necessity Test

For the EDPB, the essential question that a controller has to address is: “Is the processing of data genuinely and objectively necessary for the performance of the contract/or in order to take pre-contractual steps at the request of a data subject?”

According to the EDPB:

“What is ‘necessary for the performance of a contract’ is not simply an assessment of what is permitted by or written into the terms of a contract. The concept of necessity has an independent meaning in European Union law, which must reflect the objectives of data protection law.”

“If there are realistic, less intrusive alternatives, the processing is not ‘necessary’. Article 6(1)(b) will not cover processing which is useful but not objectively necessary for performing the contractual service or for taking relevant pre-contractual steps at the request of the data subject, even if it is necessary for the controller’s other business purposes.”

Necessity goes beyond a mere contractual condition. There is a “distinction between processing activities necessary for the performance of a contract, and terms making the service conditional on certain processing activities that are not in fact necessary for the performance of the contract. ‘Necessary for performance’ clearly requires something more than a contractual condition.”

“Unsolicited marketing or other processing carried out solely on the initiative of the data controller or at the request of a third party” do not amount to “pre-contractual steps at the request of the data subject.”

“A controller may wish to bundle several separate services or elements of a service with different fundamental purposes, features or rationale into one contract. This may create a ‘take it or leave it’ situation for data subjects who may only be interested in one of the services”. “Where the contract consists of several separate services or elements of a service that can in fact reasonably be performed independently of one another, the question arises to which extent Article 6(1)(b) can serve as a legal basis”.

After the end of the contract, the processing will “no longer be necessary for the performance of that contract and thus the controller will need to stop processing”. In principle data should not be used even with another legal basis as “it is generally unfair to swap to a new legal basis when the original basis ceases to exist.” There are, however, permitted exceptions, such as compliance with law, or exercise or defence of legal claims.

The EDPB’s Guidance Questions and Examples

An assessment should be made before the start of the processing activity, based on the following questions:

“What is the nature of the service being provided to the data subject? What are its distinguishing characteristics?

What is the exact rationale of the contract (i.e. its substance and fundamental object)?

What are the essential elements of the contract?

What are the mutual perspectives and expectations of the parties to the contract? How is the service promoted or advertised to the data subject? Would an ordinary user of the service reasonably expect that, considering the nature of the service, the envisaged processing will take place in order to perform the contract to which they are a party?”

Article 6(1)(b) is unlikely to be a justifiable legal basis for the following processing activities: service improvement, fraud prevention, and online behavioural advertising.

“The EDPB acknowledges that personalisation of content may (but does not always) constitute an essential or expected element of certain online services, and therefore may be regarded as necessary for the performance of the contract with the service user in some cases.” This will depend on the nature of the service provided, the expectations of the average data subject and whether the service can be provided without personalisation.

What Next?

It is important that stakeholders review these guidelines carefully and, where necessary, submit their views or arguments in the consultation process by 24 May 2019. These guidelines notably seem to object to digital agreements under which services are exchanged for personal data. Moreover, even though restricted to online agreements, the guidelines may also apply more generally to many other situations where Article 6(1)(b) is used as a legal basis in the offline world. We can assist you in making any such submission.

Join Us – Webinar: Understanding and Preparing for the California Consumer Privacy Act (April 17, 2019)
Effective January 1, 2020, the California Consumer Privacy Act (CCPA) will impose burdensome GDPR-like transparency and individual rights requirements on almost every company that handles “personal information” regarding California residents, regardless of where the business is based. The Act will impact information regarding not only consumers, but also employees and business contacts.

Attendees will have the opportunity to ask questions during the program, with a full Q&A session to follow.

If you would like to attend, or have colleagues who would, please register any interested parties.

Could a Federal Data Privacy Law be a Reality in 2019? (March 29, 2019)
From the continual evolution of the California Consumer Privacy Act (CCPA) to the potential ramifications of a “no-deal” Brexit on data transfers, 2019 may be a defining point in data privacy and cybersecurity. Nowhere is this increased attention more pronounced than in the growing support for US federal data privacy legislation.

With growing public concern regarding the protection of personal data, primarily related to recent massive data breaches and the emergence of increasingly sophisticated technologies, but also spurred on by regulators in other jurisdictions, most notably the European Union, it is quite possible the United States Congress will pursue federal data privacy legislation in the 116th Congress. Proponents argue the current patchwork of US data privacy laws, with federal laws covering specific sectors and each state having its own data privacy laws, is confusing and yields inconsistent application. For example, under the same set of circumstances pertaining to a data breach, some states’ laws may trigger breach notification obligations whereas other states’ laws would not require notification. Accordingly, any company contending with a nationwide data breach event must review, confirm and attend to compliance obligations for all 50 states, with potentially many different compliance obligations. Proponents believe a federal policy designating a single standard would simplify compliance and lessen confusion. As evidence that a uniform data protection law can support needed certainty, supporters point to the European Union’s General Data Protection Regulation (GDPR).

Recognizing public attention and concern over data privacy, the House and Senate Commerce and Judiciary Committees have held hearings on topics related to consumer data privacy and requested input from industry experts. In September and October of 2018, Senator John Thune (R-SD), the then-chairman of the Senate Committee on Commerce, Science, and Transportation, held hearings on safeguards related to consumer data privacy.

In November 2018, Senator Ron Wyden (D-OR) released a draft Consumer Data Protection Act, which was drafted to expand the Federal Trade Commission’s (FTC) regulatory and enforcement powers to, among other things, establish minimum national data privacy and cybersecurity standards. The draft would also create a system allowing consumers to stop third parties from tracking their online activity and sharing their data. Notably, Sen. Wyden’s legislation included provisions providing for criminal penalties for senior executives whose companies run afoul of the law.

Shortly thereafter, Senator Brian Schatz (D-HI) released the draft Data Care Act, which would require websites, applications and other online providers to establish practices to reasonably secure individual identifying data and promptly inform users of data breaches involving sensitive information. Under the proposed Act, these online providers would be subject to duties of “care, loyalty, and confidentiality” in the handling of personal data. The proposed Act would also grant the FTC enhanced powers, including rule-making power, to implement the Act; a violation of the Act would be treated as a violation of an FTC rule, carrying fine authority.

Recently, on January 16, 2019, Senator Marco Rubio (R-FL) announced a new privacy bill, the American Data Dissemination Act (ADDA). In contrast to the Consumer Data Protection Act and the Data Care Act, ADDA does not expand FTC authority to create and implement laws. Instead, ADDA would require Congress to pass applicable laws presented by the FTC, with the FTC ultimately gaining rule-making power if Congress is unable to pass a law within two years of ADDA going into effect. Most notably, the Act would supersede state privacy regulations, which could result in a greater emphasis on privacy rights at the federal level.

Not only are members of Congress increasingly interested in monitoring and proposing federal data privacy legislation, members of the private sector are interested as well. For example, on January 14, the Information Technology & Innovation Foundation (ITIF), a technology think tank supported by Amazon and Google, put forward a “grand bargain” proposal for federal data privacy legislation. The plan supports a single breach standard and would preempt state laws. Further evidencing private sector support, Intel has also circulated a draft data privacy bill.

On February 27, 2019, the Senate Commerce Committee held a hearing, titled “Policy Principles for a Federal Data Privacy Framework,” to examine what Congress should do to address risks to consumers and implement data protections for all Americans. The hearing included robust debates surrounding transparency pertaining to the use of consumers’ personal information and the ability of consumers to control how companies use their personal information. We discussed this hearing in an earlier blog post.

With the House controlled by Democrats and the Senate under Republican control, it is difficult to determine what shape federal data privacy legislation will take in the 116th Congress, and indeed whether any progress will be made. Nonetheless, it appears this will be a very active area in the coming months, and with our sophisticated Data Privacy and Cybersecurity Practice and leading Public Policy practice, Squire Patton Boggs is situated to monitor and advise on this developing and all-important area of legislation.

Can Police Require Individuals to Unlock Their Smartphones?
Wed, 27 Mar 2019

Recently, Chase Goldstein and Thomas Zeno contributed to our Anticorruption Blog. Their article reviews whether police can force individuals to unlock their smartphones. To unlock or not to unlock? Different rules apply depending on where you are located, as states, including Massachusetts, have issued conflicting rulings. There is also an international dimension, illustrated by a recent decision from Israel. In short, travelers must beware.

European Commission Announces Provisional Agreement on Whistleblower Directive
Thu, 21 Mar 2019

In a press release published on March 12, 2019, the European Parliament and the member states announced a provisional agreement on new rules that will guarantee a high level of protection for whistleblowers who report breaches of EU law. The draft establishes a three-tier reporting system (which potentially allows the whistleblower to disclose the information publicly, including through the media) and robust measures against potential retaliation.

Persons Concerned

The European Commission defines a whistleblower as “someone reporting or disclosing information on violations of EU law which they observe in their work-related activities.” This covers not only employees but also self-employed people, freelancers, consultants, contractors, suppliers, volunteers, unpaid trainees and job applicants. The definition is less restrictive than the one provided in Article 6 of the French “Sapin 2” Law, which requires that the reported violation be “clear and serious.”

To avoid penalising people who act in good faith, whistleblowers also qualify for protection if they had reasonable grounds to believe that the information reported was true at the time of reporting, or if they had serious suspicions that they had observed an illegal activity.

Areas Covered

The directive was proposed by the European Commission in April 2018 and contains important provisions in favor of whistleblowers. The draft grants protection in numerous areas of EU regulation:

Public procurement

Financial services

Money laundering and terrorist financing

Product safety

Transport safety

Environmental protection

Nuclear safety

Food and feed safety

Animal health and welfare

Public health

Consumer protection

Privacy

Data protection and security of network and information systems

It also applies to breaches of EU competition rules, violations and abuse of corporate tax rules and damage to the EU’s financial interests. Moreover, member states are free to extend these rules to other areas.

Establishment of a Three-tier Reporting System

Obligation for Public and Private Sector to Establish Internal Channels and Procedures

All companies with more than 50 employees or with an annual turnover of over €10 million will have to set up an internal procedure to handle whistleblowers’ reports. All states, regional administrations and municipalities with more than 10,000 inhabitants will also be covered by the new law. Smaller companies are exempt from this obligation, except for companies operating in the field of financial services or vulnerable to money laundering or terrorist financing. These requirements largely mirror those provided in Article 8 of the French Sapin 2 Law.

These organisations must also designate a person or department responsible for receiving and following up on reports, and provide clear and accessible information about those procedures and the conditions under which reports can be made externally to national or EU competent authorities. After a whistleblower submits a report, the designated person or department must follow up within three months and provide feedback to the reporting person on that follow-up. In France, the Sapin 2 Law does not impose a fixed deadline before the whistleblower may refer the matter to the competent national authorities, requiring only a “reasonable period.”

Obligation for Competent National Authorities to Establish External Reporting Channels

The whistleblower can report information to a national authority if internal channels do not work or could not reasonably be expected to work (for example, where the use of internal channels could jeopardise the effectiveness of investigative actions by the authorities responsible).

Member states must identify the authorities that will be charged with receiving and following up on reports of breaches under the new law. These authorities will be obliged to follow up on the reports received and, within three months (extendable to six months in complex cases), give feedback to the reporting persons about the follow-up. In France, Article 8 of the Sapin 2 Law entrusts the “Rights Defender” with supporting whistleblowers.

Public Reporting

In cases where internal and/or external channels do not function or could not reasonably be expected to function properly, whistleblowers may make a public disclosure, including to the media. This applies, for instance, when it is reasonable to suspect collusion between the perpetrator of the crime and the state authorities responsible for prosecuting them, or in cases of urgent or grave danger to the public interest or a risk of irreversible damage. This will protect whistleblowers when they act as sources for investigative journalism.

Prevention of Retaliation and Effective Protection

All forms of retaliation are forbidden and should be sanctioned. If whistleblowers suffer retaliation (harassment, demotion, dismissal, etc.), they should have access to free advice and adequate remedies. The burden of proof will be reversed in such cases, so that the person or organization taking the adverse action must prove that it is not retaliating against the whistleblower. Whistleblowers will also be protected in judicial proceedings, in particular through an exemption from liability for disclosing the information.

Past and Future of the Directive

After the Assange and Snowden revelations in 2010 and 2013, European stakeholders understood the importance of protecting whistleblowers who endanger themselves by publicly exposing hidden abuses by public authorities or private companies. This led to a 2014 Council of Europe Recommendation on the protection of whistleblowers. That document proposed guidelines, which the European Commission later developed, on April 23, 2018, into a draft Directive.

That draft Directive will now have to be formally approved by both the European Parliament and the Council.