The IPKat

Passionate about IP! Since June 2003 the IPKat has covered copyright, patent, trade mark, designs, info-tech, privacy and confidentiality issues from a mainly UK and European perspective. Read, post comments and participate!

On the evening of the first day of the ChIPs Global Summit, the AmeriKat surrounded herself with some more inspiring women, in the form of rising IP star Guinevere Jobson (Fenwick & West), her mom and the AmeriKat's mom. Over cheese and wine, two generations of women discussed how life in the workplace and expectations have changed for women and how balance can (if ever) be struck. The issue of balance was also at the heart of the following morning's panel session on the tension between privacy and security. That tension came to the forefront with the FBI's ex parte order served on Apple after the San Bernardino terrorist attack (see background here), requesting that Apple create new software that would allow the FBI to unlock an iPhone 5c belonging to one of the terrorists. Law enforcement agencies are concerned with protecting national security, whereas companies are concerned with losing the confidence of their customers in protecting their privacy.

The debate highlighted the detailed and subtle nature of the tension between numerous stakeholders and corporate and law enforcement objectives. Although the panelists dove into the detail of the legislative and policy debate over 90 minutes, the AmeriKat has summarized the top eight themes emerging from the session for readers as follows:

1. The privacy v security fallacy. National security and personal privacy are not at odds with each other. It is a false choice - it is really security v security. As Noreen highlighted, 17.5 million people were victims of direct hacks which violated their personal privacy, but importantly they were also victims of crime. Privacy is a security issue and security is a privacy issue. People who put products into the marketplace want to stop crime at the outset, and law enforcement wants to stop it once it happens. The argument is that you cannot undermine encryption to protect customers' security and privacy in the interests of law enforcement, because in doing so you will inevitably create back doors which let "the bad guys" in, resulting in crime (and, therefore, threats to security).

2. Security and privacy have taken center stage. The FBI's order against Apple was ex parte. There was no opportunity for Apple to be heard. The manner in which the FBI obtained the order (after months of working with Apple, and not under seal, so that it was public) therefore generated controversy and ignited fights between various factions on the public stage. Noreen explained that Apple was and had been cooperating with law enforcement for months until the February ex parte order. The FBI's order asked Apple to write a new operating system (reportedly known internally at Apple as GovtOS) so that the FBI could gain access to the iPhone 5c at issue, which was running iOS 9. If Apple complied with the order, it would risk the security of other Apple customers. The panel recommended this TIME article interview with Tim Cook to the audience for background reading. The panel appreciated FBI Director Comey's efforts to keep the issue at the forefront of public discussion, but some panelists stated that his comment that an "adult conversation" was needed in the wake of the controversy was probably an unfortunate choice of words, as the insinuation was that if you did not agree with the FBI, you were not an adult. Now that the "fervor" has died down, many panelists felt it was time to reignite the conversation.

Is the privacy v security debate really a fallacy?

3. Protecting customers' security. Customers demand secure devices and services. The counter is that if we make devices secure, we lose the ability to obtain information. What really is at stake, commented Erika, was the ability to obtain information. There may be a perception from law enforcement that companies are trying to create an impenetrable device - but that is impossible. These are human-developed systems. All that companies like Apple are trying to do is stay ahead of the "bad guys" who want to gain access to customers' data. Humans will always be able to get into devices (indeed, the AmeriKat notes, the FBI found a way into the phone with the assistance of a third party - see Washington Post article here). Erin explained that "for the trust of the people who use our products, we have to strengthen our security. If we are not [permitted to provide] end-to-end encryption, there are numerous people outside of the jurisdictional reach of the US who can, and people will use those services." In those circumstances, when law enforcement wants information it is at a disadvantage and will not be able to obtain metadata. US companies and law enforcement will then both be at a competitive disadvantage.

4. Is the judiciary getting this right? Some panelists noted that the judiciary seems to be getting the balance between privacy and security right in the few instances where it has had to rule. Just because technology has expanded into every sector of our lives, it does not mean that the role of the federal government in our lives is automatically expanded. The Supreme Court has held firm that there is a strong boundary between privacy and security, upholding the strength of the Fourth Amendment with limitations (Riley v California). The courts also recognize how much information is actually held on devices: 90% of people who have phones have some sort of digital information about themselves contained therein. In the absence of a plethora of court cases, it was noted that a more robust debate on these issues was needed.

5. Mixed messages on a global scale and the shadow of data localization. The panel noted that there is real tension between law enforcement and data protection authorities. When working with data protection authorities around the world, some authorities do not want you to retain data and want you to get rid of it. However, law enforcement in these same countries wants you to keep it. Some countries demand strong encryption on devices. Other countries require easier access to data for law enforcement. There is a perpetual conflict that companies have to navigate. Noreen and Erin stated that they, like many companies, are engaged in conversations about this conflict, but data protection authorities need to be speaking directly with law enforcement and security agencies so that companies do not find themselves caught in the middle. The panel explained that there are countries around the world investigating crimes that want access to data held by US companies about activities being undertaken in their country. The current method to obtain this information is via the Mutual Legal Assistance Treaty (MLAT), but this is a long and cumbersome process which is not always fit for purpose in a digital age. Stakeholders are looking to fix the MLAT process so it is not so cumbersome. However, in response to this problem, some countries are enacting laws to localize data storage; there are approximately 40 data localization laws in the works around the world. Data localization can be a concern, as some regimes are not requiring localized storage for legitimate aims. Noreen commented that she understands why countries ask for data localization in order to enhance the tools available to local law enforcement. From a company perspective, however, the concerns are customers' privacy, security and experience. Data localization can also increase costs and may impact quality of service.

6. Concern for European start-ups in light of the GDPR. The panel also noted that the US and UK are working together to share data - it is proposed that US companies holding data can share that data with UK authorities and vice versa. The proposal before Congress involving the US and UK was said by a member of the panel to "really lift restrictions from companies for [the sharing of] different types of data." The proposal involves collaborative measures between law enforcement and service providers (see summary of the legislative proposal here), and it was commented that the sharing would be similar to the information-sharing provisions under the Cybersecurity Information Sharing Act, which came into effect last year. Companies on the panel also explained that they are spending a lot of time on the new General Data Protection Regulation (GDPR) in Europe. They are in the process of combing through the GDPR's provisions. Some on the panel felt that there was a "paternalistic view" of the authorities knowing what is best for the public, as against the notion of the "self" and people being in control of their own data. However, as companies "we have responsibilities to make sure people understand what is going on and how they can exercise control" with respect to their data, explained Erin, noting in particular Facebook's privacy health check-up. Customers are at the center of this process. Facebook is constantly innovating to find ways to ensure people understand their rights and privacy options. There was concern expressed in the panel about how start-ups could ever get off the ground in Europe in light of the GDPR, as even sophisticated companies need a team of 15 lawyers to identify their obligations under the legislation.

7. The consequence of cheaper and easier data storage. With easier and cheaper data storage (see in particular cloud storage), companies are able to identify customer trends over a long period of time. Companies use this data to predict their customers' needs (see your customer experience on Amazon). The amount of data stored on customers - be it on Facebook or Amazon - is reaching 20 years' worth. Think about what has happened in your life in 20 years: graduating college, getting married, having kids. The data about you is enough for a family history. This creates a luxurious data pool upon which to conduct sales and trend analysis to recommend products to you (see, for example, the story of Target predicting that a teenage girl was pregnant before her father knew - here and here). As long as everyone knows that companies are using that information for this reason, this is fine. However, how do you minimize the risk to the individual and the company in the long-term storage of data, while maximizing the utility of the information? We also have incredibly useful data that could potentially be used to identify health trends and risk areas for the greater social good. How do we use this data as part of a larger link analysis without unique identifiers? There were no immediate answers to these complex questions. However, there are some vocal groups that oppose the long-term storage of data so that it cannot be used for such trend analysis. Further, the more data stored, the more data is at risk from cybersecurity attacks and of being turned over to law enforcement. Companies have to weigh this customer experience benefit against the security and privacy risk.

How long is your digital snail trail, and where does it lead?

8. Future collaborative work. It was agreed that this is the golden age of surveillance. Everyone is leaving digital data everywhere. We need to fund agencies to ensure that, despite this digital snail trail (the AmeriKat's words, not the more eloquent panel's), the public is kept safe. The panel suggested more funding for national security agencies to retain and train the best and brightest to help find a resolution of these issues from a technical and legal perspective. It was also suggested that we move beyond the encryption conversation, as neither party - companies nor law enforcement - wants to be disadvantaged (as noted above). There also needs to be an increase in transparency. The panel noted that with the impending election, there will be new faces in government. However, it was hoped that the benefits and fruits of the collaboration between the private and government sectors in this area will continue.
