We are only just beginning to define this emerging field, and we want you to be a part of that process. Today, roughly 40 percent of IAPP members are lawyers who may want to join the Privacy Bar Section. We hope a large number of the remaining majority will find a home in the Privacy Engineering Section. If your work in any way involves developing, assessing, or using methodologies, tools, and techniques that seek to engineer privacy into systems, the IAPP Privacy Engineering Section is for you.

President Obama has appointed Professor Annie Antón to be one of twelve members of the Commission on Enhancing National Cybersecurity. The announcement describes the Commission’s task as follows:

The Commission is tasked with making detailed recommendations on actions that can be taken over the next decade to enhance cybersecurity awareness and protections throughout the private sector and at all levels of Government, to protect privacy, to ensure public safety and economic and national security, and to empower Americans to take better control of their digital security.

She will be serving with an extremely distinguished non-partisan group of experts:

General Keith Alexander, USA (Ret) – Chairman and CEO of IronNet, and former director of the National Security Agency.

Patrick Gallagher – Chancellor and CEO of the University of Pittsburgh.

Peter Lee – corporate vice president of Microsoft Research.

Herbert Lin – Senior Research Scholar for Cyber Policy and Security at the Center for International Security and Cooperation and a Research Fellow at the Hoover Institution, both at Stanford University.

Heather Murren – private investor and member of the Board of Trustees of the Johns Hopkins University and the Johns Hopkins University Applied Physics Laboratory.

Joe Sullivan – chief security officer at Uber.

Maggie Wilderotter – Chief Executive Officer of Frontier Communications from 2004 to 2015, and then Executive Chairman of the company until April 1, 2016.

Posted by Aaron Massey in Uncategorized | Comments Off on Prof. Antón appointed to the President’s Commission on Enhancing National Cybersecurity

Professors Antón and Swire have an op-ed in the Atlanta Journal Constitution about the increasing importance of protecting healthcare data. It’s difficult to summarize an issue as complex as protecting privacy in healthcare information technologies, but this op-ed does it well.

Apple's iOS 8 devices and the latest Google Android phones claim to establish landmark privacy protections by enabling encryption by default. According to Apple and Google, they will be unable to “open” the phone for anyone, not even law enforcement. These new measures have been sharply criticized by the Director of the FBI and the Attorney General. As a software engineering professor, I’ve devoted my career to teaching students how to develop (a) secure, (b) privacy-preserving, and (c) legally compliant software systems. I’m not qualified to debate whether this move by Apple and Google is lawful or constitutional. However, as a technologist I can assert that applying security best practices will yield a system that can withstand intrusions and denial-of-service attacks, limit access to authenticated and authorized users, and so on.

The recent “encryption by default” design decision by Apple and Google is currently being discussed in software engineering and security classes across our nation, and perhaps across the globe. By and large, privacy and security researchers, technologists, and activists applaud this decision because it raises the bar for truly implementing security best practices. It’s a bitter pill for professors who teach students to develop secure, privacy-preserving, and legally compliant software to have those students told on the job, “Oh, that stuff you learned about security back in school? We only want you to secure the system part way, not all the way. So, leave in a back door.” Such a position undermines the academic institutions seeking to prepare tomorrow’s security and privacy workforce in an ever-changing world where sophisticated criminals are getting smarter and their offensive techniques threaten to outpace our defenses.

From my experience working with government agencies, I thoroughly understand the desire to “catch the bad guys” and value the ability to prevent malicious criminal activity by individuals or nation states. I want our government, the Department of Homeland Security, the Department of Defense, and the Intelligence Community to protect us from the unfathomable. I find myself wondering why the very institutions that promote security and privacy best practices (via, for example, centers of excellence at our nation’s top universities) are so vehemently opposed to industry actually implementing those practices. My analysis yields two observations:

Taking the Easy Way Out. For law enforcement to expect companies to provide the government with back door access (even when required by law) seems to me to be the lazy approach. If one reads between the lines, one could infer that the government lacks the incentives and/or the will to innovate and improve the state of the art in cyber offense. Where’s the spirit of the scientists and engineers who enabled man to walk on the moon? Where’s the American will to innovate, to surpass the state of the art, and be the best? Why let other nations beat us at our own game? The only way we can get better at offense is by facing the best possible defense. At a time when other nation states are growing so sophisticated, relying on an easy back door rather than honing our own skills means we risk never developing our own capabilities. If we aren’t staying ahead of the curve, other countries and their intelligence services will have every reason to build capabilities that surpass those of our own agencies.

Creating a Backdoor for Use in Other Countries. If the United States expects companies to provide a back door to gain access to systems and the data that resides in those systems, then other governments will expect the same. We can hardly expect Apple or Google to provide a back door to the U.S., but not to China or Russia. At least in the United States, we have a legal framework that requires search warrants and similar safeguards before the back door may be used. But many other countries lack these legal safeguards and would require the phone companies to enable snooping into systems within those countries with no legal protections comparable to the U.S. system. As security engineers have learned in many other systems, you can’t build a vulnerability that is used only by the good guys and not by others.
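The point that a back door is just a vulnerability with a friendlier name can be made concrete with a toy sketch. The snippet below is illustrative only: the XOR “cipher,” the `escrow_db` dictionary, and the user names are all invented for the example, and the keystream construction is not a real cipher. What it shows is the structural problem: once a copy of every user's key exists anywhere, whoever obtains that copy decrypts everything, regardless of whether they were the intended “good guys.”

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. Illustration only, NOT real crypto.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    nonce = secrets.token_bytes(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

# "Encryption by default": only the user's device holds the key.
user_key = secrets.token_bytes(32)
nonce, ct = encrypt(user_key, b"private message")

# A key-escrow "back door": the vendor retains a copy of every user's key.
escrow_db = {"user-42": user_key}

# Whoever obtains the escrow database -- a foreign government, an insider,
# or a thief -- can decrypt everything, with no warrant required.
stolen_key = escrow_db["user-42"]
assert decrypt(stolen_key, nonce, ct) == b"private message"
```

The mathematics cannot distinguish a warrant-bearing investigator from an attacker holding the same escrowed key; that is the engineering objection in miniature.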

I certainly empathize with law enforcement’s desire to gain evidence for critical investigations. But Congress and the White House have agreed that cybersecurity should be funded as a national priority. As professors of computer security, we can’t teach the importance of building secure systems and then explain to our students that we will leave tens of millions of devices insecure.

Dr. Annie I. Antón is a Professor in and Chair of the School of Interactive Computing at the Georgia Institute of Technology in Atlanta. She has served the national defense and intelligence communities in a number of roles since being selected for the IDA/DARPA Defense Science Study Group in 2005-2006.

The National Science Foundation recently awarded researchers from The Privacy Place a grant to work on Regulatory Compliance Software Engineering with UCON_LEGAL! You can read the abstract below. More details are available at research.gov.

Abstract: Software engineers need improved tools and methods for translating complex legal regulations into workable information technology systems. Compliance with legal requirements is an essential element in trustworthy systems. The research proposed herein will advance the cutting edge for creating more accurate, efficient, and reliable Regulatory Compliance Software Engineering (RCSE), resulting in compliant software systems. System specifications typically concentrate on system-level entities, whereas legal discussions emphasize fundamental rights and obligations discursively. This work bridges three cultures of scholarship and research: software specification, law, and access control. By empowering software developers and policy makers to better understand regulatory texts and the access controls specified within these texts, current and future software systems will be better aligned with the law.

There are three main expected results of this work: (1) a framework, methodology, and heuristics to identify UCON_LEGAL components in legal texts; (2) extended TLA (Temporal Logic of Actions) rules from UCON_ABC and a mapping of predicates, actions, states, variables, and obligations between UCON_LEGAL and UCON_ABC; (3) validated and extended role-based access controls that meet healthcare and financial legal requirements through further development of UCON_LEGAL. The impacts of this work are expected to be far reaching; law and regulations govern the collection, use, transfer, and removal of information from software systems in many sectors of society, and this research explicitly calls for models and theories for analyzing and reasoning about security and privacy in a regulatory and legal context.
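To give a flavor of the kind of model the abstract describes, here is a minimal sketch of a UCON_ABC-style usage decision, in which access is granted only when authorizations, obligations, and conditions all hold. Everything in it is hypothetical: the `Subject`/`Obj` classes, the `consent_on_file` attribute, and the HIPAA-flavored predicates are invented for illustration and are not drawn from UCON_LEGAL or the funded project itself.

```python
from dataclasses import dataclass, field

@dataclass
class Subject:
    role: str
    attributes: dict = field(default_factory=dict)

@dataclass
class Obj:
    kind: str
    attributes: dict = field(default_factory=dict)

def authorized(s: Subject, o: Obj, right: str) -> bool:
    # Authorization (A): e.g., only physicians may read health records.
    return right == "read" and s.role == "physician" and o.kind == "health_record"

def obligations_met(s: Subject, o: Obj) -> bool:
    # oBligation (B) drawn from a legal text: patient consent must be on file.
    return bool(o.attributes.get("consent_on_file", False))

def conditions_hold(env: dict) -> bool:
    # Condition (C) on the environment: access only from the clinical network.
    return env.get("network") == "clinical"

def usage_decision(s: Subject, o: Obj, right: str, env: dict) -> bool:
    # A UCON_ABC decision conjoins all three families of predicates.
    return authorized(s, o, right) and obligations_met(s, o) and conditions_hold(env)

doctor = Subject(role="physician")
record = Obj(kind="health_record", attributes={"consent_on_file": True})
assert usage_decision(doctor, record, "read", {"network": "clinical"})
assert not usage_decision(doctor, record, "read", {"network": "public"})
```

The research challenge the grant targets is precisely the gap this toy elides: deriving predicates like `obligations_met` systematically from discursive regulatory text rather than hand-coding them.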

Posted by Aaron Massey in Research | Comments Off on NSF Grant on Regulatory Compliance Software Engineering

Last month, I testified before the House Ways and Means Social Security Subcommittee hearing on the Social Security Administration’s role in verifying employment eligibility. My testimony focused on the E-Verify pilot system and the operational challenges the system faces. According to the U.S. Citizenship and Immigration Services website, E-Verify “is an Internet-based system that allows businesses to determine the eligibility of their employees to work in the United States.” The goal of E-Verify – to ensure that only authorized employees can be employed in the U.S. – is laudable. However, the E-Verify pilot system still needs major improvements before it should be promoted to a permanent, larger-scale system.

Yesterday afternoon, Dr. Antón testified before the Subcommittee on Social Security of the U.S. House of Representatives Committee on Ways and Means on behalf of the USACM about E-Verify. Here’s part of the official ACM press release on the testimony:

WASHINGTON – April 14, 2011 – At a Congressional hearing today on the Social Security Administration’s role in verifying employment eligibility, Ana I. Antón testified on behalf of the U.S. Public Policy Council of the Association for Computing Machinery (USACM) that the automated pilot system for verifying employment eligibility faces high-stakes challenges to its ability to manage identity and authentication. She said the system, known as E-Verify, which is under review for its use as the single most important factor in determining whether a person can be gainfully employed in the U.S., does not adequately assure the accuracy of identifying and authenticating individuals and employers authorized to use it. Dr. Antón, an advisor to the Department of Homeland Security’s Data Privacy and Integrity Advisory Committee and vice-chair of USACM, also proposed policies that provide alternative approaches to managing identity security, accuracy and scalability.