Secure protocols for accountable warrant execution

Last week the press reported that the White House will seek to redesign the NSA’s mass phone call data program, so that data will be held by the phone companies and accessed by the NSA, subject to a new warrant requirement. The Foreign Intelligence Surveillance Court will issue the warrants.

Today Josh Kroll and I, with colleagues at Stanford University, released a draft paper on how to use cryptography to implement warrants for access to data in a secure, private, and accountable way.
Our solution is a set of multi-party cryptographic protocols involving three primary parties: a data source who has data records, an investigator who wants access to data held by the data source, and a court (or other authorizer) who issues an order or warrant to authorize access to a record. For example, a phone company might be the data source, the NSA might be the investigator, and the Foreign Intelligence Surveillance Court might be the court that issues an order. Alternatively, an email provider might be the data source, an FBI agent might be the investigator, and a senior FBI official might act as the “court” that issues a National Security Letter. Although we use words like “court”, “order”, and “investigator”, the protocol has wider application to situations where Party A is authorizing Party B to access data held by Party C, with legally defined requirements for access.

The protocol uses cryptography to guarantee several security, privacy, and accountability properties:

When the court issues an order, it publishes a sealed version of the order. If challenged later, the court can unseal the order and reveal which record it covered.

Until the order is unsealed, only the court and the investigator can see which record the order covers. If and when the order is unsealed, everyone can see which record it covered.

The investigator does not learn the contents of any record, unless there is a valid order for that record and the court has published a valid sealed version of that order.
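The sealed-order property above can be illustrated with a standard cryptographic hash commitment. This is a simplified stand-in for the commitments used in the paper, not the paper's actual construction; the function names and the example order text are invented for illustration.

```python
import hashlib
import secrets

def seal(order: bytes) -> tuple[bytes, bytes]:
    """Seal an order: publish a hash commitment, keep the nonce secret.

    The published value reveals nothing about the order, but binds the
    court to it: the court cannot later claim the order said something else.
    """
    nonce = secrets.token_bytes(32)
    sealed = hashlib.sha256(nonce + order).digest()
    return sealed, nonce

def unseal(sealed: bytes, order: bytes, nonce: bytes) -> bool:
    """Unseal: anyone can check the revealed order against the commitment."""
    return hashlib.sha256(nonce + order).digest() == sealed

# The court seals an order covering one record and publishes `sealed`.
order = b"order: produce call records for line 555-0100"
sealed, nonce = seal(order)

# Later, if challenged, the court reveals (order, nonce) and anyone verifies:
assert unseal(sealed, order, nonce)

# The court cannot pass off a different order as the one it sealed:
assert not unseal(sealed, b"order: produce records for someone else", nonce)
```

The binding property of the hash is what makes the later challenge meaningful: the published commitment pins down which record the order covered, even while it remains sealed.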

A counterintuitive aspect of our protocols is that an order can be executed, thereby giving the investigator access to the record covered by the order, without the data source necessarily learning (at the time) which record the investigator accessed.
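How an investigator can retrieve one record without the data source learning which one is the kind of problem solved by private information retrieval. The toy two-server XOR scheme below illustrates the idea only; it is not the protocol in the paper, and the database contents and function names are invented for illustration. Each server sees only a random-looking selection vector, yet XORing the two answers recovers the chosen record.

```python
import secrets

def pir_query(index: int, n: int) -> tuple[list[int], list[int]]:
    """Split the selector for `index` into two random-looking bit vectors.

    Each share alone is uniformly random; they differ only at `index`.
    """
    share_a = [secrets.randbelow(2) for _ in range(n)]
    share_b = [share_a[i] ^ (1 if i == index else 0) for i in range(n)]
    return share_a, share_b

def pir_answer(db: list[bytes], share: list[int]) -> bytes:
    """Each server XORs together the fixed-length records its share selects."""
    result = bytes(len(db[0]))
    for record, bit in zip(db, share):
        if bit:
            result = bytes(x ^ y for x, y in zip(result, record))
    return result

# Toy database of equal-length records, replicated at two non-colluding servers.
db = [b"rec-alice", b"rec-bobby", b"rec-carol"]

a, b = pir_query(1, len(db))  # investigator wants record 1
answer = bytes(x ^ y for x, y in zip(pir_answer(db, a), pir_answer(db, b)))
assert answer == b"rec-bobby"  # recovered, though neither server saw the index
```

Because the shared records cancel in the XOR, only the record where the two selection vectors differ survives; neither server, seeing only its own random-looking vector, can tell which record that is.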

These properties can be viewed as a set of checks on the power of the parties, preventing any dishonest party from getting access to information without leaving a suitable trail. Even when the trail itself is supposed to be secret, the protocol provides accountability: the court can issue an unjustified order, but it must commit to that order, so the violation will be uncovered if the court's actions are challenged later.

Our paper gives more precise definitions of the desired properties, how the protocols work, and why the protocols achieve the desired properties. We build on the work of previous researchers, as cited in our paper, and we present several versions of the protocol, with different security properties.

Our approach is feasible, even for very large data sets. Our paper describes our work on implementing one of our more advanced protocols, and we show by experiment that the protocol is reasonably fast even for data sets of national scope. We have released the code we used to do these performance measurements.

We are releasing this paper now because there are important debates going on about how to organize lawful access to data by intelligence agencies. We want to make the point that technology allows these processes to be both more secure and more accountable.

We urge policymakers to consider how cryptography can make warrant regimes more secure for all parties, and more accountable. Expert agencies within government, such as NIST, might provide input on these issues, in consultation with experts inside and outside of government.

Systems like these could be useful for making state agencies obey the courts, but that alone is not enough to protect journalism and democracy from massive general surveillance. Once the state labels a whistleblower (i.e., a journalist's source) a criminal, it has grounds to get a search warrant to look through the database, so it does not matter where the database is stored.

To protect whistleblowers and democracy, we must make anonymous communication possible.

Freedom to Tinker is hosted by Princeton's Center for Information Technology Policy, a research center that studies digital technologies in public life. Here you'll find comment and analysis from the digital frontier, written by the Center's faculty, students, and friends.