<p><em>Specter: A few thoughts on crypto, systems, and security. <a href="http://www.mit.edu/~specter/">http://www.mit.edu/~specter/</a></em></p>
<h1 id="on-deniability-and-duress">On Deniability and Duress</h1>
<p>Imagine you’re at a border crossing, and the guard asks you to hand over all of your electronics for screening. The guard then asks you to unlock your device and provide passwords and decryption keys. Right now, he’s asking nicely, but he happens to be carrying an unpleasant-looking rubber hose,<label for="sn-id-1" class="margin-toggle sidenote-number"></label><input type="checkbox" id="sn-id-1" class="margin-toggle" /><span class="sidenote">Yes, cryptographers actually do call this <a href="https://en.wikipedia.org/wiki/Rubber-hose_cryptanalysis">“rubber hose cryptanalysis.”</a> </span> and appears to be willing to use it. Now imagine you’re a journalist covering war crimes in the country you’re trying to leave. So, what can you do?<label for="sn-id-xkcd" class="margin-toggle sidenote-number"></label><input type="checkbox" id="sn-id-xkcd" class="margin-toggle" /><span class="sidenote"><a href="https://xkcd.com/538/">Obligatory XKCD</a> </span></p>
<p>This isn’t a hypothetical situation. The Freedom of the Press Foundation <a href="https://freedom.press/news/over-150-filmmakers-and-photojournalists-call-major-camera-manufacturers-build-encryption-their-cameras/">published an open letter</a> to camera manufacturers requesting that they provide “encryption” by default. The thing is, what they want isn’t just encryption, it’s <em>deniability</em>, which is a subtly different thing.</p>
<p>Deniable<label for="sn-id-2" class="margin-toggle sidenote-number"></label><input type="checkbox" id="sn-id-2" class="margin-toggle" /><span class="sidenote">I consider deniability in the tradition of <a href="http://link.springer.com/chapter/10.1007/BFb0052229">Canetti et al</a>. It’s important to note that deniability refers to the ability to deny some plaintext, not the ability to <em>deny that you’re using a deniable algorithm</em>. </span> schemes let you lie about whether you’ve provided full access to some or all of the encrypted data. This matters because, currently, you can’t give the guard in the example above a fake password. He’ll try it, get locked out, and then proceed with the flogging.</p>
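<p>To make the idea concrete, here is a toy sketch of a deniable container, loosely modeled on hidden-volume designs such as TrueCrypt/VeraCrypt. One blob holds two slots of identical shape; each password authenticates and decrypts only its own slot, and a wrong password reveals nothing about either. Every name and construction here is an illustrative assumption for this post, <em>not</em> real cryptography. Do not use it to protect anything.</p>

```python
# Toy deniable container: two passwords, one blob, two plaintexts.
# ILLUSTRATION ONLY: the XOR keystream here is a stand-in for a real cipher.
import hashlib
import hmac
import os

SLOT = 64  # fixed plaintext capacity per slot, so slot sizes leak nothing

def _key(password: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def _keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Derive a keystream from SHA-256 blocks and XOR it with the data.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def _seal(password: str, salt: bytes, plaintext: bytes) -> bytes:
    padded = plaintext.ljust(SLOT, b"\x00")
    nonce = os.urandom(16)
    key = _key(password, salt)
    ct = _keystream_xor(key, nonce, padded)
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + tag + ct  # 16 + 32 + SLOT bytes per slot

def make_container(decoy_pw, decoy_data, real_pw, real_data) -> bytes:
    salt = os.urandom(16)
    return salt + _seal(decoy_pw, salt, decoy_data) + _seal(real_pw, salt, real_data)

def open_container(blob: bytes, password: str):
    salt, rest = blob[:16], blob[16:]
    slot_len = 16 + 32 + SLOT
    for i in range(2):  # try each slot; only the matching password verifies
        s = rest[i * slot_len:(i + 1) * slot_len]
        nonce, tag, ct = s[:16], s[16:48], s[48:]
        key = _key(password, salt)
        if hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
            return _keystream_xor(key, nonce, ct).rstrip(b"\x00")
    return None  # wrong password: indistinguishable from either slot
```

<p>Under duress, you hand over the decoy password and the guard sees plausible but innocuous data; nothing in the blob proves a second slot exists. (Real systems also have to hide the second slot from forensic inspection of free space, which is where the hard engineering lives.)</p>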
<p>I’m convinced that there’s a sociotechnical blind spot in how current technology handles access to personal devices. We, in the infosec community, need to start focusing more on allowing users the flexibility to handle situations of duress rather than just access control. Deniability and duress codes can go a long way in helping us get there.</p>
<h2 id="some-us-legal-context">Some U.S. Legal Context</h2>
<p>Recent legal developments have highlighted the need for deniability, and for duress codes in particular.</p>
<p>A recent precedent-setting court case in Minnesota<label for="sn-id-whatever" class="margin-toggle sidenote-number"></label><input type="checkbox" id="sn-id-whatever" class="margin-toggle" /><span class="sidenote">Full court opinion here: <a href="http://www.mncourts.gov/mncourtsgov/media/Appellate/Court%20of%20Appeals/Holiday%20Opinions/OPa152075-011717.pdf">Minnesota V. Diamond</a> </span> held that fingerprints used for access control can be taken from a suspect without violating his Fifth Amendment rights. The logic of the decision, which I’m actually inclined to agree with, is that fingerprints are analogous to other physical evidence routinely taken from suspects in the course of an investigation, such as blood samples, handwriting samples, and voice recordings, all of which the Supreme Court has deemed <strong>not protected</strong> under the Fifth Amendment.</p>
<p>Orin Kerr has a great in-depth analysis of this decision <a href="https://www.washingtonpost.com/news/volokh-conspiracy/wp/2017/01/18/minnesota-court-on-the-fifth-amendment-and-compelling-fingerprints-to-unlock-a-phone/?utm_term=.6da68ccb4c35">here</a>, but the gist is that the courts have decided that fingerprints aren’t “testimonial,” and therefore aren’t protected under the Fifth Amendment.</p>
<p>There’s an interesting wrinkle to the case in that the defendant willingly told the police which finger would unlock the phone. Admittedly, the court could simply demand that the defendant provide all of his fingerprints and try each in turn. Taken to an extreme, this is not so different from arguing that the police have a right to try to crack the password of a device they’ve obtained legally; it just happens that the characters of the password are physical objects.<label for="sn-id-3" class="margin-toggle sidenote-number"></label><input type="checkbox" id="sn-id-3" class="margin-toggle" /><span class="sidenote">Well, in this case, the defendant’s fingers. </span></p>
<p>The good news is that other decisions have held that passwords are constitutionally protected. In the esoterically-named “In re Grand Jury Subpoena Duces Tecum”,<label for="sn-id-4" class="margin-toggle sidenote-number"></label><input type="checkbox" id="sn-id-4" class="margin-toggle" /><span class="sidenote">Specifically, <a href="http://stanford.edu/~jmayer/law696/week8/Compelled%20Password%20Disclosure%20(Eleventh%20Circuit).pdf">“In re Grand Jury Subpoena Duces Tecum”, 670 F.3d 1335 (11th Cir. 2012)</a> </span> it was held that traditional passwords <em>are</em> incriminating testimony, and therefore that defendants can plead the Fifth when asked to produce them.</p>
<p>However, the bad news is that hand-typed passwords are increasingly seen as a thing of the past; hardware tokens and biometric sensing are considered far more usable, and will likely be employed more and more in the future.
<label for="sn-id-5" class="margin-toggle sidenote-number"></label><input type="checkbox" id="sn-id-5" class="margin-toggle" /><span class="sidenote">Google <a href="https://www.theguardian.com/technology/2016/may/24/google-passwords-android">appears to be moving to hardware tokens and biometrics</a>, for instance, which are much more usable instruments. </span></p>
<h2 id="what-we-can-do-quickly-add-duress-codes">What We Can do Quickly: Add Duress Codes</h2>
<p>As mentioned earlier, a key observation from these court cases is that the police can compel you to hand over a fingerprint, but cannot order you to tell them which finger unlocks the device; that would be tantamount to ordering you to provide a passcode.</p>
<p>In the short term, Apple and Google can take steps to alleviate this threat by adding duress codes to their access control mechanisms. For instance, scanning anything but your right index finger might force a password-only lock. Scanning a pinky (or some other fingerprint or combination of fingerprints) might cause the phone to factory reset, or to unlock while triggering deletion of a specified portion of user data. Adding this functionality might take a few weeks of coding and months of UX research, but it could easily render the current constitutional quandary moot.</p>
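<p>The dispatch logic above is simple enough to sketch. The <code>Device</code> class, finger names, and actions below are hypothetical assumptions for illustration, not any vendor’s actual API; the point is only that duress handling is a small mapping layered on top of the existing match step.</p>

```python
# Hypothetical duress-code dispatch for a fingerprint sensor.
# Device, finger names, and actions are illustrative, not a real vendor API.
class Device:
    def __init__(self):
        self.state = "locked"

    def unlock(self):
        self.state = "unlocked"

    def factory_reset(self):
        self.state = "wiped"  # duress: destroy everything

    def wipe_marked_data(self):
        self.state = "unlocked_minus_marked"  # duress: delete flagged data, then unlock

    def require_passcode(self):
        self.state = "passcode_required"  # fall back to a Fifth-Amendment-protected secret

def handle_fingerprint(device: Device, finger_id: str) -> None:
    """Route a matched fingerprint to its (possibly duress) action."""
    action = {
        "right_index": device.unlock,            # the one "true" unlock finger
        "right_pinky": device.factory_reset,     # duress code
        "left_index": device.wipe_marked_data,   # duress code
    }.get(finger_id, device.require_passcode)    # any other finger: passcode only
    action()
```

<p>Crucially, every branch looks identical to an observer at scan time: the guard watching you press a finger can’t tell a duress code from the real unlock, which is exactly the property the court cases above make valuable.</p>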
<p>In the long term, we need to rethink access control and deploy deniability as a set of strategies for helping users evade coercion in general. Just as important, <em>all</em> devices must have some sort of deniability baked in, full stop. Adding deniable systems to a device only after its owner is targeted provides little protection to at-risk populations like journalists. If deniability isn’t baked into the operating system, the very fact that a journalist was running out-of-the-ordinary software, which may have undeniable tells of its own, would likely be a red flag and invite liberal use of the rubber hose.</p>
<p><em>– <a href="/~specter/">Mike Specter</a>, PhD candidate in computer science at MIT, with thanks to Danny Weitzner (principal research scientist), Jonathan Frankle (also a PhD candidate at MIT), and the rest of the <a href="https://internetpolicy.mit.edu/">Internet Policy Research Initiative</a></em></p>
<p><em>Published Tue, 24 Jan 2017. Original post: <a href="http://www.mit.edu/~specter/articles/17/deniability1">http://www.mit.edu/~specter/articles/17/deniability1</a></em></p>