Today, the vulnerable state of electronic communications security dominates headlines across the globe, while surveillance, money and power increasingly permeate the ‘cybersecurity’ policy arena. With the stakes so high, how should communications security be regulated? Deirdre Mulligan (UC Berkeley), Ashkan Soltani (independent, Washington Post), Ian Brown (Oxford) and Michel van Eeten (TU Delft) weighed in on this question at an expert panel on my doctoral project at the Amsterdam Information Influx conference. [Read more...]

As a computer scientist who studies Privacy-Enhancing Technologies, I remember my surprise when I first learned that some groups of people view and use them very differently than I’m used to. In computer science, PETs are used for protecting anonymity or confidentiality, often via application of cryptography, and are intended to be bullet-proof against an adversary who is trying to breach privacy.

By contrast, Helen Nissenbaum and others have developed a political and ethical theory of obfuscation [1], “a strategy for individuals, groups or communities to hide; to protect themselves; to protest or enact civil disobedience, especially in the context of monitoring, aggregated analysis, and profiling.” CV Dazzle and Ad Nauseam are good examples.

[This is a guest post by Wenley Tong, Sebastian Gold, Samuel Gichohi, Mihai Roman, and Jonathan Frankle, undergraduates in the Privacy Technologies seminar that I offered for the second time in Spring 2014. They did an excellent class project on the usability of email encryption.]

PGP and similar email encryption standards have existed since the early 1990s, yet even in the age of NSA surveillance and ubiquitous data-privacy concerns, we continue to send email in plain text. Researchers have attributed this apparent gaping hole in our security infrastructure to a deceptively simple source: usability. Email encryption, although cryptographically straightforward, appears too complicated for laypeople to understand. In our project, we aimed to understand why this problem has eluded researchers for well over a decade and to expand the design space of possible solutions to this and similar challenges at the intersection of security and usability.

Earlier this week, Felten made the observation that the government eavesdropping on Lavabit could be considered an insider attack against Lavabit users. This leads to the obvious question: how might we design an email system that’s resistant to such an attack? The sad answer is that we’ve had this technology for decades, but it never took off. Phil Zimmermann put out PGP in 1991. S/MIME-based PKI email encryption was widely supported by the late 1990s. So why didn’t it become ubiquitous? [Read more...]
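The cryptographic core that PGP introduced in 1991 is a hybrid scheme: encrypt the message body with a fresh symmetric session key, then wrap that session key with the recipient’s public key. The following is a minimal sketch of that structure in Python, using only the standard library; the textbook-RSA parameters and the SHA-256-based stream cipher are toy constructions chosen here for readability, and are completely insecure — real PGP uses full-size RSA or elliptic-curve keys and standardized symmetric ciphers.

```python
import hashlib
import secrets

# Toy RSA parameters for illustration only -- these small primes make the
# arithmetic easy to follow but offer no security whatsoever.
P, Q = 61, 53
N = P * Q                           # public modulus (3233)
E = 17                              # public exponent
D = pow(E, -1, (P - 1) * (Q - 1))   # private exponent (modular inverse)

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from the session key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(message: bytes) -> tuple[int, bytes]:
    """PGP-style hybrid encryption: fresh session key, wrapped with RSA."""
    session_key = secrets.randbelow(N - 2) + 2   # small enough for toy RSA
    wrapped_key = pow(session_key, E, N)         # "encrypt to the recipient"
    key_bytes = session_key.to_bytes(2, "big")
    body = bytes(m ^ k for m, k in zip(message, keystream(key_bytes, len(message))))
    return wrapped_key, body

def decrypt(wrapped_key: int, body: bytes) -> bytes:
    session_key = pow(wrapped_key, D, N)         # recover with the private key
    key_bytes = session_key.to_bytes(2, "big")
    return bytes(c ^ k for c, k in zip(body, keystream(key_bytes, len(body))))

wrapped, ciphertext = encrypt(b"meet me at noon")
assert decrypt(wrapped, ciphertext) == b"meet me at noon"
```

The point of the sketch is that none of this is hard: the hybrid structure fits in a page. What never got solved is everything around it — key distribution, key verification, and an interface that ordinary users can operate correctly.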

The big NSA revelation of last week was that the agency’s multifaceted strategy to read encrypted Internet traffic is generally successful. The story, from the New York Times and ProPublica, described NSA strategies ranging from the predictable—exploiting implementation flaws in some popular crypto products; to the widely-suspected but disappointing—inducing companies to insert backdoors into products; to the really disturbing—taking active steps to weaken public encryption standards. Dan wrote yesterday about how the NSA is defeating encryption.

To understand fully why the NSA’s actions are harmful, consider this sentence from the article:

Many users assume — or have been assured by Internet companies — that their data is safe from prying eyes, including those of the government, and the N.S.A. wants to keep it that way.

In security, the worst case—the thing you most want to avoid—is thinking you are secure when you’re not. And that’s exactly what the NSA seems to be trying to perpetuate. [Read more...]

Freedom to Tinker is hosted by Princeton's Center for Information Technology Policy, a research center that studies digital technologies in public life. Here you'll find comment and analysis from the digital frontier, written by the Center's faculty, students, and friends.