Don’t ask me what, but translating the works of others means something. And seeing as how I’ve been talking about PGP a bit lately, and seeing as how Mr. Bortzmeyer is so fucking important that he (even occasionally) sees it fit to write in an obscure language known mostly to Mauritians and Moroccans,i for your enlightenment and entertainment, let’s take this opportunity to translate (and annotate!) his latest: << Sécurité, facilité d’usage, et les utilisateurs non-informaticiens >>ii

The issue of implementing digital security solutions comes up regularly in the InfoSec field. If so few Internet users employ cryptographic tools and other protective digital security measures, it’s not merely because the techniques are too complex; it’s that users prefer taking risks rather than trying to control them.

The question keeps coming up, and it’s an old one. The first articleiii to point to the problem of usability was “Why Johnny Can’t Encrypt,” written by Alma Whitten and J.D. Tygar in… 1999.iv Little progress has been made in the debate since then, apart from a broad recognition, on the part of the InfoSec community, that usability is indeed crucial: the best security solutions in the world are useless if they’re too complicated to use.v To be sure, the Snowden revelations have reminded us again of the importance of protecting ourselves against online espionage. As a result, cryptography is reemerging as an object of interest, and also as a subject of critical debate. Two weeks ago, Matthew Greenvi wrote an article called “What’s the matter with PGP?” wherein he stated that this system is on the declinevii (note: this article rather confuses the GPG software with the OpenPGP standard, as well as maliciously and intentionally conflating key size and certificate size).viii

More recently, in France, Okhin at “Pas Sage en Seine” attacked InfoSec’ers with critical vigour, telling “the self-proclaimed greybeards of the internet, hacktivists, hackers, and sysadmins” that “it’s our egos that are broken, it’s our sociopathological nihilism reacting to a political and social problem.” This is a theme taken up by Barbayellow in “Sécurité: pourquoi ça ne marche pas”ix and by Numendil in “Chers nous…”x and also by Tris Acatrinei in “Une histoire de privilèges.”xi

I’ll let you read all these articles before reading mine.xii I won’t try to refute these “usability is essential for security” arguments because they’re accurate. The vast majority of successful attacks don’t come from a brilliant hack of a cryptographic algorithm; they come from user errors.xiii But I would like to discuss a few other notions, such as the claim that the problem is 100% the responsibility of programmers/hackers/IT staff, and that the main difficulty in using security software is the lack of interest in usability on the part of technicians.

Let’s start with the practical problem: proclaiming the need to improve software, particularly with regard to ergonomics, is necessary but not sufficient. It remains to be determined how we can improve on existing ergonomics. I’ve attentively read the articles mentioned above and I’ve found there to be far more criticism of developers (e.g. too proud, autistic, contemptuous, and privileged) than any discussion of substantive solutions. It must be said that this is not a trivial problem. Even an IT professional has a hard time configuring HTTPS on their web server, or using PGP (and what’s more, using it error-free every day).xiv It’s not just a matter of skills either: even with the basics in place, there isn’t enough time to work it through. As such, if we could create more accessible software, we wouldn’t have to spend an entire weekend generating a PGP key that meets the requirements of modern cryptography, and everyone would benefit.xv The question is “how?” Certain improvements are evident enough (e.g. better PGP defaults). Others, less so. I’d like to see fewer articles that say “IT professionals are meanies who make software hard to use.” (I’m not demanding; I’m not asking for finished software here.) So yes, today’s software could be improved considerably, but not everyone is convinced! What needs to be done now is to suggest specific improvements, which I haven’t yet seen.xvi Not that it isn’t a tough problem, far from yakafokon (“someone just oughta…”). The article “Why Johnny Can’t Encrypt,” cited above, thoroughly explains the user interface problems. (It explains equally well why the lack of a GUI is a non-issue: “All this failure is despite the fact that PGP 5.0 is attractive, with basic operations neatly represented by buttons with labels and icons, and pull-down menus for the rest.”)

For an example of the difficulty in conceiving a simple and efficient user interface, take the biggest problem in cryptography: key management.xvii One of the difficulties is that we retrieve public PGP keys from keyservers, validate them (an essential operation), and then store them. These complex operations are difficult to explain and it isn’t hard for mistakes to be made (e.g. accepting a key without checking its validity). OTR functions on a similar principle, except that, unlike PGP, it’s limited to synchronous communication and we don’t have to retrieve the other party’s key. Can we do better? It all depends on what one is willing to give up in exchange. In a Twitter discussion, Okhin sang the praises of the X.509 key model (used in TLS and thus HTTPS) for its simplicity: the user doesn’t have to validate anything. But Okhin forgot to mentionxviii that the price for this is the complete outsourcing of security to certificate authorities that are… pretty much worthless. If you want to protect your privacy, having to trust these companies isn’t the best idea in the world. The solutions to the problem of key management include SSH and its TOFU (Trust On First Use) principle. It’s amusing that SSH is so rarely mentioned by those wanting to make software easier to use. SSH is successful precisely because of its user interface, which is barely harder to use than that of the insecure telnet it replaced. If we want an example of successful user interface design, we should pay attention not to TLS or OTR but to the significant success of SSH. Though, even there, nothing is perfect. SSH is easy with TOFU: the first time connecting to a new server, the key is verified (in theory…) and it’s remembered going forward. In principle, this design is sound enough (and a reasonable compromise between usability and security) but it’s still inconvenient: if a vulnerability is discovered after that first use, it’s difficult to change the keys…
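The TOFU flow SSH popularized can be sketched in a few lines of Python. This is a toy illustration, not SSH’s actual mechanism: the JSON store, file name, and function names are invented for the example, standing in for `~/.ssh/known_hosts` and the real key exchange.

```python
# Toy sketch of Trust On First Use (TOFU), the principle behind SSH's
# known_hosts. Hypothetical store and helper names, for illustration only.
import hashlib
import json
import os

STORE = "known_hosts.json"  # stand-in for ~/.ssh/known_hosts

def fingerprint(public_key: bytes) -> str:
    # SSH shows the user a hash of the host key, not the key itself.
    return hashlib.sha256(public_key).hexdigest()

def check_host(host: str, public_key: bytes, store_path: str = STORE) -> str:
    """Return 'first-use', 'match', or 'MISMATCH' for a host's key."""
    known = {}
    if os.path.exists(store_path):
        with open(store_path) as f:
            known = json.load(f)
    fp = fingerprint(public_key)
    if host not in known:
        # First contact: remember the key. The user is supposed to
        # verify the fingerprint out-of-band here... in theory.
        known[host] = fp
        with open(store_path, "w") as f:
            json.dump(known, f)
        return "first-use"
    return "match" if known[host] == fp else "MISMATCH"
```

The inconvenience described above falls out of the last line: a legitimately rotated key and an attacker’s key both come back as `MISMATCH`, and the software can’t tell the difference on its own.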

Besides the notion of “compromise,” there’s another thing essential to digital security, if rarely mentioned by the writers who dream of simpler software, and that’s the oft-forgotten concept of education. In general, people make sweeping generalizations like “you don’t have to be an engineer to drive a car.” But that’s an illusion: you still need to learn to drive a car before you get your license. Not that you have to be an engineer, to be sure, but drivers have had to learn a lot; it’s just that this knowledge is now so commonplace that we no longer notice it (see the hilarious text “What if people bought cars like they bought computers?”). In a society where more and more things depend on computers, the notion that modern tools must be accessible without education is an illusion.xix On the contrary, we must develop digital literacy, which implies a basic level of security. Learning this literacy requires efforts on both sides. Demanding computers that require no training or effortxx will only maintain the damaging illusions.

When Barbayellow wrote: “[The job of journalists] isn’t to know how computers, servers, or even the Internet work,” he was mistaken. Today, so much of human activity (including journalism) requires the Internet that we must understand how it functions. Saying that journalists needn’t understand the Internet is like saying “knowing the law is not our business” when most professions today require some degree of legal literacy. And saying that you don’t need to understand the Internet in 2014 is like saying that you don’t need to learn to read to be a publishing professional in 1450.

Fin.

___ ___ ___

It looks like I may be heading to Morocco in early 2015. Let me know if you’ve any restaurants to recommend. [↩]

That is, “Security, ease-of-use, and non-computer users.” Feel free to bring up corrections in the comments section or on #bitcoin-assets. [↩]

It was the first, biggest, bestest, neh? Seeing as how PGP was already 7 years old at this point, it was likely no more the first than a CoinDesk article. [↩]

Here, Bortzy had the audacity to link to the French Retardopedia page of the year 1999. For fucking cereal. I will repeat no such idiocy. [↩]

This is, of course, patent nonsense, and points rather to the need for more capable people within an organization. Digital security isn’t about being “easy” or “better than nothing,” it’s about making attacks expensive. That’s it. [↩]

Some derp at Johns Hopkins University, who calls himself a “Cryptographer and Research Professor” instead of the more apt “USG mole.” [↩]

“This system” is essentially the WoT, and contrary to Green’s imaginings, it’s getting stronger by the day. [↩]

To the surprise of exactly no one. They even have Core Dev/Power Ranger Jeff Garzik trying to take PGP/GPG down a notch with his “please to in-browser” brain damage, as if anyone gives a fuck what the Core Devs think after they flung their Heartbleed shit at the wall. And missed. [↩]

15 thoughts on “InfoSec Education: Because Stéphane Bortzmeyer Is Lazy. And I’m Not.”

Thanks for the nice translation. I hope it will bring me more readers from the parts of the world that still do not speak French :-)

There is a misunderstanding in “the lack on interest in technician usability”. I was talking about the (supposed) lack of interest of the technical people for usability, not about the usability of technicians.

This proposition that the security of secure programs can be improved by making them “easier” or “more convenient” to use by the sorts of users that can’t be arsed to use them in the first place is pure nonsense. It proceeds from the nonfactual and unwelcome presumption that one user = another user, and from there it draws fundamentally broken inferences such as “50% of users being owned is worse than 5% of users being owned”.

The idiocy of this last example is easy to make obvious: if in situation A the program can be used securely but 50% of users do not bother, whereas in situation B the program cannot be used securely in 5% of cases no matter what but everyone gets the rest right, then clearly B is a broken implementation that shouldn’t exist and A is the right thing.

Forget about the users that don’t know how to use software, don’t want to learn how to use software, intentionally lie about their refusal to learn by misrepresenting it as “not having time”* and all the rest of the typical things poor/stupid people do. The point of existence is not and can not be catering to the shit draining through the holes in the floor. There’s absolutely no reason that the consumer should have anything nice. Ever. Fuck him, let him die in a pool of his own vomit.

—
* Somehow they “have time” to fill gigabytes with pornography and keep the world apprised of their ever so valuable, shockingly predictable ideation on half a dozen supposedly different “social media” platforms. They just… you know, don’t have the time to do anything useful is all.

[…] still be no match for the fallibility of our own human nature. Even if we’re completely digitally literate, there’s a fair to middling chance that our opsec is dependent on people who are anything […]

[…] immunogens this year, while slightly expanding its ranks. 2. PGP/GPG: despite malicious attempts at misinformation, the use of the gold standard in encrypted communication and digital identity confirmation grew […]