This is not a novel suspicion nor an exotic point of view within the actual[i] computer security community. Nevertheless I wouldn't have said anything, because honestly... who cares. Well-meaning people keep recommending[ii] his stuff to me, however, which is getting annoying. I would like a ready reference so I can explain my position with a single link in the future, so here we go.

Since I started working with Snowden's documents, I have been using a number of tools to try to stay secure from the NSA. The advice I shared included using Tor, preferring certain cryptography over others, and using public-domain encryption wherever possible.

The recommendation to still use Tor[iii] is already a declaration of affiliation. Which, whatever... everyone's free to choose a camp in this world.

In principle preferring certain types of cryptography over others is sound. The problem seems to be that Schneier prefers exactly the wrong types (ECC). I don't know that there's anything more suspicious in all crypto currently than ECC, and while suspicion and educated guesswork might not interest you, the fact still remains that ECC is, historically, the most frequently attacked by the NSA. Review the record.

Public domain encryption is probably the correct choice, but it should not in any case be regarded as ironclad. The "million eyes" theory is mostly a myth[iv], as well exemplified by the Bitcoin codebase. While perfectly open, and while offering clever folks a great financial incentive to find flaws, it nevertheless took two years before anyone read or comprehended the database code it was relying on to a degree sufficient to understand how it was being initialised. And it took a massive fork that nearly killed the project back in March to get people even looking at that part of the code. Thus your notion that public domain software has already been reviewed by countless experts at no cost or expense to you may well be misplaced.

I also recommended using an air gap, which physically isolates a computer or local network of computers from the Internet. (The name comes from the literal gap of air between the computer and the Internet; the word predates wireless networks.)

This is a great recommendation. Air gaps are an essential building block of COMSEC, or at least of any COMSEC that actually works.

Air gaps might be conceptually simple, but they're hard to maintain in practice. The truth is that nobody wants a computer that never receives files from the Internet and never sends files out into the Internet. What they want is a computer that's not directly connected to the Internet, albeit with some secure way of moving files on and off.

This is also largely correct, minus the insistence on "files". You can have perfectly functional airgaps through the use of a QR reader gun on each end. I know at least some MPEx users employ this method to pass signed orders to their live connection and receive responses back to their airgapped machine. It's a really neat solution that involves no files whatsoever - and in fact I would hold that filelessness to be a strong point of the entire system - darned hard to hack a QR gun via a QR code.

The intervening considerations are also correct, sure, Osama used one, all that jazz.

And air gaps have been breached. Stuxnet was a US and Israeli military-grade piece of malware that attacked the Natanz nuclear plant in Iran. It successfully jumped the air gap and penetrated the Natanz network. Another piece of malware named agent.btz, probably Chinese in origin, successfully jumped the air gap protecting US military networks.

This, to the best of my knowledge, is factually incorrect. While it's possible Schneier knows something I don't, as far as I know Stuxnet did not cross any airgap. It was carelessly spread by poor security procedures. I suppose you can take the hardline that "if there's an airgap at any place in the design and stuff nevertheless gets across you may claim the airgap was breached, even if it was in fact the techs bypassing the airgap". This argument is perfectly solid in some contexts, such as in the post-mortem analysis of the defeated system. This argument is utterly facetious in a general discussion of airgaps, about akin to saying that "passwords offer no security because in at least one case an employee kept a post-it with his passwords on his monitor and so the workstation was trivially subverted". Obviously that subversion didn't involve password security, it was a social engineering attack through and through[v].

Agent.btz on the other hand definitely did not cross any airgaps, it simply benefited from the atrociously careless and irresponsible behaviours of contemporary US SIGINT people.[vi]

Back to reality : breaching an airgap is insanely difficult, and - much like RSA factorisation - an unsolved problem.

Since working with Snowden's NSA files, I have tried to maintain a single air-gapped computer. It turned out to be harder than I expected, and I have ten rules for anyone trying to do the same:

1. When you set up your computer, connect it to the Internet as little as possible.

Any piece of hardware that was ever at any point in its life connected to the Internet can no longer be used as part of an airgapped system. Period.

And since we're on the topic, any girl that has been having as little sex as possible while taking great care for every round to last less than thirty seconds still isn't a virgin. Not no more. Funny how that works, and weird that I should have to point this out.

It's impossible to completely avoid connecting the computer to the Internet

For one, there existed computers before there was an Internet. They worked just fine - heck, in many cases they worked a lot better than the current Windows / Cloud crapola.

I purchased my computer off-the-shelf in a big box store

This is a good idea.

then went to a friend's network and downloaded everything I needed in a single session

This is a horribly bad, retarded idea. For one, the "friend" betrays gross unfamiliarity with the principled core of COMSEC. That is to say: if you use a third party, it's either an unrelated third party, or else it's related for a very good reason that's part of the objectives. Thus, using your own computer or using a random wireless connection are the available options. Using the customer's connection, if you're working for one and they want you to, may also be fine. Using "a friend", i.e. a related entity that's related in a way neither related to your project nor justified by it, is exactly, but I say exactly, the hole through which the night comes in. Just go wardriving already.

For the other, you really don't need to download anything on your airgapped computer. You can always download your favourite Linux CD distro, checksum it, burn it and live a happy life subsequently. And if even for a moment you believe yourself doing COMSEC on Windows computers you really have to re-read footnote #5 above.
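The checksum step is a one-liner, for the avoidance of doubt. A sketch below, where "distro.iso" and the generated SHA256SUMS are stand-ins; in real life the checksum file comes to you over a channel you trust independently of the download mirror (and ideally carries a signature you verify):

```shell
# Sketch of the verify-before-burn step. "distro.iso" stands in for the
# real image; in practice SHA256SUMS is obtained from a trusted channel,
# not generated locally and not fetched from the same mirror as the ISO.
printf 'pretend-iso-contents' > distro.iso
sha256sum distro.iso > SHA256SUMS        # in reality: obtained, not generated
sha256sum -c SHA256SUMS && echo "checksum OK, safe to burn"
```

If the check fails, you throw the image away and start over; you do not burn it "just to see".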

The ultra-paranoid way to do this is to buy two identical computers, configure one using the above method, upload the results to a cloud-based anti-virus checker, and transfer the results of that to the air gap machine using a one-way process.

This is so stupid a plan it beggars belief. For one, anything "cloud-based" is about as antithetical to what we're doing here as it gets; I'm beginning to suspect a certain three letter agency has a deal with the various cloud operators to somehow diddle everything on the go. There's just no way to explain why anyone would be proposing this cargo cult voodoo otherwise. Be that as it may: do not, under any circumstances and for any reason, involve "the cloud" in your airgapping efforts. It does not belong there.

And for that matter, antiviruses are a myth. They roughly work in the same manner US budget ceilings work, their only security implication being that if you're in a space that uses antiviruses you know you're in a space that's never going to be secure.

2. Install the minimum software set you need to do your job, and disable all operating system services that you won't need. The less software you install, the less an attacker has available to exploit.

This is good advice.

I downloaded and installed OpenOffice, a PDF reader, a text editor, TrueCrypt, and BleachBit.

This is patent nonsense, antithetical to the aforementioned good advice. For one, OpenOffice is a dead project, and has been for a long, long time. This means they're no longer keeping up with the world. For the other, a PDF reader is in no conceivable way needed on an airgapped machine, ever. Moreover, the PDF format is so fucking dangerous that it, together with Windows-style WYSIWYG editors and their braindamaged attempts to become emacs through misimplementing macros etc., creates most of the ownage cases that don't involve a browser. Get rid of them; you don't need bulleted lists and PowerPoint presentations on your airgapped machine. Airgap it from stupid too, since you're going to all this trouble.

(No, I don't have any inside knowledge about TrueCrypt, and there's a lot about it that makes me suspicious. But for Windows full-disk encryption it's that, Microsoft's BitLocker, or Symantec's PGPDisk -- and I am more worried about large US corporations being pressured by the NSA than I am about TrueCrypt.)

Windows full disk encryption does not work. At all. It's pure snake oil.

Linux full disk encryption does not really work, for that matter. If the sorts of problems that full disk encryption is intended to resolve actually are security considerations for you, the correct approach is something akin to keeping a half gallon bottle of oil of vitriol close by at all times and the hard drive exposed and within reach, because no "full disk encryption" will keep your data safe from an attacker with physical access to your full disk and enough time and gadgetry[vii] on their hands.

The correct solution for the vast majority of applications is, of course, to simply gpg whatever you want to keep secret and forget about it. A properly made passphrase will keep your attacker sucking air out of a can indefinitely.
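Concretely, and assuming GnuPG 2.x (file names and the passphrase here are purely illustrative), the whole ritual is:

```shell
# Minimal sketch, assuming GnuPG 2.x; names and passphrase illustrative.
# --pinentry-mode loopback lets --passphrase work non-interactively here;
# at a real keyboard you'd simply run `gpg --symmetric notes.txt` and
# type the passphrase when prompted.
printf 'keep this secret' > notes.txt
gpg --batch --yes --pinentry-mode loopback \
    --passphrase 'long diceware style phrase' \
    --symmetric --armor -o notes.txt.asc notes.txt
# Round-trip it back out:
gpg --batch --quiet --pinentry-mode loopback \
    --passphrase 'long diceware style phrase' \
    --decrypt notes.txt.asc
```

The armored .asc output is plain text, which incidentally also moves over a QR gun just fine.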

3. Once you have your computer configured, never directly connect it to the Internet again. Consider physically disabling the wireless capability, so it doesn't get turned on by accident.

For that matter, consider driving a nail through the wireless controller.

If you need to install new software

You don't. Ever. An airgapped system may never be upgraded no matter what happens, come hell or high water. It's salted and brined as is.

If you absolutely must have some spanking new feature, you're making a new airgapped machine. If making a new airgapped machine isn't worth the hassle you don't need whatever feature.

Turn off all autorun features. This should be standard practice for all the computers you own, but it's especially important for an air-gapped computer. Agent.btz used autorun to infect US military computers.

Right. So much derp & lol in here I don't even.

Minimize the amount of executable code you move onto the air-gapped computer. Text files are best. Microsoft Office files and PDFs are more dangerous, since they might have embedded macros. Turn off all macro capabilities you can on the air-gapped computer. Don't worry too much about patching your system; in general, the risk of the executable code is worse than the risk of not having your patches up to date. You're not on the Internet, after all.

Just get rid of the damned Microsoft stuff, and of the damned "office" crap.

Only use trusted media to move files on and off air-gapped computers. A USB stick you purchase from a store is safer than one given to you by someone you don't know -- or one you find in a parking lot.

This is true, even if the file obsession is a little out of place. Seriously, the world is not made out of files; the world is made out of data streams.

For file transfer, a writable optical disk (CD or DVD) is safer than a USB stick. Malware can silently write data to a USB stick, but it can't spin the CD-R up to 1000 rpm without your noticing. This means that the malware can only write to the disk when you write to the disk. You can also verify how much data has been written to the CD by physically checking the back of it. If you've only written one file, but it looks like three-quarters of the CD was burned, you have a problem. Note: the first company to market a USB stick with a light that indicates a write operation -- not read or write; I've got one of those -- wins a prize.

This is pure voodoo, seriously. So it can only write when you write, what sort of defense is that ? You will write, won't you ? You've written "one file" and 3/4 of the CD appears burned to visual inspection ? What's this, an application of the "64kb files should be enough for everyone" principle in tandem with the "Holy Mary Full of Grace" approach to security/birth control ?

Visual inspection of a CD is about as likely to catch that three and a half kb piece of malware as TSA screening is to catch terrorists. Forget about nonsense ritual and do things that have an impact on actual security rather than your subjective impression of security; stroking the latter is how the former gets compromised. Fucking CD augury, I can not believe this.

What if the CD is sorta greenish ? Is that bad ? Cause you know, green is for poison, maybe we should only use blue tinted CDs. Those are cool, right ?

When moving files on and off your air-gapped computer, use the absolute smallest storage device you can. And fill up the entire device with random files. If an air-gapped computer is compromised, the malware is going to try to sneak data off it using that media. While malware can easily hide stolen files from you, it can't break the laws of physics.

Nor can it rewrite anything. It is a point of fact that no malware ever rewrites anything. Like, ever. At all. It just starts new processes and creates new "files" for everything. Always. It's in the rules, when Schneier's God created Schneier's Enchanted Universe of Strangely Insecure Computer Security, he made this a rule of Schphysics.

Consider encrypting everything you move on and off the air-gapped computer. Sometimes you'll be moving public files and it won't matter, but sometimes you won't be, and it will. And if you're using optical media, those disks will be impossible to erase. Strong encryption solves these problems. And don't forget to encrypt the computer as well; whole-disk encryption is the best.

Whole disk encryption is retarded in all those implementations which require the system to have the keys in order to use that disk. Strong encryption of individual files is always a good idea. That strong encryption means one thing and one thing only : gpg --encrypt --armor -r yourkey

One thing I didn't do, although it's worth considering, is use a stateless operating system like Tails. You can configure Tails with a persistent volume to save your data, but no operating system changes are ever saved. Booting Tails from a read-only DVD -- you can keep your data on an encrypted USB stick -- is even more secure. Of course, this is not foolproof, but it greatly reduces the potential avenues for attack.

I have no idea about Tails. Maybe. It's in general a bad idea to make exotic systems your team doesn't understand well central to your security. Debian Sarge is, for this reason, a much better choice than whatever Tails.

Yes, all this is advice for the paranoid

Sadly, a large chunk of it is advice for the delusional. They do go together, delusional and paranoid, but not to any sort of useful effect.

And it's probably impossible to enforce for any network more complicated than a single computer with a single user. But if you're thinking about setting up an air-gapped computer, you already believe that some very powerful attackers are after you personally. If you're going to use an air gap, use it properly.

Security is never "probably impossible". Anything you wish to do is quite possible, just as long as you don't expect Windows, whole disk encryption and the cloud to do it for you. That aside, yes, if you're going to the trouble of doing anything, anything whatsoever, do it properly. That includes not relying on Windows etc.

Of course you can take things further. I have met people who have physically removed the camera, microphone, and wireless capability altogether. But that's too much paranoia for me right now.

I've not met any people who haven't. If you use a laptop for the purpose of being your cold computer, you definitely want to rape the cam and the mic (some models come with a 2nd mic hidden in the chassis btw). And drive a nail through the fucking wireless controller.

In closing, I would like to stress that none of this negates Schneier's past accomplishments. Applied Cryptography is still a great book. It's just that I don't trust him, not anymore.

———

[i] As distinguished from people who read and perhaps write blogs on the topic of computer security. [↩]

[iii] That contrary to planted disinformation, of which the Guardian article is a fine example, the NSA has complete and unlimited, instantaneous access to any and all information passed through the TOR network in its entirety, as a matter of course and by design.

That article even includes an oblique reference to Schneier in footnote 3. [↩]

[iv] Roughly a restatement of the well-known Parkinson law of triviality. While it's true that banal parts of the code garner a lot of posturing and discussion (chiefly from people desperate for a little bit of spotlight), it's equally true that the "obscure", unsexy parts of the code are passed over by everyone, in the hollow faith that "someone else", "everyone else" will not be quite as "clever". Seriously, that's the thought process: "I will pass over this function here because reading it is unlikely to make me famous and everyone else has read it anyway. Because everyone else is not at all likely to think the same way, because I am a unique snowflake of brilliance with my own, personal and independent ideas, quite unlike everyone else." [↩]

[vi] Since we're on the topic : the French were pretty much isolated in the NATO SIGINT community of the Cold War days, because everyone suspected they'd leak whatever they were given. And so as long as there were French officers in the room, nobody discussed any serious intelligence.

Let it be clearly stated that while on occasion they did have trouble keeping a lid on things, it was rare that they actually lost anything. By contrast, the Americans of the time were regarded as grossly incompetent, and with few and far between exceptions basically worthless as field agents. Nevertheless, they did have deep pockets, a propensity for cool gadgets and a solid capacity for keeping data safe. Consequently, they were accepted as equals by their more able European counterparts, the people who made it their business to confront the devilishly competent Russians and the even worse Romanians on a daily basis.

And yet here we are, the US is leaking reams of data on a yearly basis. Derpies, you're worse than the French ever were by orders of magnitude! Wake up, do something, you're becoming a laughingstock. There are fucking African states with a much better grasp on things, what the hell are you doing! [↩]

In principle preferring certain types of cryptography over others is sound. The problem seems to be that Schneier prefers exactly the wrong types (ECC). I don’t know that there’s anything more suspicious in all crypto currently than ECC, and while suspicion and educated guesswork might not interest you, the fact still remains that ECC is, historically, the most frequently attacked by the NSA. Review the record.

That's pretty funny, 'cause he admitted ECC has problems in the recent past. To quote:

Breakthroughs in factoring have occurred regularly over the past several decades, allowing us to break ever-larger public keys. Much of the public-key cryptography we use today involves elliptic curves, something that is even more ripe for mathematical breakthroughs. It is not unreasonable to assume that the NSA has some techniques in this area that we in the academic world do not. Certainly the fact that the NSA is pushing elliptic-curve cryptography is some indication that it can break them more easily.

Although honestly, I still believe the math behind it is good and the issue is more related to actual implementations (even the open source ones, that is).

Lol @the fabulous day when the media darling S I Tiulpanov has to engage in damage control because of your articles. Such deliciously careful avoidance, such pain, such horror to be read between the lines. You've come a long way, influencer.

Did you actually bother to read the entire thread on that subject ? There were a score of subject matter experts pointing out the (many) flaws in Bruce's method and proposing alternative solutions that are much more secure. Bruce is a cryptographer, not a COMSEC specialist. He came up with something that with his understanding of the matter would already provide better protection than nothing, undoubtedly looking for comments and tips from those who are more knowledgeable in that particular field than he is.

That's what we do on Bruce's blog: we learn from each other. If you find that approach wrong, or a reason for suspecting him to be an NSA agent, feel free by all means, but then in essence you haven't quite understood the culture of his blog.

And you have understood this "culture of his blog" ? How do I know this, how do you know this ?

In the somewhat more widespread culture as a general concept, the principle is that you write guides on subjects you understand, and lists of questions on subjects you don't understand.

But to get practical : suppose we somehow share a meal in a restaurant. Suppose I owe 125 and you owe 93. Suppose the waitress adds this up to 345. I am definitely going to tell her that stealing isn't a way to treat customers. Are you going to tell me that rather than reprimanding the clueless waitress I should try to understand some putative "culture of clueless waitresses" ? 'Cause if you do I'm just going to imagine that a white knight made out of pure sexual frustration is bubbling up inside.

I found this article interesting and useful as a counterpoint to Bruce's original article. Thank you for taking the time to go through it in such a detailed way.

One thing though - was it really necessary to use the phrase "you definitely want to rape the cam and the mic"? I find the casual use of the word "rape" very jarring and it could easily be replaced with "physically damage" or "disable".

Hi MP! I am not sure if Bruce is somehow collaborating with the US govt but I suspect that he is not - at least willingly - given his long held views. That said I often find issue with some of his advice.

Re full disk encryption: FDE can be useful, but you must know its limitations. It augments, not replaces, tools like PGP. The biggest threat to FDE is cold boot attacks. That said, if you wanted to stop a thief from reading the contents of your laptop's HDD, stolen while powered off, it is a valid enough option. Another benefit of FDE is that it makes quick pseudoerasure possible. Where you need to nuke a disk but only have a few moments, actual erasure tools like ATA_SECURE_ERASE or block-level zeroing take hours; with FDE you can instead target only the blocks which contain the key. Most FDE tools don't directly encrypt your data with your passphrase; they use the passphrase to encrypt a metadata block (or blocks) containing the actual key, with a stretching algorithm like PBKDF2 applied to the passphrase in an attempt to slow down brute forcing. So by erasing just the area where the on-disk key is stored - usually two locations, for redundancy - you leave the data unreadable.

Why else is FDE advantageous? As a safety net in case you are leaking cleartext onto the disk somewhere. This happens more than most people believe, especially on Windows (!) where the volume shadow copy may back up something you have later encrypted and believed to have scrubbed. Ideally you would use a ramdisk or tmpfs volume to put your files in "limbo" when you decrypt them. At least you can then power down the machine when done and be confident that there won't be any remanence after a temperature-dependent period of time has elapsed. Of course this benefit is negated if you use, say, an office program that makes a temporary copy elsewhere on the HDD without your knowledge.

Re Windows - I find it shocking that Bruce trusts Windows. I don't know how anyone "in the know" could have any confidence in Windows or any commercial closed source software for that matter.

The only correct way to carry out FDE is: in hardware. And not in a boot loader writable to by a potentially-diddled OS, and not in a disk manufacturer's "mystery meat" HDD controller. But rather in hardware which you personally install (and, ideally, understand the design of.) In hardware whose function cannot be altered via anything running on the machine, under any circumstances.

Likewise, the keyboard is a piss-poor means of entering credentials for disk decryption, unless said keyboard is connected directly to the above.

most FDE tools don’t directly encrypt your data with your passphrase but instead use the passphrase to encrypt a metadata block/s that contain the actual key

This is actually true of gpg too: it doesn't RSA-encrypt the data itself, only the key for a symmetric cipher it then uses across the data.
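One can watch this structure directly with gpg's own packet dumper; a sketch against a symmetric-mode file for brevity (file name and passphrase illustrative), where the wrapped session key is equally visible:

```shell
# Sketch, assuming GnuPG 2.x; name and passphrase are illustrative.
# --list-packets dumps the OpenPGP packet structure: a session-key
# packet (the wrapped cipher key) followed by the encrypted-data
# packet. With public-key mode a "pubkey enc packet" appears instead.
printf 'payload' > msg.txt
gpg --batch --yes --pinentry-mode loopback --passphrase 'pw' \
    --symmetric -o msg.gpg msg.txt
gpg --batch --pinentry-mode loopback --passphrase 'pw' \
    --list-packets msg.gpg
```

The payload itself never appears under RSA; only the session key does.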

Anyway by erasing just the area where the on disk key is stored - and it is usually stored in two locations for redundancy

Or, for all you know, three. Because that's the deal the producer reached with the NSA, have the data stored in a third, secret spot. Herp.

The moral being, of course, that using tools in ways they were not specifically designed to be used may be productive, and indeed much of what IT is about, but is not particularly secure. In this case, the presumption that you may use disk encryption as proxy disk deletion relies on assumptions about the software that may or may not have been satisfied by the provider. The ease and comfort with which this sort of mistake is made is why security is and will remain a specialist field rather than a sort of popular mechanics.

Windows (!) where the volume shadow copy may

Footnote #5!

Likewise, the keyboard is a piss-poor means of entering credentials for disk decryption, unless said keyboard is connected directly to the above.

Stanislav: even if the keyboard is directly connected you may still be vulnerable to, uh, side-channel attacks. Case in point: a recent paper where they were able to determine 80% of entered keystrokes through the sound differences between key presses via a hidden mic. Another spoke of using compromising EM emanations from a PIN pad to achieve a similar result. Re FDE - yes, I agree with you, but there are certain applications where it has some utility. Not a panacea, of course, but another layer in the onion that is defense in depth.

MP: re gpg - similar concept, but gpg is a hybrid cryptosystem that uses asymmetric encryption to guard the symmetric cipher's key. Most FDE implementations (like, say, Android's modified dm-crypt) use a symmetric cipher to guard another symmetric key. The primary reason they do this is to allow quick "password" changes. Of course the actual on-disk key never changes.

I take your point though. Using many of these tools "as advertised" is a recipe for disaster.
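That wrap-the-key indirection is easy to sketch with openssl (all names, keys and passphrases below are made up for the demo, and assume OpenSSL 1.1.1+ for -pbkdf2): the bulk data is encrypted once under a random master key, the passphrase only guards a wrapped copy of that key, so a "password change" rewrites a few dozen bytes while the data ciphertext never moves.

```shell
# Sketch of the key-wrapping indirection; names/passphrases illustrative.
# Bulk data is encrypted once under a random, never-changing master key:
openssl rand -hex 32 > master.key
printf 'bulk disk contents' | openssl enc -aes-256-ctr \
    -K "$(cat master.key)" -iv 00000000000000000000000000000000 -out data.enc
# The passphrase only guards a wrapped copy of that master key:
openssl enc -aes-256-cbc -pbkdf2 -pass pass:old-passphrase \
    -in master.key -out master.key.wrapped
# "Password change" = unwrap with the old passphrase, rewrap with the
# new one. Only ~100 bytes are rewritten; data.enc is untouched:
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:old-passphrase \
    -in master.key.wrapped -out master.key.tmp
openssl enc -aes-256-cbc -pbkdf2 -pass pass:new-passphrase \
    -in master.key.tmp -out master.key.wrapped
rm master.key.tmp
# Data still recoverable via the rewrapped key and the new passphrase:
openssl enc -d -aes-256-ctr \
    -K "$(openssl enc -d -aes-256-cbc -pbkdf2 \
          -pass pass:new-passphrase -in master.key.wrapped)" \
    -iv 00000000000000000000000000000000 -in data.enc
```

It also makes the pseudoerasure point above concrete: destroy every copy of master.key.wrapped and data.enc is permanently opaque.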


GPG to encrypt files is the only correct way. For any filesystem or files you want to encrypt, use an AEAD (Authenticated Encryption with Associated Data) scheme, like GCM or Salsa20+Poly1305. GPG is one of the easiest cryptographic tools to use that provides high-level functionality for doing authenticated encryption. Lots of people are backing up LUKS containers to the cloud without realizing how truly shitty that is for anything except FDE on physical drives.

Ferguson points out, in his objection to NIST standardizing XTS, how it lacks integrity checking, and how an attacker observing block offsets into a given sector can collect useful information as those blocks change, enabling code modification attacks. (Scenario: somebody backs up a TrueCrypt container to Dropbox daily, or a system image like a VM snapshot encrypted with AES-256-XTS.) The adversary can deduce where certain binaries to attack are in the container, then randomize a 16-byte chunk of it and brute force that little chunk instead of a full 128-bit+ key. Now they can secretly alter the binary, creating a backdoor or anything they want, and wait for you to download and run it again. Keeping the ciphertext on physical hardware you control prevents this attack, which is why XTS should only ever be used on a physical device.
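The malleability half of that objection is trivially demonstrable with any unauthenticated mode; a sketch with AES-CTR standing in for XTS (key, IV and file names are demo values, not a recommendation):

```shell
# Sketch: bit-flipping against unauthenticated encryption (AES-CTR
# standing in for XTS; key/IV/names are demo values). Flipping one
# ciphertext bit flips the same plaintext bit, and decryption succeeds
# without complaint -- there is no integrity check to fail.
K=000102030405060708090a0b0c0d0e0f000102030405060708090a0b0c0d0e0f
IV=000102030405060708090a0b0c0d0e0f
printf 'transfer $100' > msg.txt
openssl enc -aes-256-ctr -K "$K" -iv "$IV" -in msg.txt -out msg.enc
# Attacker flips the low bit of ciphertext byte 10 -- no key needed:
b=$(od -An -tu1 -j10 -N1 msg.enc | tr -d ' ')
printf "$(printf '\\%03o' $((b ^ 1)))" \
    | dd of=msg.enc bs=1 seek=10 conv=notrunc 2>/dev/null
openssl enc -d -aes-256-ctr -K "$K" -iv "$IV" -in msg.enc
# prints: transfer $000
```

An AEAD mode would instead reject the tampered ciphertext outright at decryption time.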

Schneier, who is not an agent, co-created a tweakable block cipher that defeats this attack and submitted it to the NIST SHA-3 competition, where it was a finalist candidate (Skein, built on Threefish). Instead of it being possible to tamper undetected as in XTS, his cipher encrypts every block of every sector under its own tweak, so ciphertext cannot be moved around, nor reveal itself to an attacker through repeated ciphertext blocks on different parts of the backup/drive.

So if you want to back up an entire container or snapshot, find a Skein/Threefish implementation and use that instead. Use GPG for everything else, or LibreSSL AEAD stream ciphers like ChaCha20-Poly1305 or AES-GCM.

Also, not all ECC is created alike. D. J. Bernstein's Curve25519 was specifically designed against timing attacks and the problems of the other curves, which are mostly proprietary. His NaCl library is basically the most solid crypto blackbox that exists next to GPG.