One of the more valuable tools in the arsenal of computer security is the Airgapped Machine. [i]

Thirty years ago everyone's machine was airgapped by default, there not being an Internet to speak of. Viruses for MS-DOS still existed, spreading from one machine to the next through the services of rewritable media [ii], such as flexible diskettes. It is then incorrect to assume that an airgapped machine is immune to all threats by simple virtue of the gap. Airgapping a machine is just one step towards a solution for a particular problem, and by no means a complete cure.

That problem is the machine reading and writing outside of the control of its operator [iii]. Removing the ability to Internet is an Alexandrine approach to the Gordian knot problem of how to properly secure Internet communication so that the machine actually obeys its owner - the simple fact of the matter is that there's absolutely no other solution, at least nothing available to the mass market [iv] - but even with the Interknot cut, the machine will still need some help to remain secure. Let's look into all the details of that.

I. Hardware.

You want all new hardware, don't reuse stuff that was already online. [v]

You want old processors, and to a lesser but present degree old motherboards. Don't use stuff made after about 2006 or so. [vi]

If you are able to brew your own set of hardware, such as for instance out of ARM processors, or FPGA boards which you program yourself, or soldered together Z80s or whatever else, it's a great idea to do so. Just as long as you actually know what you're doing, this sort of arrangement increases the costs of attacking your setup astronomically. It's not likely practical for most people, but if it's practical for you then by all means - and in any case the attempt will likely leave you a lot wiser.

A laptop is better than a desktop, because battery operation increases your resilience to particular surveillance attacks. [viii]

Ideally you want shielded displays. One way to go about this is correctly caging the monitor. [ix] Another way to go about this is buzzing the monitor. [x] The more common way is distance : if your battlestation is in the middle of an otherwise deserted acre of land it's somewhat unlikely that your waste EM radiation is being captured by an attacker.
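
As a back-of-the-envelope illustration of how much the distance buys you - assuming simple free-space inverse-square falloff, which real walls and terrain only add to, so this is a conservative floor:

```python
# Sketch: relative attenuation of an eavesdropper's received signal with
# distance, under the free-space inverse-square assumption.
import math

def attenuation_db(d_near_m: float, d_far_m: float) -> float:
    """How many dB weaker the signal is at d_far_m than at d_near_m:
    power falls as 1/d^2, hence 20*log10(d_far/d_near)."""
    return 20 * math.log10(d_far_m / d_near_m)

# An acre is about 4047 m^2; if square, its center sits ~32 m from the edge,
# so someone at the fence line gets roughly 30 dB (a factor of ~1000 in
# power) less than someone standing a metre from the monitor.
edge = math.sqrt(4047) / 2
print(f"{attenuation_db(1.0, edge):.1f} dB")
```

Thirty dB of free attenuation is exactly why the deserted acre is the cheap man's Faraday cage.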

You don't want any extra hardware, such as for instance wireless adaptors or Bluetooth/infrared devices. Physically destroy them if present, preferably by dismounting their controller chip off the board.

You must have good quality hardware entropy sources. Avalanche diodes marginally make that cut, but Geiger counters are the gold standard.
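
As an illustration of what "making the cut" involves : raw bits off an avalanche diode or Geiger tube arrive biased, and want whitening before use. The classic Von Neumann debiasing step, sketched here on a made-up bit stream:

```python
# Sketch: Von Neumann debiasing. Given a stream of independent but possibly
# biased bits, consume them in pairs: 01 -> emit 0, 10 -> emit 1, and
# discard 00/11. The output is unbiased regardless of the input bias.
def von_neumann(bits):
    it = iter(bits)
    for a, b in zip(it, it):
        if a != b:
            yield a

biased = [1, 1, 0, 1, 1, 0, 0, 0, 1, 0]  # pairs: (1,1)(0,1)(1,0)(0,0)(1,0)
print(list(von_neumann(biased)))          # [0, 1, 1]
```

The price is throughput - a heavily biased source discards most of its pairs - which is one reason the diode only "marginally" makes the cut.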

As you can see, some of these are difficult and some even contradictory. Such is life. In practical terms, the serviceable, mass market device would be an old laptop bought off of some anonymous marketplace (Craigslist, whatever), with its various wired, wireless and infrared connectors permanently disabled ; whereas the upscale device would be some FPGA concoction with massive shielding and retina scanners because why not.

It is unlikely that in any practical application the cost differential between the two is justified [xi], much like the cost differential between a Honda Civic and a Ferrari roadster is not justified in the common usecase - you're going to be waiting for the same lights and in the same traffic jams regardless.

II. Software.

You will download a standard, well known and amply maintained package, [xiii] in that form intended for non-Internet installation on a reasonably secure machine, and proceed to burn it on a CD / move it to a stick after you've verified its checksums. You add whatever packages you conceivably need and want. [xiv]
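
A minimal sketch of that verification step - the filename and digest here are placeholders, not from any actual release; the real digest comes from the distro's signed checksums file:

```python
# Sketch: verify a downloaded image against its published SHA-256 digest
# before burning it. Streams the file so multi-GB ISOs hash in constant memory.
import hashlib

def sha256_of(path: str, bufsize: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(bufsize):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str, published_hex: str) -> bool:
    """Compare against the digest from the (signed) checksums file."""
    return sha256_of(path) == published_hex.strip().lower()

# Hypothetical usage:
#   if not verify("debian-netinst.iso", digest_from_signed_file):
#       raise SystemExit("checksum mismatch - do not burn this image")
```

Do this on the downloading machine, before the bits ever cross the gap.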

You will install these packages on your airgapped machine, low level formatting all disks in the process. You will immediately proceed to permanently remove all those packages you don't want. Such as for instance avahi and other hipsterish pieces of shit.

You're not advised to use or rely on "full disk encryption". It is immature technology that as of yet does not work. [xv] Instead, use gpg on a per-file basis.

This is it. Really, it's very easy to do. Learn to awk and learn to vi already.

III. Communication.

The correct way to use your airgapped machine is without electronics. Take the time to type whatever you want into it - if it's not worth typing it's not worth considering, seriously now.

Alternatively, you can use optical means. One way would be to attach a scanner to your airgapped machine, and read printed sheets of paper into it. Damned hard to hijack anything through a black-ink based buffer overflow, you know ? Another way would be to have a QR code reader attached to your airgapped machine, and to read whatever you need into it in the shape of QR codes. This works well if you mostly need small snippets moved back and forth, such as Bitcoin addresses. Damned hard to hijack anything through the crafty use of QR code dots.
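
To see why QR codes suit exactly these small snippets : per the standard capacity tables (byte mode, error correction level L), a Bitcoin address needs only a tiny symbol. A sketch - the tabulated capacities are the spec's published figures, the address is the well-known genesis one:

```python
# Sketch: smallest QR "version" (symbol size) that holds a given payload.
# Byte-mode capacities at error correction level L, per the QR spec,
# for versions 1-4. A version fixes the symbol's module grid:
# 21x21 for v1, growing by 4 modules per version.
CAPACITY_L_BYTES = {1: 17, 2: 32, 3: 53, 4: 78}

def min_qr_version(payload: bytes) -> int:
    for version, cap in sorted(CAPACITY_L_BYTES.items()):
        if len(payload) <= cap:
            return version
    raise ValueError("payload too large for the versions tabulated here")

address = b"1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa"  # genesis address, 34 bytes
print(min_qr_version(address))  # 3 - a 29x29 module symbol
```

A 29x29 grid of dots is trivially auditable by eye against a reference rendering, which is rather the point of the medium.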

Significantly less secure would be the use of sticks and CDs to ferry information back and forth. This isn't really much better than simply having an Internet connection, but even so non-rewritable CDs are marginally better than sticks for the purpose, as they're (mostly) read-only media.

Also bear in mind that the least secure of these (USB sticks) kept Osama off the radar for a decade, so consider footnote xi again.

IV. Disposal.

Once you're done with your airgapped system, either because you're done or because you have a new one :

You want the hard drive cooked. Solid state will require a serious incinerator [xvi], but a traditional platter system once removed from its protective casing can be safely disposed of by for instance boiling in tomato juice. I know that's how I used to dispose of diskettes twenty years ago, and basically that's what it is, a stack of diskettes.

Replace the missing HDD with a new one, install some basic OS and leave the laptop behind in public, such as in a park. Mission complete & the end.

That'd be about it. Questions are welcome below.

———

If you wish to see my credentials by the way, here's an example of an airgapped box. [↩]

Cassettes, which mostly fed data into Z80s and their clones, required the user to deliberately push a red REC button to allow the system to write out, and as such viruses simply didn't exist in that medium, as they couldn't replicate. Once rewritable media became common, the virus was born.

To fight this exact problem, 5 ¼ diskettes, made out of flexible plastic as they were, included a cutout, which one could cover with purpose-made stickers. If the cutout was covered up the diskette became read-only. 3 ½ diskettes had a movable tab in their rigid body serving the same purpose. The tab was not removed but instead refined and incorporated, because at the time one simply could not have sold user-uncontrollable hardware - a testament to, and an indication of, the much better quality of the userpool at the time.

And speaking of which, Schneier's "the first company to market a USB stick with a light that indicates a write operation — not read or write; I’ve got one of those — wins a prize" is pisswasser. What's needed is the original diskette tab, a switch you throw rendering the stick read-only in hardware. [↩]

Microsoft is of course Microsoft and everyone involved should spend the rest of their life in jail, but even ubuntu, supposedly this heaven of niceness for the masses requires wget. Why would it require such a thing ? Ah, because connecting to the Internet may be useful, for the OS ? Ah, because hardcoding some quick behind the scenes phoning home is a "simpler" solution and "easier" to do ? Ah, because "users" want it anyway ?

Basically our modern security woes all boil down to a very definite issue with the current mentality of computer programmers - that the ulterior users are all whores, looking for nothing else but getting fucked, and certainly "asking for it".

For this reason I am not particularly interested in all the "women in IT" jazz being pushed these days - you may rape Adria Richards to your heart's content for all I care. What actually matters is changing this mentality that regards computer users as a flock of subservient, objectified, mindless women, the stuff of bad 1970s pornofiction.

We can count on scarcely any progress if over the past century overt sexism in the workplace has been "eradicated", only to be replaced with much worse, much more widespread, practically ubiquitous meta-sexism, embedded in the very design of the tools everyone uses all the time. I for one count myself wronged by all this "progress", as I personally wasn't liable to suffer from any sort of gender discrimination in 1913, but I am certainly going to be treated as a sort of Princess Leia in Jabba's hut should I make the mistake of walking into Windowstown. Or Ubuntu bar. [↩]

One of the reasons is that many bits of hardware, such as hard drives or processors, come with identifiers embedded, such as serial numbers etc. There's no benefit for you from these being previously known to the world. Another reason is that inasmuch as you're trying to keep your daughter from getting knocked up, her never going for a nude swim with a bunch of classmates is a better bet than her going for a nude swim with a bunch of classmates. Irrespective of how "They never did anything. Honest!".

More generally, the sexual approach to the topic is quite likely to serve you well. There's this well documented inclination of the human brain to solve problems much better in familiar contexts, so that one is much more likely to spot an arithmetic error in his restaurant tab than he is to spot the same error in some general discussion of astronomy, with no direct impact on his pocket. Consequently, always think of your airgapped machine as this slutty teenager that's constantly trying to get pregnant so she can be a star on Jerry Springer. With the current state of computing this isn't even far off anyway. [↩]

The reason is that we're pretty certain many if not most of the processors made after that date were intentionally manipulated at the behest of the NSA in order to weaken their cryptographic abilities. [↩]

For many different reasons, such as for instance described in IV. Disposal. Also because the wear-levelling optimisation most SSDs natively employ makes it nigh-on impossible to actually delete anything you ever put on them. [↩]

In many cases the power cord acts as an antenna, which to some degree broadcasts descriptions of what's going on inside the machine it feeds. [↩]

A Faraday cage is basically one huge ground. The cheapest way to implement this is making a sheet metal box and hooking it up to the ground (third pin) of a power strip. A more advanced solution uses dedicated, good quality grounding (such as a few square feet of corrosion-resistant steel mesh which you bury in your garden) and a visor (that's to say, an extension of the monitor cage towards you so that you rest your face on it when looking at the monitor). Any way you turn this it's going to be clunky and uncomfortable, but on the other hand both CRT and LCD displays can be trivially read with a spectrum analyzer, even through walls. [↩]

Have you ever noticed the effect of a cell phone call if the cell is right next to the monitor ? Basically to implement a good buzzer you'd have to study your equipment, see what frequencies it lets out, and place around it a few antennas making random noise in the same frequencies. [↩]

It's important to remember that old lion joke :

Some guys are hanging out in the African savanna. Suddenly, there's a lion. Everyone starts running. One guy lagging behind asks the guy leading the pack "Hey, why are you running even ? A lion can run faster than any man.".
"I'm not trying to outrun the lion. I'm trying to outrun you."

This point is non-negotiable. Unless you're writing your own OS, you're using Linux for this task. [↩]

Such as for instance Debian Sarge, or a stable RHEL or CentOS or w/e. Pick something you know well. If you must, one of the Ubuntu LTS versions prior to the insanity - that means 10.04 or so is the last allowable. [↩]

This absolutely includes gpg. This absolutely excludes "office suites" and "pdf readers" - get a converter if you must. [↩]

So apparently whole disk encryption is useless just because someone can plug in a USB stick and patch the bootloader? As in, you're going to keep using your carefully secured machine, whether with GPG or with truecrypt, after some chick plugged stuff in?

In this whole discussion there's a certain key concept missing, or it just gets overlooked - threat assessment. When it's not clear what we want to protect against, logic lapses like the above can take place.

Oh, and where shall I find pre-2006 hardware that was reliably never exposed to the Internet nor to rewritable media?

I. Your notion of "threat assessment" is broken, with the result that it windows-ies the entire thing. It therefore is not missing, it's pointedly absent. Sometimes we prepare for specific threats, sometimes we construct absolute systems. This is the latter case.

II. Full disk encryption does not work, for a whole array of very good reasons, all of them reducing to "full disk encryption as currently formulated is a self-contradictory proposition". Taking a particularly egregious example of the fail that self-contradiction leads to and pretending like it won't happen to you and so everything's fine isn't logically sound, but more of the same "oh, http is stateless, that's ok, we'll add passwords, which won't work so we create half an encrypted system which still doesn't work so we add 2fa which still doesn't work" crud. These are not solutions, these are antiviruses, which is to say ways to spend money so as to groom the problem into giving you further ways to spend more money on it down the road.

III. I submit implementation would be your problem. This is a spec discussion. Go find it.

@анон That's part of it, also particular sorts of CPU RNG "support" being added and so on.

For the problem of SSD destruction, incineration probably would be a reasonable choice. Most electronics recycling seems to inevitably involve burning the electronics in a barrel to separate things metal from things that are not metal. Generally though the proper destruction of a data-device, even something as naturally fragile as a hard drive isn't a trivial problem. Data manages to be recovered from all sorts of trauma inflicted on intact hard drives. Removing the platter to traumatize it seems like the right answer.

I guess I'll probably have to suppose not everyone knows how to make a proper fire in a barrel though, I might have to write the guide. Should go better than that time I tried to write a guide on something for which too many implementations are broken. Fire is a technology that seems to have gone through adequate testing.

Traditionally oxidation is the disposal method of choice, either through very high temperature (such as from a magnesium fire, oxyacetylene etc) or through a very strong BL-acid (for some applications even lower temperatures and weak acids together work wonderfully, hence the tomato sauce).

The tomato sauce idea is a good one. Definitely more accessible than more extreme options like aqua regia. Chemical destruction via oxidation is good, physical destruction by melting is good. Throwing away the device and declaring it destroyed is not good.

I imagine too many people by default are inclined to bad solutions, as people tend to be. Bad solutions include throwing the hard drive into the ocean or a river. Bad solutions also include burying the intact hard drive or encasing it in concrete. An intact hard drive is a durable thing. Saltwater immersion is a small portion of the data recovery business (head crashes are the meat of the business), but it is not exceedingly rare and generally recoverable. Anything that merely touches the PCB is generally recoverable.

When disposing of a storage medium, the critical part is disposing of the place where things are stored. It's easy with a hard drive, more challenging with an SSD, and not verifiably possible with this "cloud" thing which apparently powers all of the VC geeks' and NSA spies' boners.

Yeah I suppose the sad story in here is that if you show supposedly literate teenagers the insides of a USB stick they may be seeing it for the first time, and if you demand they point out the actual memory cores to save their lives they may well not be able to. Such a world we live in, kids used to take cats apart to see what they're made out of.

It's actually very easy to defeat any "evil maid" attack: don't have a bootloader on the computer. Simply have your /boot on a physically secure USB key (you could even use one that has hardware encryption, like the Corsair PadLock). It doesn't prevent the installation of a hardware keylogger, though... unless you use a key and not a password to unlock the drives.

As for my home desktops, I have another trick: they have a webcam, and thus any attempt to alter them physically will get instantly uploaded.

Mircea: Given that I'm mostly protecting myself from the police, no; the point is only to know if the system has been compromised. My cheap webcam doesn't give results good enough anyway, at least not with good lighting.

Anon: I don't understand your point. Someone with another /boot won't be able to do anything, as the full disk is encrypted - the evil maid attack is on the unencrypted /boot, which isn't on the computer. You can't evil maid *my* USB stick.

My cheap webcam doesn’t give results good enough anyway, at least not with good lighting.

Which would render it pointless?

If your camera could signal you that someone is at your desk, what would you do? You'd either need the computer connected to the internet to send it some wipe/block command, or some gprs phone bomb (implying you have data so important you can dispose of a body and stuff).

If I'm saying it's a crappy webcam, it's because it is enough. I just need to be sure no one touched it.

The data is encrypted, remember, it's about the evil maid attack. The only unencrypted part is the bootloader; if I see someone touched my computer, I'll just rewrite it. This leaves, however, the cold boot attack.
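
For what it's worth, the detection half of this scheme - noticing that the unencrypted boot pieces were touched - reduces to comparing hashes against a manifest kept on the trusted key. A sketch, with hypothetical paths:

```python
# Sketch: hash every file under a boot directory and diff against a
# known-good manifest stored on the (physically secured) USB key.
import hashlib
import os

def boot_manifest(root: str) -> dict:
    """Map each file under `root` (relative path) to its SHA-256 hex digest."""
    manifest = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in sorted(files):
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                manifest[os.path.relpath(path, root)] = (
                    hashlib.sha256(f.read()).hexdigest()
                )
    return manifest

def tampered(root: str, known_good: dict) -> set:
    """Paths that changed, appeared, or vanished since the manifest was made."""
    current = boot_manifest(root)
    return {p for p in set(current) | set(known_good)
            if current.get(p) != known_good.get(p)}
```

This only tells you after the fact, of course - it detects the evil maid, it doesn't stop her, and it says nothing about hardware implants.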

If you are able to brew your own set of hardware, such as for instance out of ARM processors, or FPGA boards which you program yourself, or soldered together Z80s or whatever else, it’s a great idea to do so. Just as long as you actually know what you’re doing, this sort of arrangement increases the costs of attacking your setup astronomically. It’s not likely practical for most people, but if it’s practical for you then by all means - and in any case the attempt will likely leave you a lot wiser.

A couple of years ago or so we were having a discussion in which I was denying the fact that every conceivable piece of software could be implemented in hardware (I'm really too lazy to look up the post containing the original discussion). Well, I still don't entirely believe in that, but I'll admit that I was (at least partially) wrong, designing applications directly in hardware is feasible, now more than ever. Anyway, this was offtopic.

So, actually, building an FPGA-based design is ridiculously easy nowadays. One can simply get a cheap board (somewhere between $100 and $500, or around $1000 if they're willing to make a better investment), load an OpenRISC CPU onto it and then run Linux on top of the whole thing. It's probably as safe as you can get, short of attempting to build a hardwired CPU out of the design. I was amazed at the thing myself and am going to try it in the next few months for fun.

By the way, I wouldn't trust ARM designs for the same reasons I don't trust Intel hardware. If the NSA tampered with post-2006 hardware, then they might as well have done it with ARM-based stuff built by Apple, Texas Instruments or whoever (maybe not Samsung, since they're Korean; maybe, but who knows).

Great. Now he just needs to stop issuing prescriptions stored with bar codes/serial numbers in drug store computers, the drug stores need to stop labeling their products with a unique number and the buyer needs to stop using a credit card and a DISCOUNT card for "members".

I think you could argue that you'd want an old computer rather than a new one for a variety of reasons. Even better would be an old computer that has never been internetworked. When dealing with an L3+ adversary (say the NSA) it is not unreasonable to expect that they have subverted hardware to their own ends before it has even left the OEM. Given the close ties between Intel and US intelligence agencies I think it is actually highly likely.

As far as x86 hardware goes - the 80386 is the newest processor that can be implicitly trusted. Newer processors have fab processes that result in chips that can't be examined with optical techniques and have integrated cache and other features that mean that Joe engineer can't verify what is happening inside the "black box". For a variety of these reasons many safety critical systems still use 80386 or earlier hardware (and often have "original stock" from thirty years ago in storage and won't accept clones).

It is interesting to note that some of the most "dangerous" and highly guarded systems on the planet still use 35+ year old hardware, e.g. Soviet ICBM control systems using DEC VAX.

Perhaps the most disgusting thing is that while the USG was whining about Chinese govt-linked companies like Huawei spying, they were doing it themselves, and wholesale.

Of course - we can't do ALL our work on ancient hardware so there is always a security vs productivity tradeoff.

That last part is a crucial aspect : a VAX only works because well... the task isn't really all that complicated. Not to say it's something to be done pen on paper, but it's not molecule folding either.

I'm pretty sure current processors can't be trusted to RNG in the cryptographic sense. Occasionally there are aired fears about more mischievous things such as doctored Ethernet cards (which are dangerous, as on many systems they can read the entire RAM) and could easily start spilling secrets in response to a magic packet. Then again, such a thing'd be somewhat difficult to conceal, and yet it stays the stuff of legend.
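
Hence the standard hedge when a CPU's RNG can't be trusted alone : hash several independent sources together, so that a rigged source would have to predict the honest ones through the hash in order to do any damage. A sketch:

```python
# Sketch: combine independent entropy sources (e.g. CPU RNG output, diode
# samples, keystroke timings) through a hash. The mix is at least as good
# as the best single source, provided the sources are independent.
import hashlib

def mix_entropy(*sources: bytes) -> bytes:
    h = hashlib.sha512()
    for s in sources:
        # Length-prefix each source so concatenation boundaries are
        # unambiguous: mix(b"ab", b"c") must differ from mix(b"a", b"bc").
        h.update(len(s).to_bytes(8, "big"))
        h.update(s)
    return h.digest()  # 64 bytes of mixed output
```

The point being that you never feed the suspect RNG's output directly into key generation - it only ever contributes, never decides.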

. . . kinda like the Ken Thompson compiler attack. We hear about it all the time yet nobody (to my knowledge) has actually managed to usefully put it into practice to compromise real users' software.

This is where subtle and clever attacks excel, ala the Debian OpenSSL screw up. If this happened in a closed source product and it was done maliciously nobody would be any the wiser.

Management BMCs often have access to host resources (e.g. the ability to dump RAM) but at least are configured to bind to a secondary management LAN. The new Intel AMT is perhaps more of a worry, as it is being cooked into personal computers (not server or business grade PCs), often laptops, and it is bound to the primary NIC and is often enabled by default. I remember a few years back - okay, a lot of years back - I got a lot of fun out of sending WoL packets on my college LAN and watched rooms full of PCs fire up without apparent reason. Why? It was enabled by default and required no authentication.

The most subtle of attacks could be something as simple as not disclosing vulnerabilities. I guess Intel's microcode update system could be leveraged too.

Hell, what about Windows Update? Force MS' cooperation through an NSL or secret court order and get their update servers to push a dirty update to their target. I guess software-only hacks are off topic though. Hybrid hacks aren't - like using a patched BIOS to create resilient malware (in a similar method to how Computrace ensured its persistence even after reinstallation of the OS, through an NTFS-aware BIOS shim that patched a file Windows executed during boot).

That was a thing of beauty, I remember reading his paper, feeling sick and not wanting - for the first and last time in my life - to have anything further to do with computers.

I don't think Windows can be a topic of security discussion. All the pretense to the contrary of people paid to put forth pretense to the contrary, Windows isn't part of computer science, Windows is part of computer play. Systems security is counterproductive in computer play, inasmuch as it goes exactly against the very thing that playing is.

Mircea: my sentiments exactly. I was shocked when I read that Schneier was a devout Windows user. I think that anyone with such experience would feel this way, but I guess that he is a cryptographer and is likely trained to focus on the math and vulnerabilities of the cryptosystem and not necessarily the OS on which it runs. His reasoning was somewhat understandable, and can be paraphrased as "I use what I know and what I am comfortable using. If I used something I didn't know in and out then I may be even worse off". This is like saying that because I only know how to drive a small car then I should continue making two trips to carry my big family, because I am not confident enough to drive a larger vehicle even though the larger vehicle is superior for this purpose. I respect Mr Schneier a lot and I don't mean any disrespect to him, but you can't claim to be comfortable with and know the inner workings of a closed source OS. Even if your crypto software is bulletproof you may be subverted from below. Of course you may run an arbitrary "better" OS - say Linux or *BSD - and be subverted "from below" by evil hardware, something which seems increasingly likely given the #badBIOS revelations of late (if they are true). I am trying to gather as much info on this matter on my blog but it is difficult when there is only a single source and he hasn't been very forthcoming offering physical samples of these evil USB sticks to the community for analysis (dragos is now speaking of "auctioning" them off!). If this is legit I can understand why he is being so guarded - discovering malware of this caliber could be your meal ticket for your entire career - fully paid appearances at sec conferences, doing the speaking circuit, etc. He may not want someone else to steal his thunder. We all need to make money, that's understandable, but this shouldn't stifle collaborative research.
If I could get a sample of an evil USB I would be more than happy to share attributions with the guy on any research produced, for example, and I think others would be of the same opinion. We don't like people who steal in this community - and taking someone's preliminary findings, elucidating further and releasing the results as original work without acknowledging the work of those before you is theft of intellectual property.

Re Thompson - yes it raises fundamental questions that have been pushed under the rug for thirty odd years by the community. You could argue this is because it is impossible to solve, but I'd rather say that solutions are difficult and it is cheaper to ignore it. If we start seeing bad compilers or bad hardware in the wild (and it seemed what started with Russians reflashing thumb drives to spoof larger storage space to sell on eBay has now morphed into thumb drives with controllers reflashed to 'sploit the machine at the BIOS level) then this will change.

Well yeah, it's comfort by belief rather than comfort by knowledge, which belies very dangerous mental processes. The man who is comfortable drinking out of lead cups on the grounds that "it's what he's been doing forever and what he's comfortable with [ie, used to] whereas using proper earthenware or glassware or ANYTHING ELSE might perhaps give him nervous indigestion" is not by this comfort of his any healthier. For that matter, he may make a fine deacon, but he'll never make a half decent judge.

I doubt the evil sticks thing. For one, making sticks isn't the hardest thing in IT, you'll notice NSA is starting with that as a first project. For the other, who is this guy and why would I trust him ?

I think the reason Thompson's evil compiler is ignored is exactly the same psychological reaction I describe for myself. This is the equivalent of cp in IT, it's c[ompiler] porn. People just don't take well to it, can't think of it without an urge to throw up and so they just...can't. Any scientific approach requires some bedrock of absolute, what Kuhn calls a paradigm. The compiler has been part of the IT paradigm for half a century, and it can't be questioned (yet).

I like your analogy. People don't discuss it because it is a social taboo, and because of that very reluctance to speak of it some very evil people get "cover" by that unwillingness to talk. Even worse, the over-sensitization that's caused by the taboo results in completely innocent people being labelled as deviates and potentially even imprisoned. I note that there is only one word for such an individual irrespective of whether the other party is a true child or for all intents and purposes an adult, nor is mental age or consent evaluated. One can obviously make the argument that someone who rapes a very young kid is a deviate of the highest order - but what about someone who had consensual sex with a girl who was just days away from being of legal age? The legal reality is both are treated the same, labelled as the same type of evil, when the former is an obvious danger to society and the latter is guilty on "paper" but realistically hasn't hurt anyone. And what about the voyeurs that download such material? They are routinely rounded up and imprisoned not for participating in any heinous act but for the mere act of possessing digital images. Such a thing is distasteful, perhaps just as or more distasteful than other "shock" image sites of mangled corpses and other misadventures which are popular with, uh, certain individuals. Sure, all of these people obviously have a problem if they find any of the above titillating, but what about the curious who look up, say, pictures of a train wreck out of morbid curiosity? Are they just as deranged as someone who trolls the internet for such images 24/7, or is this morbid curiosity a part of the human condition?

Perhaps the most concerning thing about having essentially "banned" digital data is that something which doesn't exist in a material sense can get your liberty taken away. A blackhat could use a RAT to run a bittorrent server on your machine and serve up such materials, knowing full well that eventually law enforcement will come to your door and drag you away. We heard about a few guys attempting to buy drugs on Silk Road to frame Krebs - but what about this idea? What of cache and proxy operators who may have no knowledge of what traverses their networks or is stored in their cache?

I often consider that the damage done by making a subject "too controversial to discuss" is far greater than any embarrassment or social upheaval a frank discussion of the issues might bring.

But I am way off topic. Regarding this guy and his claims of an evil USB thumb drive - I agree that such grandiose claims need proof otherwise he is just another crackpot. He does have some credibility having organized Pwn2own but that's not to say he is infallible and his analysis is correct. Given he appears unwilling to distribute said infected media to others for confirmation it does look very fishy. I guess we will just have to wait.

I do wonder why - just like with the Thompson compiler attack - humans seem to love to just pretend that something difficult to talk about and hard to solve isn't an issue. It reminds me of how the banks tried to deny that CC card number generators were affecting their bottom line in the 80s, or ATM card skimmers in the 90s... non-bank lenders in the 2000s, and now in this decade cryptocurrencies posing a threat to their very existence (and phishing scams and ID theft causing massive issues when they continue to send statements in cleartext emails).

[...] by post by Bruce Schneier and Mircea Popescu on things that are not air gapping and things that are air gapping. In Mircea's article he mentions his method of oxidizing hard disk platters by boiling them [...]

[...] any keys generate a new one, but if it finds problems with a lot of your keys probably get a new3 machine and definitely set it up with some *nix or a different *nix system if you started this [...]

[...] form of this separation involves composing and generating ciphertext on dedicated hardware like an airgapped machine or Cardano and then passing the ciphertext to an online machine for transmission. The popular [...]
