Chances are the person who comes into possession of a stolen laptop or desktop won't know what OpenBSD is at all ... but perhaps they'll be motivated to find out. Encrypted /home, /tmp, and swap should go a long way toward safeguarding the data.
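For anyone wanting to try it, a softraid(4) crypto volume is one way OpenBSD does this. A rough sketch only -- the device names and partition letters here are assumptions for the example, so check disklabel(8) on your own machine:

```shell
# Sketch, not a recipe: sd0d and the resulting sd2 are assumed names.
# Create an encrypted softraid(4) volume on partition sd0d; bioctl
# prompts for the passphrase and prints the new pseudo-device:
bioctl -c C -l /dev/sd0d softraid0
# Suppose it attached as sd2; label it, make a filesystem, and mount
# it where the sensitive data lives (e.g. /home):
newfs /dev/rsd2a
mount /dev/sd2a /home
```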

The chances of losing a computer are small but significant. The chances that whoever ends up with it has the skill to crack encrypted data (and on something other than Windows, no less) are an order of magnitude smaller still.

I've seen too much hardware go missing to think encryption is anything but necessary.

While other sites allow threads to meander in whatever direction anyone wants to take them, this site has traditionally frowned on the practice, known as hijacking a discussion. We politely ask members to respect this because new respondents, having different perspectives, questions, and needs, frequently take a discussion in a different direction than the one presented by the original poster. Keeping threads on topic provides continuity, helps when searching is done after the fact (and there is a lot of searching done on this site...), and allows the original poster to continue the discussion in whatever direction they originally intended.

Data in motion: data in use -- in RAM, on a processor, being sent over a network, etc.

Data encryption: obfuscation of data to make it meaningless to anyone but the originator and a recipient with the proper decryption tools and keys to remove the obfuscation.

Keys: Used in combination with encryption / decryption tools to obfuscate or make the data clear. Keys vary in capability, usefulness, complexity of use, ad infinitum, ranging from simple passwords to complex key systems and certification technologies.
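To make the key/encryption relationship above concrete, here's a toy Python sketch. It's a homemade XOR stream construction -- emphatically not a vetted cipher, just an illustration of how possession of the key turns the obfuscation on and off:

```python
import hashlib
import os

def keystream(key: bytes, n: int) -> bytes:
    # Derive n pseudo-random bytes from the key (toy construction,
    # NOT a real cipher -- illustration only).
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the data with the keystream to obfuscate it.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse, so the same key reverses it

key = os.urandom(32)           # the "key" from the definition above
ct = encrypt(key, b"patient records")
assert ct != b"patient records"            # meaningless without the key
assert decrypt(key, ct) == b"patient records"  # clear again with it
```

Without `key`, the ciphertext is noise; with it, the data is recovered exactly -- which is why key management, not the math, is usually where things go wrong.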

Authentication: techniques used to identify users of data-at-rest, or senders and recipients of data-in-motion.

Authorization: techniques used to permit access to data, or to encryption/decryption keys.

Key Management: the decisions made about who or what stores or has access to keys, who can change keys, and all of the varied authentication systems for those keys, from personal possession or knowledge to certificate authorities and "trust" management. (In my opinion, this is often the most complicated and difficult-to-understand part of any encryption/decryption system. And I further believe it is often the weakest link in any encryption/decryption decision.)

------

Based on those simple definitions, it should be clear that encrypted data at rest is not protected unless an authentication and authorization regimen is also instituted. One has nothing to do with the other.

------

There are times when data-in-motion is unencrypted: while it is being processed by an application, displayed, or retransmitted. Data architects often fail to consider this.

One example that comes to mind is the recent very vocal and noisy NFSv4 thread on misc@ that began here: http://marc.info/?l=openbsd-misc&m=128818996830209&w=2 -- one of the thread posters who kept it going was happy with his NFSv4 implementation because he thought his strong authentication technology secured his organization's medical data. He not only confused authentication with encryption, eventually he mentioned that he'd combined NFSv4 with CIFS retransmissions -- disclosing to the knowledgeable that his implementation was also retransmitting unencrypted data, or data-in-the-clear, to Windows workstations on his open networks.

That actually poses something I've often wondered: is there any operating system that can encrypt the contents of RAM, and decrypt on access? I bet the performance would blow hard, but as a proof of concept it would be interesting!

To my knowledge, once you turn off the power, whatever was in a PC's RAM fades away a short time later -- and not everything in the world is a PC, obviously. But during runtime, the only serious protection is trust in your operating system's correctness, and that operating system isn't always OpenBSD.

All program code executes in RAM, obviously, so encrypting memory means the on-the-fly decryption program itself would have to sit in memory unencrypted -- and it would hold the information needed to decrypt the rest of memory. DMA buffers for devices would also have to be unencrypted.

On OpenBSD, at least, the swap partition is encrypted; pages are decrypted only when a process is paged back into RAM.
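For those who want to check this on their own box, the knob is a sysctl. This is from memory, so verify against your release's manual pages:

```shell
# Query the current setting (1 = pages written to swap are encrypted
# with a key that never leaves the kernel and is lost at power-off):
sysctl vm.swapencrypt.enable
# It has been on by default for years; to set it explicitly, put this
# line in /etc/sysctl.conf:
vm.swapencrypt.enable=1
```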

Really though, physical security means putting a lock on your door. All bets are off if someone breaks into your premises and steals your machine.


This thread reminds me of a math professor I had in college. He was a bit of a security / paranoia nut. He logged into his computer using a barcode he had randomly generated. He would scan the barcode and it would represent his password.
He did this, as it was described to me, in the event someone of authority ever visited his home. He could destroy the barcode and no longer be able to log in to his computer, explaining truthfully, "I do not know the password; it was randomly generated."
For the worst-case scenario, his computer case also had a magnet on a platform inside. He could pull this platform out of the computer from the outside, and the magnet would fall on top of his hard drive, supposedly 'wiping' the contents of the drive.

Ultimately security, as a whole, is an idea we create for ourselves. At least it was for my math professor.

Quote:

That actually poses something I've often wondered: is there any operating system that can encrypt the contents of RAM, and decrypt on access? I bet the performance would blow hard, but as a proof of concept it would be interesting!

Gödel's incompleteness theorem comes into play here. You have to step up and "outside" the system to achieve the goal; otherwise you run into the problem that BSDfan666 outlined (answering the question from within the system). The encryption would have to be handled in hardware, and it would have to have some way to randomize the key at each boot. It could basically be implemented as an extension to whatever ISA you target. It would introduce significant overhead, though, so I would imagine that an industry that prides itself on performance would not be terribly keen on a feature that detracts from performance while providing only minimal security (i.e. you can't freeze the RAM and analyze it later, but if you could determine the key during runtime you could still decrypt the contents of RAM).
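The idea can at least be modeled in software. Here is a toy Python model -- not a real memory controller, and the class and method names are mine -- where every store encrypts under a per-"boot" random key and every load transparently decrypts, which also makes the per-access overhead obvious:

```python
import hashlib
import os

class EncryptedRAM:
    """Toy model of hardware memory encryption: every store encrypts,
    every load decrypts. Illustration only, not a real controller."""

    def __init__(self):
        self.key = os.urandom(32)   # fresh random key each "boot"
        self.cells = {}             # address -> ciphertext

    def _mask(self, addr: int) -> bytes:
        # Per-address keystream (toy; limits each cell to 32 bytes)
        return hashlib.sha256(self.key + addr.to_bytes(8, "big")).digest()

    def store(self, addr: int, data: bytes):
        mask = self._mask(addr)
        self.cells[addr] = bytes(a ^ b for a, b in zip(data, mask))

    def load(self, addr: int) -> bytes:
        mask = self._mask(addr)
        return bytes(a ^ b for a, b in zip(self.cells[addr], mask))

ram = EncryptedRAM()
ram.store(0x1000, b"secret")
assert ram.cells[0x1000] != b"secret"   # raw cell holds ciphertext
assert ram.load(0x1000) == b"secret"    # transparent on access
```

Because the key dies with the power, a cold-boot attacker who images the chips sees only ciphertext -- but, as noted above, anyone who can read the key at runtime gets everything anyway.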

This just in from Bruce Schneier's blog, regarding a talk by Whitfield Diffie at a recent ACCU conference. Highlights mine.

Quote:

The third watershed is cloud computing, or whatever you want to call the general trend of outsourcing computation. Google is a good example. Every organization uses Google search all the time, which probably makes it the most valuable intelligence stream on the planet. How can you protect yourself? You can't, just as you can't whenever you hand over your data for storage or processing -- you just have to trust your outsourcer. There are two solutions. The first is legal: an enforceable contract that protects you and your data. The second is technical, but mostly theoretical: homomorphic encryption that allows you to outsource computation of data without having to trust that outsourcer.

Diffie's final point is that we're entering an era of unprecedented surveillance possibilities. It doesn't matter if people encrypt their communications, or if they encrypt their data in storage. As long as they have to give their data to other people for processing, it will be possible to eavesdrop on. Of course the methods will change, but the result will be an enormous trove of information about everybody.