
Random Is Faster, More Randomness In Linux 3.13

11-17-2013, 01:30 AM

Phoronix: Random Is Faster, More Randomness In Linux 3.13

The /dev/random changes have gone in for the Linux 3.13 kernel, and the pull request makes for interesting reading ahead of this very promising release. While not in Linux 3.13, it's mentioned that the Linux kernel might also end up taking a security feature from the FreeBSD playbook...

The Android flaw was bad for Bitcoin wallets: the bug caused the "randomness" on Android devices (Android is based on the Linux kernel!) to be predictable. As a result, an attacker could "randomly" generate the exact same private keys that somebody else had "randomly" generated on their Android device.
This randomness and entropy improvement to Linux is of course good, but Google is too stupid to care right now, so don't generate your private keys on Android; generate them on Linux or *BSD, then import them to your Android device. Android is a sinking ship...

Comment

The Android flaw was bad for Bitcoin wallets as this bug caused the "randomness" in Android devices to be predictable. [...] Android is a sinking ship...

sorry, but you obviously have no idea what you're talking about.

There are at least three layers of indirection between /dev/random and the Bitcoin application. Just because there is a bug in Dalvik doesn't make /dev/random faulty. As much as it can interfere with a good rant, taking off the fanboy goggles is good for your health.

Comment

While I never understood how people can find predictability in /dev/random on any system, I also don't understand why people who care about pure random numbers don't make a USB device that actually generates purely random numbers. I remember hearing that it is possible to get 100% random numbers (from a digital perspective) using a very tiny amount of an element like americium, with sensors that read the gamma rays it emits. While nothing in physics is 100% unpredictable, these gamma rays are generated at the atomic level, which is so hard to measure that you might as well call it perfectly random. I figure the only problem with this type of device is that it's likely affected by temperature: you still won't be able to predict the exact number it generates, but you can at least figure out the range it would be in. For example, if it's 20C in the room you might get a number from 10000 to 50000, but if it's 30C you might get a number from 20000 to 80000. I could be wrong though.
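The decay-based approach described here is real: HotBits-style hardware compares the lengths of successive inter-decay intervals and emits one bit per comparison. A toy sketch, simulating the detector with a Poisson process (the `decay_bits` function and its parameters are illustrative, not any real device's API):

```python
import random

def decay_bits(n, rate=1.0, seed=42):
    """Simulate a HotBits-style extractor. Radioactive decay is a
    Poisson process, so inter-decay intervals are exponentially
    distributed; comparing pairs of intervals and emitting 1 when the
    first is longer gives unbiased bits regardless of the decay rate
    (and hence regardless of temperature-driven rate drift)."""
    rng = random.Random(seed)  # stand-in for the physical detector
    bits = []
    while len(bits) < n:
        t1 = rng.expovariate(rate)
        t2 = rng.expovariate(rate)
        if t1 == t2:  # discard ties, von Neumann style
            continue
        bits.append(1 if t1 > t2 else 0)
    return bits

print(sum(decay_bits(10000)))  # roughly half ones
```

Note the interval *comparison* is what makes the extractor self-correcting: a temperature-induced change in the decay detection rate shifts both intervals the same way, so the output bias stays near zero even if the raw rate drifts.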

Comment

While I never understood how people can find predictability in /dev/random on any system

People can find predictability with /dev/urandom, not with /dev/random. But that's kind of by design... /dev/urandom was created so that reads from it would never block even when the system hasn't generated enough entropy. Most apps don't need cryptographically strong randomness, and those that do should be using /dev/random.
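The never-blocks behavior is easy to observe from userspace. A minimal Linux-only sketch (assumes the standard device nodes exist):

```python
import os

# /dev/urandom never blocks: this read returns immediately, even on a
# freshly booted system whose pool holds little accumulated entropy.
with open("/dev/urandom", "rb") as f:
    data = f.read(32)
print(len(data))  # 32

# A read from /dev/random, by contrast, can stall until the kernel's
# entropy estimate covers the request -- which is why it is reserved
# for keys and other long-lived secrets.
```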

Comment

While I never understood how people can find predictability in /dev/random on any system,

They find patterns in the entropy source that weren't previously identified. It's not that you can exactly predict what comes out; you can just characterize the limited entropy in the pool accurately enough to enumerate the possible outcomes.


I also don't understand how people who care about pure random numbers

Which should be everyone who uses the internet - the security of things like TLS relies on it. Unless you like just handing out login credentials to everyone between you and the site, or your credit card info.

Easier said than done. "Randomness" is something that is easy to disprove once you know what's wrong with it, but essentially impossible to really prove. The problem isn't a good source of randomness -- the existing I/O devices we have are good enough on that front -- the problem is being able to distinguish which of the numbers you're getting are and aren't random. How does the linux kernel know whether 10101010 was a byte of patterned data or random data? By itself, it's impossible to tell, since all random data consists of equally likely bit strings, and any finite numerical sequence, including bit sequences, can be represented by a function (and in fact infinitely many functions). The only thing it can do is run statistical tests on the sequences it's getting, which can assign probabilities to how likely it is that the given sequence is random.
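The statistical tests mentioned here are things like the NIST SP 800-22 monobit (frequency) test: you can never prove a sequence random, only check that a statistic of it isn't wildly improbable under the fair-coin hypothesis. A minimal sketch:

```python
import math

def monobit_pvalue(bits):
    """NIST SP 800-22 frequency (monobit) test. Under the hypothesis
    that the bits are i.i.d. fair coin flips, the sum of +1/-1 values
    scaled by sqrt(n) is approximately standard normal; return the
    two-sided p-value. A small p-value suggests non-randomness."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2 * n))

# A patterned byte like 10101010 sails through this test -- exactly
# the point made above: from the bits alone you cannot tell.
print(monobit_pvalue([1, 0, 1, 0, 1, 0, 1, 0]))  # 1.0 (perfectly balanced)

# A long all-ones run fails badly:
print(monobit_pvalue([1] * 1000) < 0.01)  # True
```

Real test suites run a battery of such tests (runs, serial correlation, compression) precisely because any single statistic, like this one, is blind to whole classes of patterns.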

I remember hearing about how it is possible to get pure 100% random number (in a digital perspective) using a very tiny amount of an element like Americium and have some sensors that read the gamma rays that are emitted.

Possible, yes, but not that marketable, especially given that, again, current hardware sources generate enough randomness (the milliseconds between keystrokes when you type, for example, are usually good enough when you get enough of them). Again though, there are a million things that could go wrong with your "pure" entropy source: faulty hardware, electromagnetic interference in the line, chain reactions that skew the weighting, etc., such that the raw 1s and 0s the kernel gets aren't random enough for, say, a good GPG key. Though that sort of device would almost certainly alleviate most of the problems that people complain about with current /dev/random implementations today (though not all, such as insufficient entropy at boot time, since it takes a while to gather entropy).
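Harvesting keystroke timing works roughly like this toy sketch: hash event timestamps into a rolling state. (Everything here is simplified and illustrative; the real kernel mixes into a much larger pool with an LFSR-style mixing function and separately *credits* only a few bits of entropy per event based on how unpredictable its timing was.)

```python
import hashlib
import time

class ToyPool:
    """Toy entropy pool: fold event timestamps into a rolling SHA-256
    state, then expand that state on demand. Not the kernel's design,
    just the shape of the idea."""

    def __init__(self):
        self.state = b"\x00" * 32

    def mix(self, event_time_ns):
        # Fold the timestamp into the state; only the unpredictable
        # low-order jitter actually contributes entropy.
        self.state = hashlib.sha256(
            self.state + event_time_ns.to_bytes(8, "little")
        ).digest()

    def extract(self, nbytes):
        # Expand the current state with a counter; output entropy is
        # capped by whatever entropy was mixed in, never increased.
        out = b""
        counter = 0
        while len(out) < nbytes:
            out += hashlib.sha256(
                self.state + counter.to_bytes(4, "little")
            ).digest()
            counter += 1
        return out[:nbytes]

pool = ToyPool()
for _ in range(100):  # stand-in for 100 keystroke interrupts
    pool.mix(time.perf_counter_ns())
print(len(pool.extract(16)))  # 16
```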

While nothing in physics is 100% unpredictable, these gamma rays are generated at the atomic level, which is so hard to measure that you might as well call it perfectly random.

No, in this case you're using quantum mechanics, which means the process itself is perfectly random. It's other factors you'd have to worry about.

I figure the only problem with this type of device is it's likely affected by temperature. You still won't be able to predict the exact number it generates but you can at least figure out the range it would be in. So for example, if it's 20C in the room, you might get a number from 10000 to 50000 but if it's 30C you might get a number from 20000 to 80000. I could be wrong though.

Comment

People can find predictability with /dev/urandom, not with /dev/random.

No, they find predictability with both.

But that's kind of by design... /dev/urandom was created so that reads from it would never block even when the system hasn't generated enough entropy.

Sort of correct, but /dev/urandom still runs the existing entropy pool through PRNGs to extend it (/dev/random uses PRNGs too, but not to extend entropy, only to sanitize it). That means it's not predictable "by design," and really you should never be able to tell the difference between the output of /dev/random and /dev/urandom; it's just that /dev/urandom will generate pseudo-random numbers from entropy amounts arbitrarily smaller than the desired output. For example, a machine that is somehow really broken and just feeds 0s to the entropy pool will have the same amount of entropy coming out of /dev/urandom at boot as for the rest of the session. But if you just use up all your entropy on a normal machine that has had some at some point, /dev/urandom would be a far more secure source than the broken machine.
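The "extend through a PRNG" point can be sketched as a toy hash-based DRBG: seed once with whatever entropy exists, then deterministically expand it, so total unpredictability is capped by the seed no matter how much output you draw. (Illustrative only; the kernel of this era used SHA-1 over its pools, not this construction.)

```python
import hashlib

def drbg(seed: bytes, nbytes: int) -> bytes:
    """Toy hash-DRBG: however many output bytes you request, the
    total unpredictability is capped by the entropy of `seed`. The
    broken all-zeros machine gets fully predictable output; a machine
    seeded well even once gets output an attacker cannot distinguish
    from random (assuming SHA-256 behaves like a random oracle)."""
    out = b""
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:nbytes]

weak = drbg(b"\x00" * 32, 64)       # predictable: anyone can recompute it
strong = drbg(b"unguessable-seed", 64)
print(weak != strong, len(weak))    # True 64
```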

Most apps don't need cryptographically strong randomness, and those that do should be using /dev/random.

"Should" being the operative word here. I can't recall where I saw the statistic, but a depressing number of cryptographic Android services were found to use /dev/urandom to improve performance.

Comment

No, in this case you're using quantum mechanics, which means the process itself is perfectly random.

This very much depends on your interpretation of quantum mechanics. Even interpretations which include collapsing wavefunctions do not necessarily include randomness/nondeterminism. (Also the wave function can be restored if you forget what you measured according to some interpretations.)