I have recently begun studying crypto. If there's one thing I have learned, it's that we should not implement our own crypto. Therefore we should look to existing software and libraries.

When I go to implement something that needs data security, there are many options -- openssl, mcrypt, truecrypt, crypt, pycrypto, bouncycastle... how do I know what is "tried and true" with regard to crypto community scrutiny?

This probably depends on the application, your implementation language, and the data under encryption. What would be generic answers for common scenarios such as basic single-file encryption, full disk encryption, and encrypted network communication?

I could audit the source code myself, but I would be naive to claim that I could catch even the obvious vulnerabilities.

If TrueCrypt, a high-profile piece of crypto software, is facing widespread calls for a public audit, then what crypto software IS sufficiently audited? It seems to me that if any crypto software should inspire high confidence, with many eyes scrutinizing it, it would be TrueCrypt or OpenSSL.

And of course, the usual disclaimer: I understand that we can't know all vulnerabilities and bugs ahead of time, and nothing can be infinitely secure. That leaves us with, at best, a high-confidence or low-confidence assessment of worth.

Is there a list of high-confidence crypto libraries/software? Maybe confidence isn't well-defined. Is there a list of time-tested crypto libraries/software?



This question is likely to be closed because it's primarily opinion based, and too subjective to answer. However, I'm interested to hear the answers nonetheless.
– hunter, Oct 25 '13 at 12:25

2 Answers

The question is subjective in nature, and so is this response. It was too long to leave as an actual comment, so I'm posting it as an answer, though it's really commentary. This is for posterity, I guess; this thread already ranks high in Google searches.

NaCl is probably the most widely respected library. It's authored by someone (Daniel J. Bernstein) with an established track record writing secure software and who understands a lot of practical and theoretical subtle security issues. It's also aimed at high-level use and encapsulates details when possible, as opposed to the more common library approach of just providing an open-ended generic API for a lot of cryptographic primitives. That makes NaCl harder to misuse, which is important since some vulnerabilities are simple innocent misuses of libraries. Few other libraries can make similar claims.
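To make the misuse point concrete, here is a standard-library Python sketch (not NaCl itself; the key and messages are made up) of a detail a high-level API takes out of the caller's hands: verifying a MAC with ordinary `==`, which short-circuits and can leak timing, versus the constant-time `hmac.compare_digest`:

```python
import hashlib
import hmac

KEY = b"example-key"  # illustration only; real keys come from a CSPRNG

def sign(msg: bytes) -> bytes:
    return hmac.new(KEY, msg, hashlib.sha256).digest()

def verify_naive(msg: bytes, tag: bytes) -> bool:
    # Misuse: '==' bails out at the first mismatching byte, so the
    # response time can leak how much of a forged tag was correct.
    return sign(msg) == tag

def verify_safe(msg: bytes, tag: bytes) -> bool:
    # Constant-time comparison; the sort of detail NaCl-style APIs
    # handle internally so callers cannot get it wrong.
    return hmac.compare_digest(sign(msg), tag)

msg = b"transfer $10 to alice"
tag = sign(msg)
assert verify_safe(msg, tag)
assert not verify_safe(msg, b"\x00" * 32)
```

Both functions return the same booleans; the difference is only visible to an attacker measuring response times, which is exactly why an encapsulated API is safer than an open-ended one.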

I think good audits are hard to come by. My general impression is that very few people are qualified to perform a good audit, and many of those who are qualified can only evaluate a few areas. What's more, the well-qualified are in high demand elsewhere and probably have little interest in taking the initiative on an auditing project. Auditing isn't lucrative, because a good audit:

takes a long time

is unlikely to pay much (probably nothing unless bug bounties or grants are involved)

is unlikely to bring fame to the auditor unless they find something huge

is potentially a liability to the auditor if they miss something that is discovered later

So I don't think there's much incentive to drive high-quality audits.

Popular libraries and applications have had bugs sit unnoticed for a long time, which I think is evidence that audits are rare. History shows examples of bugs persisting for years, sometimes even when we know of them.

The Debian weak keys bug existed for over a year and a half before it was caught. The developer who made the mistake had even explicitly described his change on a public mailing list and asked if the change was secure (the question went unanswered). Let that sink in: A popular open-source library was used for 21 months with a somewhat obvious fatal mistake that was described on a public mailing list.
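The failure mode is easy to reproduce in miniature. A hedged sketch in Python (toy key derivation, not the actual OpenSSL code path): if the PRNG's only remaining entropy is the process ID, an attacker can simply regenerate every possible key:

```python
import hashlib
import random

def weak_keygen(pid: int) -> str:
    # Toy model of the Debian bug: the PRNG is seeded solely by the
    # process ID, which on Linux at the time had at most 32768 values.
    rng = random.Random(pid)
    return hashlib.sha256(rng.getrandbits(128).to_bytes(16, "big")).hexdigest()

victim_key = weak_keygen(12345)

# The attacker enumerates every possible PID and rebuilds each key.
candidates = {weak_keygen(pid) for pid in range(32768)}
assert victim_key in candidates  # the whole keyspace fits in one loop
```

A keyspace of 32768 values falls to a desktop machine in seconds, which is why every key generated during those 21 months had to be revoked.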

Look at the list of OpenSSL vulnerabilities (you can filter the more important ones) and use the release dates to match the earliest fixed version against the oldest known vulnerable version. The percentage that existed for 2 or more years before being discovered and fixed is impressive; several lived for over 4 years. Bugs are hard to find and often long-lived.
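The bookkeeping for such a survey is mechanical. A sketch with placeholder data (the CVE IDs and dates below are invented for illustration, not real OpenSSL advisories):

```python
from datetime import date

# Hypothetical (introduced, fixed) date pairs; substitute real advisory data.
vulns = {
    "CVE-XXXX-0001": (date(2006, 9, 1), date(2011, 2, 1)),
    "CVE-XXXX-0002": (date(2010, 3, 1), date(2010, 11, 1)),
    "CVE-XXXX-0003": (date(2007, 1, 1), date(2009, 6, 1)),
}

def lifetime_days(introduced: date, fixed: date) -> int:
    return (fixed - introduced).days

# Share of bugs that lived two years or more before the fix shipped.
long_lived = [cve for cve, (i, f) in vulns.items() if lifetime_days(i, f) >= 730]
print(f"{len(long_lived)}/{len(vulns)} lived 2+ years")
```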

This isn't library-specific, but I think the lesson holds: The BEAST attack against TLS 1.0 exploited a theoretical vulnerability that had been published 9 years earlier (and fixed in the practically unused TLS 1.1). Even when we know of vulnerabilities, we sometimes don't fix them if it's inconvenient and the attack doesn't seem practical.
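The core of the attack fits in a few lines. A hedged sketch: a keyed deterministic function stands in for one AES-CBC block encryption (it is NOT a real cipher), but it shows why a predictable IV -- in TLS 1.0, the previous ciphertext block -- lets an attacker verify guesses about earlier plaintext:

```python
import hashlib
import hmac
import os

KEY = os.urandom(16)

def E(block: bytes) -> bytes:
    # Toy stand-in for encrypting one CBC block under a fixed key
    # (deterministic and keyed, like a block cipher; not a real cipher).
    return hmac.new(KEY, block, hashlib.sha256).digest()[:16]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Victim encrypts a secret block P1 under IV1. In TLS 1.0 the next IV
# is the previous ciphertext block, so the attacker knows it in advance.
P1, IV1 = b"secret-cookie=42", os.urandom(16)
C1 = E(xor(IV1, P1))
next_iv = C1

# Attacker crafts a chosen plaintext block that cancels the known IVs:
guess = b"secret-cookie=42"
probe = xor(xor(next_iv, IV1), guess)
C2 = E(xor(next_iv, probe))  # equals E(IV1 XOR guess)
assert (C2 == C1) == (guess == P1)  # matching ciphertexts confirm the guess

wrong = b"secret-cookie=99"
assert E(xor(next_iv, xor(xor(next_iv, IV1), wrong))) != C1
```

Randomizing the IV per record, as TLS 1.1 does, removes the attacker's knowledge of `next_iv` and the trick collapses.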

I picked on OpenSSL a bit there because it's very popular, open-source, and documented, which made it easy. (Worth noting that the source code is a mess, which probably doesn't encourage anyone to read it.) Other libraries show similar patterns, and less popular projects may have misleading vulnerability lists, because OpenSSL's vulnerabilities draw attention when they surface and are likely to get a CVE. Is a contributor to a less popular project going to report every bug they fix, evaluate it as a potential vulnerability, and then file a CVE? I don't know.

In contrast, TrueCrypt doesn't publish a list of vulnerabilities, and who knows what subtle fixes go unreported (or are reported under opaque names like "security fix #XYZ") in closed-source libraries like Microsoft's. How can we get even a high-level assessment of their vulnerability histories?

My personal feeling is that good audits are very hard, very rare, and that most auditing is done by the maintainers themselves. To me, the trustworthiness of a library centers more on who maintains it than on how popular (and supposedly more "eyeballed") it is.

Again, this is my general impression and is a subjective response that doesn't fully answer a subjective question.

In my experience Bouncy Castle is the most stable (Java and .NET versions). But:

Don't use built-in libraries like the CSP on Windows, because you can't see what they are doing (so you can't fully verify anything), and in many cases they depend on OS settings and limitations, so the results may be unexpected. Use libraries with readable source code.

On the other hand, languages that run on a virtual machine handle unexpected situations (buffer overflows, etc.) more gracefully, so I prefer them for safety. C/C++ libraries like OpenSSL perform very well, but their source code is hard to read, and they need to be used with caution to protect your code from unexpected situations. Be ready for corrupted data.

If you need to choose a library, test it yourself: encrypt a message with it and decrypt it with another library; run a stress test; run an integration test with your app; and so on. There's no perfect library.
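A minimal harness for that kind of testing might look like the sketch below; the XOR `encrypt`/`decrypt` pair is a stand-in placeholder (not secure), to be replaced by calls into the two libraries under test:

```python
import base64
import os

# Placeholder codec standing in for the real library calls under test.
# XOR with a repeated key is NOT encryption; it only exercises the harness.
def encrypt(key: bytes, msg: bytes) -> bytes:
    return base64.b64encode(bytes(m ^ k for m, k in zip(msg, key * len(msg))))

def decrypt(key: bytes, ct: bytes) -> bytes:
    raw = base64.b64decode(ct)
    return bytes(c ^ k for c, k in zip(raw, key * len(raw)))

def round_trip_test(enc, dec, trials: int = 200) -> None:
    key = os.urandom(16)
    for _ in range(trials):
        # Random message lengths double as a cheap stress test for edge cases.
        msg = os.urandom(os.urandom(1)[0] + 1)
        assert dec(key, enc(key, msg)) == msg, "round trip failed"

round_trip_test(encrypt, decrypt)
```

In a real interoperability check, `enc` would come from one library and `dec` from another; that is where subtle padding and encoding mismatches tend to surface.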

To conclude: keep in mind that the weakest point in your app may not be the cryptographic library.