New effort to fully audit TrueCrypt raises $16,000+ in a few short weeks

Crypto prof: "We have nearly, but not quite enough to get a serious audit done."

For nearly a decade, TrueCrypt has been one of the trusty tools in a security-minded user’s toolkit. There’s just one problem: no one knows who created the software. Worse still, no one has ever conducted a full security audit on it—until now.

Since last month, a handful of cryptographers have discussed new problems with, and alternatives to, the popular application. On Monday, this culminated in a public call to perform a full security audit on TrueCrypt. As of Tuesday afternoon, that fundraiser had reached more than $16,000, making a proper check more likely. Most of that money came from a single $10,000 donation from an Atlanta-based security firm.

“We're now in a place where we have nearly, but not quite enough to get a serious audit done,” wrote Matthew Green, a well-known cryptography professor at Johns Hopkins University. How much would “enough” be? “That depends on how many favors we can get from the security evaluation companies,” Green continued on Twitter. "I'm trying to answer that this week."

In case you haven't noticed, there's a shortage of high-quality and usable encryption software out there. TrueCrypt is an enormous deviation from this trend. It's nice, it's pretty, it's remarkably usable. My non-technical lawyer friends have been known to use it from time to time, and that's the best 'usable security' compliment you can give a piece of software.

But the better answer is: because TrueCrypt is important! Lots of people use it to store very sensitive information, including corporate secrets and private personal information. Bruce Schneier is even using it on the air-gapped laptop where he reviews leaked NSA documents. We should be sweating bullets about the security of a piece of software like this.

Cyrus Farivar
Cyrus is a Senior Tech Policy Reporter at Ars Technica, and is also a radio producer and author. His latest book, Habeas Data, about the legal cases over the last 50 years that have had an outsized impact on surveillance and privacy law in America, is due out in May 2018 from Melville House. Email: cyrus.farivar@arstechnica.com // Twitter: @cfarivar

116 Reader Comments

People trust open source for good reason, but BitLocker has indeed been professionally audited, by NIST, so take that for what it is worth (they approved the NSA-sponsored RNG, but have since expressed outrage).


I was under the impression that the NSA audits crypto for the NIST, and recommends the adoption of crypto standards. Is that incorrect?

This may or may not be a stupid idea, but here goes: why don't we take the TrueCrypt software and repackage it as an upgraded version that we sell for a profit, the intended goal being to lure the original developers out of the shadows!

Quote:

I was under the impression that the NSA audits crypto for the NIST, and recommends the adoption of crypto standards. Is that incorrect?

All of their websites being shut off makes this hard to research, but I understand the NSA just consults when it suits them to do so. The Computer Security Division of the NIST must have its own security experts. Government likes redundancy.

The basic functionality of Truecrypt needs to be fully rolled into all file systems code by default, it shouldn't be an extra piece of software. That way, Linux / BSD / etc. would come straight out of the box with full blown TC-like encryption and containering abilities.

Building unsecured systems, going forwards from here, is like thinking we can build apartment buildings and businesses, but make the door locks optional and leave them off by default, and buy them later if we feel the need to bother. It's preposterous really. The era is dawning that NOBODY should trust anybody, and real bulletproof security methods and practices need to become the minimum standard baseline of what computers do for us, and how we use them.

The basic functionality of Truecrypt needs to be fully rolled into all file systems code by default, it shouldn't be an extra piece of software. That way, Linux / BSD / etc. would come straight out of the box with full blown TC-like encryption and containering abilities. That would help alleviate the issue of code auditing a separate and complex package, since full and proper security would be a basic part of the workload of the OS community. If that seems like extra work, I think we've been naive and slacking off so far to NOT do it.

They do, for the most part, in the form of the dm-crypt/LUKS. It's just not forced on you during the install process or anything, but it's built in.

Finally. My issues with TC have always been that the developers are anonymous, that it was closed source, and that once it was open sourced, there had never been an audit by competent experts. These things together made me suspicious. If there's an audit, the anonymity becomes a nonissue to me.

Can anyone come up with a credible -- and legitimate (or at least reasonably ethical) -- reason for the original developer(s) to choose anonymity?

The only notion that I can come up with, is that whoever wrote the original code was working for some corporation(s) that would have claimed ownership and asserted control over any such work by its employees.


Anonymity could be a way to prevent governments (US or otherwise) from leaning on the developers and insisting that they insert a backdoor.

Edit: Honestly, as someone who lives in the US, I would definitely choose to stay anonymous if I were writing any sort of security oriented software.

It's encouraging when open-source projects get this kind of attention. Maybe it will help to draw out the people behind TrueCrypt, which would be especially good since the software hasn't been updated in 20 months.

There's a gaping hole in encryption right now, though, because TrueCrypt does not yet support encrypting GPT boot volumes. Linux has this through dm-crypt, and I think TC works if you install using BIOS-compatibility mode. But for anyone who bought a system new or installed clean using GPT, there are significant limits. PGP is a possibility, but it's expensive, running well over $100 (and I think closer to $200) for a version that will encrypt GPT boot partitions. That places it well out of reach of most users.

TrueCrypt's license has apparently been reviewed by Red Hat's legal team and found strongly wanting, which is the reason that Fedora and RHEL have TC-compatible software but not TC itself in their repos. Details are thin, but the following link labels the license "non-free".

In 2010 the TrueCrypt website removed all reference to Ennead and Syncon and replaced them with "The TrueCrypt Foundation". However there is still information about them floating about the internet. For example this interview on wolfmanzbytes.com:

Quote:

WolfManz611: What's your position in the TrueCrypt project?

Ennead: I'm 29 and my main project roles are the following: Project Administrator, Developer, and Designer. I am also responsible for the documentation and the website.[...]

WolfManz611: Are you guys working on a Linux version and, if so, how's that coming along? Will it be the same as the Windows one in features, or better?

Ennead: Yes, while I am currently working on the Windows version, Syncon is finishing the work on a Linux version. It has been in alpha testing for several weeks and it appears to be very stable.

Quote:

The basic functionality of Truecrypt needs to be fully rolled into all file systems code by default, it shouldn't be an extra piece of software. That way, Linux / BSD / etc. would come straight out of the box with full blown TC-like encryption and containering abilities.

It's not cross-platform, but it's free, open source, and very easy to use.

Very easy -- I discovered it by accident, the first time I needed to format a USB stick on Linux, using the gui disk utility. I was very pleased with how easy it was to use, and it worked quite well -- and I was very disappointed when I found it wasn't even recognised by Windows.

Quote:

Can anyone come up with a credible -- and legitimate (or at least reasonably ethical) -- reason for the original developer(s) to choose anonymity?

The only notion that I can come up with, is that whoever wrote the original code was working for some corporation(s) that would have claimed ownership and asserted control over any such work by its employees.

It's likely for reasons similar to why Bitcoin was proposed under a pseudonym; creating any sort of software that's intended to conceal or obfuscate data, essentially the entire purpose of encryption, is likely to attract a lot of unwanted attention. Perhaps the original developers simply have an interest in protecting themselves from legal or extra-judicial threats that could arise from working on it? Governments especially have a lot of power, and I imagine could easily coerce a software developer to abandon a product, or worse, insert some subtle back-door or exploitable weakness. Maybe they feel -- accurately or otherwise -- that they mitigate this risk by not identifying themselves.

A full audit would definitely help to provide confidence in the program's security, of course, but I don't personally think that TrueCrypt's being developed anonymously or pseudonymously is on its own enough to merit extra concern.


Interesting, we encrypt every PC at my work with TrueCrypt. I simply assumed that it was a proven piece of software, but to find that it's never been audited and that the authors are unknown is very interesting. We deal with PHI (personal health information) daily and have quite a bit of time and effort invested in security; it would be less than funny to find out that TrueCrypt has a huge backdoor.

RCook, your position is shockingly similar to mine. I had been considering implementing TrueCrypt as an alternative to buying Windows Ultimate for BitLocker on the machines. I like saving them money; it's an NFP. I suppose now that I'm aware of this information, I wouldn't truly be able to make the case that TrueCrypt is a good alternative.

Though, I wonder if having *nothing* at this point would actually be more secure than having TrueCrypt, since we cannot claim for CERTAIN that it does not have a backdoor. TrueCrypt would indeed stop the physical threat of someone walking out the front door with a laptop that's not locked down, though. I'm just trying to see it from all angles here...

One must ask the same question of Tor. If, say, there is a compromise in Tor's encryption, it won't matter how many levels of encryption the message goes through; intercepting the packet will reveal the data.

Is Tor's node chaining truly random? Could one determine the next node in the chain with knowledge of the last?

These concerns are based on Tor's background: it was created by the NRL, after all. Was the original code ever audited? According to its sponsorship page, it was still sponsored by various government agencies through the NRL up until 2010.

So if you want to set up a method of intercepting highly secret traffic by dissidents and whistleblowers, then hand them the tools to do the encryption.

It is a classic example of giving a foreign diplomat the gift of a bugged laptop.

The first to spring to mind is Sendmail. It shipped as part of a lot of Linux distros for years, yet nobody ever looked over the code.

Sendmail was continually plagued by security problems, but the most glaring ones were fixed well before Linux became commonplace. Sendmail simply predated security as a concern on the internet and it was very difficult to retrofit.


Tru dat.

I was just pointing out that being open source or having "many eyes" able to review something doesn't automatically mean that bugs will be found or that it's inherently more secure. It's why I'm pretty happy Truecrypt is being professionally audited.

Speaking of which, how does OpenBSD afford its auditing? Is their auditing some kind of lesser effort, or audited to a lesser standard? Is it a different kind of auditing that's performed by OpenBSD developers?

I was about to use TrueCrypt back in 2009, but after skimming the web and downloading the sources of almost all TrueCrypt versions, I couldn't determine the country of origin, let alone the names of the developers.

So I just downloaded the sources for future reference should any need arise, but I haven't installed it due to my suspicion of the origin or the intention of the developers. Hence, I created my very simple encryption tool. It is very simple, but considering there are an infinite number of ways to scramble data, it follows there are infinite algorithms that can be implemented, and hence infinite ways to crack my ciphertext, and it follows that the resulting ciphertext is uncrackable as long as the algo is just in my head and not in the public domain.

Which is safer, an encryption tool with an unknown algorithm, or an encryption tool with a known algo yet unknown authors?

We need a test case. We need someone to encrypt their PC with TC, commit a felony involving said PC, have it confiscated and analyzed by the FBI...

The FBI has tried and failed to decrypt Truecrypt.

If I remember correctly, the hard disk which was encrypted by TC came from the Brazilian government, and they asked the FBI for a favor to crack it.

Well, nobody knows if the FBI did crack that hard drive. Most probably they did, but they remained silent and didn't divulge the content to the Brazilian government, so that all of us will trust TrueCrypt.

Your own encryption is probably badly implemented and thus unsafe as hell.

As to this, there are other cases in the US where they were unable to defeat TC, or at least would not admit in open court that they were able to defeat it. This is the FBI; god knows what the NSA can do, as I doubt they would expose such an ability for a few criminals.

Unfortunately while this audit is needed, we shouldn't let that obscure the longer-term need for a continuing audit process. As others have noted, some versions of TrueCrypt have been audited to some extent before; the problem is that it has to be done over again with every new version. So this audit will be obsolete in 6 months or so, and then what?

There was an 11th Circuit Court of Appeals case a year or two ago in which the FBI admitted that it couldn't break TrueCrypt encryption. The court ruled that the disk owner had the 5th Amendment right to remain silent as to what the key was.

But that only works if the disk owner chooses a STRONG key. When disk owners have chosen short, simple, weak keys the FBI has been able to brute-force Truecrypt passwords under those circumstances.

Another important problem is that disk owners sometimes write down their key (FBI finds the key when searching the disk owner's house) or tells a spouse what the key is (FBI asks spouse and spouse tells FBI what the key is).

Truecrypt works really well in the absence of user stupidity, but a dumb user will very quickly be compromised by the user's own error. 64-character keys are permitted, but you still have to be smart enough not to use "password" as the encryption key...
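The gap between a weak password and a strong passphrase is easy to put in rough numbers. A minimal sketch of the keyspace arithmetic; the guess rate below is an assumed illustrative figure, not a measured benchmark of any real attack on TrueCrypt's key derivation:

```python
# Rough keyspace arithmetic for passphrase brute-forcing.
# The guess rate is an illustrative assumption, not a benchmark.
GUESSES_PER_SECOND = 1_000_000  # assumed attacker rate against a slow KDF

def seconds_to_exhaust(alphabet_size: int, length: int) -> float:
    """Worst-case time to try every passphrase of the given shape."""
    return (alphabet_size ** length) / GUESSES_PER_SECOND

# An 8-character lowercase password: 26^8 ~ 2.1e11 guesses, within easy reach.
weak = seconds_to_exhaust(26, 8)

# A 20-character mixed-case + digits passphrase: astronomically out of reach.
strong = seconds_to_exhaust(62, 20)

print(f"weak:   {weak / 86400:.1f} days")
print(f"strong: {strong / (86400 * 365.25):.2e} years")
```

The point is the exponent, not the exact rate: a faster attacker changes the weak estimate from days to minutes but leaves the strong passphrase untouchable.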

Quote:

We need a test case. We need someone to encrypt their PC with TC, commit a felony involving said PC, have it confiscated and analyzed by the FBI...

Already happened multiple times; the FBI could never crack TrueCrypt, although in some cases it got access because of a weak password chosen by the user. I have little doubt that TrueCrypt is rock solid, but a professional audit is worth supporting with a little bit of money.

Quote:

Which is safer, an encryption tool with unknown algorithm or an encryption tool with known algo yet unknown authors?

The latter. Every single time.

It's really difficult to roll your own crypto with not only a safe algorithm but also a safe implementation. Your "security through obscurity" does not work -- it never has and never will. Not in your case either (unless perhaps you're Bruce Schneier).
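A toy illustration of why obscurity fails: the repeating-key XOR "cipher" below is a hypothetical homemade design (nothing to do with TrueCrypt), and it gives up its key the moment an attacker can guess any fragment of plaintext.

```python
# Toy demonstration: a homemade repeating-key XOR "cipher" (hypothetical)
# falls to a single known-plaintext guess, secret algorithm or not.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret_key = b"s3cret"                       # known only to the "designer"
message = b"Meeting at the safe house at midnight."
ciphertext = xor_cipher(message, secret_key)

# The attacker guesses the first word ("Meeting") and XORs it against the
# ciphertext, recovering the key stream directly; the "secret" design adds
# nothing once its repeating-XOR structure is spotted.
crib = b"Meeting"
recovered = bytes(c ^ p for c, p in zip(ciphertext, crib))[:len(secret_key)]
print(recovered)                             # the key falls out immediately
print(xor_cipher(ciphertext, recovered))     # and with it the full plaintext
```

Kerckhoffs's principle is exactly this observation: a system must stay secure even when everything about it except the key is public.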

One thing about TrueCrypt that people don't realise is that it's impossible to verify that the binaries they release are built from the source they release. This is because of the way the Windows versions are signed: they do not give out the keys used to sign the exe, so it's impossible to compile a bit-for-bit match of the binary they offer for download.

By strict definition this makes TrueCrypt not open source at all, as elements needed to make the exact binary are never released.
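The verification being described comes down to a hash comparison between the vendor's binary and one you built yourself from the published source. A minimal sketch (file names are hypothetical):

```python
# Sketch of the reproducible-build check that embedded signing defeats:
# compare a vendor binary against one compiled from the published source.
# File names are hypothetical examples.
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def same_build(official: str, rebuilt: str) -> bool:
    """True only if the two binaries are bit-for-bit identical."""
    return sha256_of(official) == sha256_of(rebuilt)

# e.g. same_build("TrueCrypt-vendor.exe", "TrueCrypt-rebuilt.exe")
```

Because the released Windows binaries carry a signature made with a key that was never published, this check can never succeed for them, which is the commenter's point.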


Quote:

Building unsecured systems, going forwards from here, is like thinking we can build apartment buildings and businesses, but make the door locks optional and leave them off by default, and buy them later if we feel the need to bother. It's preposterous really. The era is dawning that NOBODY should trust anybody, and real bulletproof security methods and practices need to become the minimum standard baseline of what computers do for us, and how we use them.

Asking me to trust the guy who is renting me a building is slightly different than asking me to trust that no one will peek at, copy, or steal my data from an unencrypted laptop. It is totally my responsibility to secure my own data (and I totally agree with you on TC-style OOTB integration in the OS); the fault is mine should I choose to leave my data unencrypted.

[EDIT: WTF ate half my post....]

Also, I have used TC for years in an enterprise environment to ship massive amounts of data on disks to other locations. I sincerely hope that the audit of the TC code is able to occur, and then to recur on a regular basis after updates of the base codeset. It would be a huge letdown to have the audit discover a backdoor in TC.

Quote:

I was about to use TrueCrypt back in 2009, but after skimming the web and downloading the sources of almost all TrueCrypt versions, I couldn't determine the country of origin, let alone the names of the developers.

So I just downloaded the sources for future reference should any need arise, but I haven't installed it due to my suspicion of the origin or the intention of the developers. Hence, I created my very simple encryption tool. It is very simple, but considering there are an infinite number of ways to scramble data, it follows there are infinite algorithms that can be implemented, and hence infinite ways to crack my ciphertext, and it follows that the resulting ciphertext is uncrackable as long as the algo is just in my head and not in the public domain.

Which is safer, an encryption tool with an unknown algorithm, or an encryption tool with a known algo yet unknown authors?

Writing cryptographic software that doesn't have a fatal bug is difficult to the point that if you're not a professional cryptographer the risks are very high that you'll make a mistake somewhere even if you're working from a written specification of an existing algorithm. If you created the algorithm yourself, the odds are it's only marginally more secure than double ROT13.
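For readers who missed the joke: applying ROT13 twice is the identity transformation, so "double ROT13" is encryption that accomplishes nothing at all.

```python
# "Double ROT13": applying ROT13 twice returns the original text,
# i.e. an encryption scheme that does exactly nothing.
import codecs

plaintext = "attack at dawn"
once = codecs.encode(plaintext, "rot13")    # "nggnpx ng qnja"
twice = codecs.encode(once, "rot13")

print(once)
print(twice == plaintext)                   # True: we are back where we started
```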

Quote:

..... If you created the algorithm yourself, the odds are it's only marginally more secure than double ROT13.

ROT13 made me smile. You could be correct, but as I've mentioned, there are infinite ways to create your own encryption algo. I can even use 8 random files (dll, jpg, gif, etc.), combine/encrypt them together, and have a 10+ MB keyfile of random gibberish... how do you crack a ciphertext which was encrypted with an unknown algorithm?

I bet the FBI had difficulty cracking that Brazilian hard disk secured by TrueCrypt because it used some unknown algo as well, in addition to the built-in cipher of TC.

Quote:

Writing cryptographic software that doesn't have a fatal bug is difficult to the point that if you're not a professional cryptographer the risks are very high that you'll make a mistake somewhere even if you're working from a written specification of an existing algorithm. If you created the algorithm yourself, the odds are it's only marginally more secure than double ROT13.

Even if you're a professional. No matter how smart you are, there will always be someone smarter who spots some vulnerability you didn't think of. Thus the importance of open source; hopefully all those eyes find the problems you couldn't see in your own.

Quote:

We need a test case. We need someone to encrypt their PC with TC, commit a felony involving said PC, have it confiscated and analyzed by the FBI...

There are a number of cases of this; for example, laptops seized by drug warriors. TrueCrypt has never fallen. Insofar as I know, all of the cases have been resolved by rubber-hose decryption methods.