Espionage 3 — Source Code Available to Security Professionals

Not only do we have an update for you today, but we’re super thrilled to announce that as of today, security professionals can obtain access to Espionage 3’s source code! 😀

I’ve wanted to do this for a while, but I never felt comfortable releasing the code for Espionage 2, for a variety of reasons having to do with the complexity of the code. Now, thanks to the rewritten Espionage 3, I can say with confidence that Espionage is as beautiful on the inside as it is on the outside, and so I have no problem letting others have a peek inside. In fact, I believe Tao Effect has a duty to its customers to do so.

We know that for software to provide any meaningful security guarantees, its source code must be available to third-parties for inspection. We also recognize that releasing Espionage’s source code can hurt Espionage and its users because of software piracy.

We want to continue giving you stellar customer support and timely updates, so we follow a middle path: giving security experts access to Espionage’s code so that they can verify its security. We’re also allowing them to distribute unmodified copies of Espionage that they’ve built themselves, so that anyone who doesn’t trust our copy can download it from them. Apply here.

Espionage 3.5.1 Released!

Also on today’s menu, an update! (With more to come!):

NEW: Source code access for security professionals!

NEW: Autolock on screensaver and screen lock!

FIXED: Failure to execute folder actions after folder autolock while Espionage is locked.

FIXED: Don’t unlock folder if an application for a folder action is already running.

EDIT: Thanks to “Red H.” for pointing out that “source code available” != “open source”. The two are quite different: for something to qualify as open source software, its license must permit free redistribution (among other requirements). My apologies for the error; we will update all references accordingly, and if we miss one please let us know!

From the looks of it you’re using “open source” to mean the dictionary definitions of the words “open” and “source”. Perhaps this will clarify for you what open source software is, and how it works: http://opensource.org/osd

Thanks very much Red H. for pointing that out! I’ve fixed this in the blog post and everywhere else I could find (can’t do much about the email subject title though, as that was already sent out). If you see any places I’ve missed, please let us know!

I think “source available” should become a coined term in common use. People seem to argue endlessly over the dogmatized meaning of “open source” based on FOSS campaigning, and have created an expectation that “open source” necessarily means “open to do whatever you want.” That hasn’t been true even in FOSS circles: the GPL is quite a massive license and carries its own host of restrictions.

No offense intended, but saying that the source code is available for security experts to look at isn’t very reassuring, in and of itself. Has anyone taken you up on the offer?

There’s this saying in open source that “given enough eyeballs, all bugs are shallow,” but that hasn’t actually proven true, as it assumes, first, that others are actually looking at the source code, and second, that they have the skills to spot the bugs. I’m not entirely sure what the incentive for security experts to review your code is here, as you’re essentially asking them to audit your code for free.

I’m not saying Espionage isn’t a good thing; for most users, I suspect it’ll be more than sufficient. But for those who are worried about nation state levels of scrutiny, I would think TrueCrypt would be a better choice.

Not yet, though that doesn’t surprise me considering we just announced it. The forums are public and open, you’re free to check anytime.

> I would think TrueCrypt would be a better choice.

Better how? Because more people have looked at its source? I don’t think it’s that simple.

I tried TrueCrypt years ago (before Espionage) and couldn’t stand it then. Your comment prompted me to download and try it again to see if they improved it. Nope. It’s still the same god-awful thing I remember, basically unusable in my opinion. Worse, it requires your root password to install a kernel extension. Espionage 2 did that too because it required iSpy, but at least the UX enhancement was somewhat worth it (IMO).

OpenSSL is one of the most audited pieces of software on the planet, but that doesn’t prevent it from being an ugly, buggy mess with security vulnerabilities found every month.

HTTPS is an open protocol based on open standards and open source software, yet most implementations use OpenSSL as their SSL/TLS engine. Worse, the design of HTTPS is terribly broken because of the way it trusts Certificate Authorities (among other reasons).

UNIX had a backdoor in it for 14 years! No one ever discovered it (according to what I’ve read at least). The guy who put it in there eventually revealed it himself.

All that’s to demonstrate that just because something is open source doesn’t mean it’s inherently safer.

I do believe that software that provides security guarantees should have its source available as a prerequisite, but that alone does not guarantee you that it’s safe.

There are features Espionage has that TrueCrypt doesn’t, and vice versa. Personally, even with a “nation state adversary” I wouldn’t use it, because for me, the UI/UX plays a significant role in the security that an application provides. If I’m not comfortable with how it behaves, or how it requires that I use it, I don’t want anything to do with it. I still haven’t been able to bring myself to use it long enough to find out whether I can encrypt my email with it the way I can with Espionage.

Anyway, to each his or her own. I’m not entirely sure what the reason for your comment was. You’re not being forced to use our software. :-p

Hi, Greg. My apologies, I wasn’t trying to be antagonistic; in most cases, I’d argue that software people can easily use is far better than software they can’t or don’t use, and Espionage is far easier and more pleasant to use than TrueCrypt.

And I’m fully aware of the software vulnerabilities in many pieces of Open Source software; for that matter, there’s Ken Thompson’s famous (well, to some) “Reflections on Trusting Trust” speech of years ago. In using a computer, you’re trusting a great many things, from the software at the top down to the hardware at the bottom, and everything used to build all of those things.

All that aside, though, I think my main issue was/is that I’m not sure what incentive there is for security experts to audit your code. Based on the criteria you give for such a person in the forum post, you’re essentially asking a person with a significant amount of experience in an in-demand field to put in some amount of work, in exchange for I’m not quite sure what. And assuming someone does volunteer, how can you trust them, assuming it isn’t, say, Bruce Schneier? How do you know they won’t find vulnerabilities, but instead of reporting them to you, sell one or more of them to a third party?

If someone finds a vulnerability in TrueCrypt, it can help make their name, help them get future employment, and so on. (Or sell the vulnerability to the NSA, of course.) If they find a vulnerability in Espionage, they’re obligated to keep quiet about it for three months/until you fix the problem, whichever is shorter. But for people of good will, the potential upsides of auditing the TrueCrypt code are better than that of auditing the Espionage code.

Again, I’m not suggesting that people not use Espionage; it’s much better suited for most people than is TrueCrypt. I use Espionage, and I plan to continue to do so. And that you’re willing to allow people to review your source code speaks well of you; I also use 1Password, and to the best of my knowledge, AgileBits hasn’t made a similar offer. But with all that said, I’m not sure how this offer you’re making here increases the security of Espionage. Saying that you’re offering a bounty to people who audit the code and report bugs, or that you’re paying for a security audit would be more helpful, I would think, although they also have tradeoffs, not least that I’m sure you don’t have huge amounts of money to spare.

Again, I’m really not trying to be antagonistic; this offer is well intentioned. I’m just not sure how beneficial it is/will be.

> And assuming someone does volunteer, how can you trust them, assuming it isn’t, say, Bruce Schneier? How do you know they won’t find vulnerabilities, but instead of reporting them to you, sell one or more of them to a third party?

These concerns apply to all organizations and software. In many (if not most) cases there’s very little legal threat preventing such a scenario from happening. I think we’ve done the best we can to mitigate it, though (by requiring years’ worth of demonstrated work in the security field, an ID check, a legally binding contract, etc.).

> If someone finds a vulnerability in TrueCrypt, it can help make their name, help them get future employment, and so on. (Or sell the vulnerability to the NSA, of course.) If they find a vulnerability in Espionage, they’re obligated to keep quiet about it for three months/until you fix the problem, whichever is shorter.

Would you prefer we request that they immediately and publicly publish the vulnerability?

> But for people of good will, the potential upsides of auditing the TrueCrypt code are better than that of auditing the Espionage code.

I think that’s a somewhat subjective statement (personally). In some circles TrueCrypt might be better known and “carry more weight” than the name “Espionage”, but the reverse might be true too.

> Saying that you’re offering a bounty to people who audit the code and report bugs, or that you’re paying for a security audit would be more helpful, I would think, although they also have tradeoffs, not least that I’m sure you don’t have huge amounts of money to spare.

Now this is helpful. A concrete suggestion (and a rather good one) that we can actually do something about! 🙂

I think this idea is worth discussing/exploring, and I welcome all feedback regarding it.

First though, I think your assertion that TrueCrypt has a significant audit incentive over Espionage is a bit of a stretch, and (again) sorta depends on the crowd being discussed.

Tampering with the inherent incentive (prestige, and if the auditor is a user, peace-of-mind) by adding $ into the mix could very well be counter-productive.

According to that article, even these new sums pale in comparison to how much Microsoft recently offered for Windows 8.1 Preview exploits ($100k).

There are problems, however, with offering monetary bounties:

1. We don’t have that much to offer per exploit. $150 is doable, though I’m not thrilled about the idea.
2. Offering too little would most likely be counter-productive (as demonstrated in the linked tweet, and from personal experience).
3. We could offer more, but studies have shown that offering monetary incentives can actually be counter-productive (i.e. get you the opposite of what you wanted).

Expanding on that last point, here are a few studies about money and incentives:

Monetary incentives can lower trust and helpfulness in small groups:

“Subjects basically latched on to monetary exchange, and stopped helping unless they received immediate compensation in a form of an intrinsically worthless object [a token].”

As an addendum to my earlier posts, having read a bit more: http://istruecryptauditedyet.com. It sort of demolishes my statement about TrueCrypt being more likely to have been audited, but it also puts a dollar cost on auditing a similar piece of software. As with so many things, it’s a question of who you choose to trust.

Thanks for sharing that, Kevin! That was “big of you,” as they say. It’s rare to see someone so honest online (in my experience). Usually, upon encountering something that disagrees with their point of view, people just disappear.

You are not the only one struggling with how a non-open source security product can gain some of the trust and security advantages of open source.

One possible concern is that if you allow and encourage third parties to distribute their compiled binaries, you have no way to know that they are distributing unmodified versions. Someone could distribute a malicious version of Espionage without you being able to vouch for its integrity.

If there are multiple builds of Espionage available and their existence is legitimized by your policy, is there a danger that a substantial number of people may end up using one that collects their vault passwords?

I’m not saying that this is a deal breaker. It may be that that risk is worth the other security gains from your scheme. But it is a risk worth acknowledging.

> If there are multiple builds of Espionage available and their existence is legitimized by your policy, is there a danger that a substantial number of people may end up using one that collects their vault passwords?

Well, considering that the reason someone would use a third-party build is because of this very concern (i.e. they think our version might have a backdoor), I think it’s mostly a non-issue. We would make clear that we do not (and cannot) endorse any third-party builds, and therefore it’s simply a matter of who the customer trusts more (us, or some third-party).
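To make the “who do you trust” decision concrete: a user can at least confirm that the copy they downloaded matches the checksum its distributor published. Here’s a minimal sketch of that check (the file name and workflow are hypothetical, purely for illustration, and not part of any actual Espionage distribution):

```shell
# Create a stand-in file representing a downloaded build (illustration only).
printf 'example build contents\n' > Espionage.app.zip

# The distributor hashes their build and publishes this value out-of-band
# (e.g. on their website or in a signed announcement).
published_sha="$(sha256sum Espionage.app.zip | awk '{print $1}')"

# The user hashes their own copy and compares it to the published value.
local_sha="$(sha256sum Espionage.app.zip | awk '{print $1}')"

if [ "$published_sha" = "$local_sha" ]; then
  echo "checksum OK: download matches what the distributor published"
else
  echo "checksum MISMATCH: do not run this build"
fi
```

Note that this only proves the download matches what that particular distributor published; it says nothing about whether it was built from the unmodified original source. Verifying that still comes down to trusting the distributor (or to reproducible builds that anyone can compare byte-for-byte).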

Also note that we do attempt to protect our customers from such malicious copies through the wording of our contract:

You may build and release copies of Espionage using the original and unmodified source code that we send you (and all associated materials). You may not: sell, re-brand, or add anything to the copies that you distribute that was not included in the original materials that we sent you. Additional terms may apply. See full terms in the contract we send you.

If they were to insert a backdoor into it, they’d be opening themselves to: (1) lawsuit, and (2) public tar and feathering. I somehow doubt anyone would risk that, even the NSA, as they’d be signing a legally binding contract. Then again, the NSA is violating the Constitution and apparently getting away with it…