Gotofail Bug: Honest Mistake or NSA Conspiracy?

Right before the weekend, we read about a stupid coding mistake in iOS that led to Apple developers working frantically to fix it AND release the fix. Various sites have published the lines of code that led to a serious security issue, with everyone saying that if you know how to read code, the mistake is very easy to spot.

Now, anyone can make a mistake – even one as serious as the gotofail bug, which essentially “causes most iOS and Mac applications to skip a crucial verification check that’s supposed to happen when many transport layer security (TLS) and secure sockets layer (SSL) connections are being negotiated,” rendering this security layer totally useless.

To be fair to Apple, they very quickly responded and we’ve all probably downloaded and updated iOS at this point.

Here’s where the conspiracy theorists come in.

Was the bug a real bug, or was the NSA (or some snoopy agency) behind it? The LA Times was quick to raise this question, but to be fair, they also shared the opinions of people qualified to comment on the problem, Google’s security expert Adam Langley being one of them:

“This sort of subtle bug deep in the code is a nightmare. I believe that it’s just a mistake, and I feel very bad for whomever might have slipped in an editor and created it.”

Probably not as bad as the people responsible for the mistake!

Then there are the “others,” as the LA Times put it, who drop names like Snowden and the NSA, and note that some researchers have discovered that the bug first appeared in a version of iOS 6.

So what do you think? Does the NSA have something to do with this, or is there a poor developer beating himself up over the mistake (not to mention facing the probability of getting fired)?

About Noemi Tasarra-Twigg

Comments

I don’t know about conspiracies, but here’s something I haven’t seen discussed: where were the rest of Apple’s development organization and processes?

Everyone’s quick to jump on the programmer, but the fact is people make mistakes – they type command-V twice instead of once, etc. That’s why a good programmer does a diff of whatever he or she is committing to the SCM and verifies every change. And then there’s dev testing. If you modify code that checks the validity of a signature, you had better test it, and not just with valid signatures, but with invalid ones – invalid in at least every way the code checks for.

Then there should be a code review process, which either wasn’t done here or was done pretty sloppily, as the problem *is* pretty obvious.

Then once the flawed code was out of development, where was QA? Again, if code that checks signatures was modified, QA had better be testing signature checking. Plus, this isn’t a new part of TLS – where was the regression testing?

This is a pretty epic failure, but not on the programmer’s part for making the initial mistake. Mistakes happen. A good development organization the size of Apple’s has a process that programmers, reviewers, and QA should be following, particularly for something as important as TLS. Either that process doesn’t exist (which seems unlikely given Apple’s size), or multiple steps in the process weren’t followed, as there were at least 5 steps that should have caught the problem.

Sloppy doesn’t begin to describe this situation. With a company as big as Apple, how can they now be fully trusted to make secure products in the future? They didn’t test the security of the connection to see if a man-in-the-middle attack was possible, across two versions of iOS and Mavericks? In the auto industry they use crash test dummies to see what will happen in the real world. Apple doesn’t try to hack its own products to see if they’re secure?