Posted
by
samzenpus
on Thursday April 10, 2014 @08:28PM
from the only-human dept.

nk497 (1345219) writes "The Heartbleed bug in OpenSSL wasn't placed there deliberately, according to the coder responsible for the mistake — despite suspicions from many that security services may have been behind it. OpenSSL logs show that German developer Robin Seggelmann introduced the bug into OpenSSL when working on the open-source project two and a half years ago, according to an Australian newspaper. The change was logged on New Year's Eve 2011. 'I was working on improving OpenSSL and submitted numerous bug fixes and added new features,' Seggelmann told the Sydney Morning Herald. 'In one of the new features, unfortunately, I missed validating a variable containing a length.' His work was reviewed, but the reviewer also missed the error, and it was included in the released version of OpenSSL."

The design of the feature looks like a backdoor too. A heartbeat function with a variable-length payload, plus a superfluous field for the payload length, all running on top of TCP, which already has a keep-alive function? And then the feature contains a "rookie mistake" but still passes review. Yes, we totally believe you. It was a mistake.
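For those who haven't seen it spelled out: the bug pattern is that the peer's self-reported length field is trusted when echoing the payload back. This is a simplified, hypothetical sketch of that pattern (struct and function names are mine, not the actual OpenSSL code); here the over-read is kept inside the buffer so stale bytes leak without undefined behavior:

```c
#include <stdint.h>
#include <string.h>
#include <stddef.h>

/* Hypothetical, simplified model of a heartbeat record. In the real
 * protocol (RFC 6520) the record is a 1-byte type, a 2-byte payload
 * length, the payload, then padding. */
struct heartbeat {
    uint16_t claimed_len;  /* length field supplied by the peer     */
    uint16_t actual_len;   /* bytes actually received on the wire   */
    uint8_t  payload[64];  /* receive buffer, may hold stale data   */
};

/* The flawed pattern: trust claimed_len and echo that many bytes,
 * even though only actual_len bytes arrived. Whatever happens to sit
 * in memory past the real payload gets copied into the response. */
size_t build_response_buggy(const struct heartbeat *hb, uint8_t *out) {
    memcpy(out, hb->payload, hb->claimed_len);  /* no bounds check! */
    return hb->claimed_len;
}
```

In the real bug the claimed length could reach 64 KB, walking far past the buffer into adjacent heap memory: keys, passwords, session cookies.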

All I know is the organization I work for has prohibited the use of C or C++ for mission-critical software for years now. The languages we use would not ALLOW code to execute that tries to copy 64K out of a buffer on the say-so of a 2-byte length field.

Part of software engineering is to use the right tool for the right job. When a buffer overrun can destroy the security of the entire internet, you damn well better not be using C as your tool. Or assembly language for that matter.
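To make the parent's point concrete: what bounds-checked languages give you automatically can be sketched in C as a slice that carries its own length and refuses out-of-range copies. This is a hypothetical illustration of the discipline, not a claim about any particular language's runtime:

```c
#include <string.h>
#include <stddef.h>
#include <stdbool.h>

/* A slice that knows how long it is. Checked languages attach this
 * length to every array or string and verify it on each access. */
struct slice {
    const unsigned char *data;
    size_t len;
};

/* Copy n bytes out of the slice, failing instead of over-reading.
 * A memory-safe language would raise an exception or panic here
 * rather than letting the copy proceed. */
bool slice_copy(struct slice s, unsigned char *dst, size_t n) {
    if (n > s.len)
        return false;  /* request exceeds what the slice holds */
    memcpy(dst, s.data, n);
    return true;
}
```

With this shape, a Heartbleed-style "copy 64K from a 4-byte payload" request fails loudly instead of silently leaking heap memory.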

Well pretty much anyone can start a lawsuit. But what damages are they suing for? Reimbursement of the purchase price?

If you're using it, you're agreeing to the license:
* THIS SOFTWARE IS PROVIDED BY THE OpenSSL PROJECT ``AS IS'' AND ANY
* EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
* PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE OpenSSL PROJECT OR
* ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
* SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
* NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
* STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
* OF THE POSSIBILITY OF SUCH DAMAGE.

Now I am not a lawyer, and there are always folk looking for an opportunity to sue, but the license terms surely get them off to a bad start.

As an open-source dev myself, I often wonder why the fuck I do anything useful for others when they'll just turn on me the moment their toys don't work exactly as desired because -- gorsh -- I'm not perfect, though I work very hard to be.

Welcome to Engineering. Scott Adams (of Dilbert fame) best summarized this disconnect between commendation and blame in the Engineers Explained [boulder.co.us] chapter of his book:

Engineers hate risk. They try to eliminate it whenever they can. This is understandable, given that when an engineer makes one little mistake, the media will treat it like it's a big deal or something.

Examples of Bad Press for Engineers

Hindenburg.

Space Shuttle Challenger.

SPANet(tm)

Hubble space telescope.

Apollo 13.

Titanic.

Ford Pinto.

Corvair.

The risk/reward calculation for engineers looks something like this:

RISK: Public humiliation and the death of thousands of innocent people. REWARD: A certificate of appreciation in a handsome plastic frame.

Serious question: Why don't you become the new maintainer yourself, if you honestly believe you can do a significantly better job at it than the current person(s)?

I don't do it myself because I cannot guarantee that I wouldn't make even worse mistakes. I'm glad there are people out there who are willing to do the job, and I'm in no position to bite their heads off when they mess it up. And you're probably glad that I'm not a maintainer of anything even remotely security-related. :-)

I glanced at some of the OpenSSL C code, in particular the new code that introduced this bug.

I don't disagree about the 'coding style' issue, but that kinda misses the point. The points are:

There's a memcpy(). Where is the bounds checking? Hello? It's not 1976. We all know memcpy() is dangerous. Where there's a memcpy() there should be a bounds check, even in a fart app. If the project has "secure" in the title, there should be paranoid, anal-retentive checking of both the source and destination buffers.

The code uses data that has come from teh interwebs. Again, where's the obsessive-compulsive validity checking on everything that comes in?
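The two checks the parent is asking for can be sketched together: validate the peer's claimed payload length against the bytes actually received, and check the destination buffer, before any memcpy(). This is a hedged sketch in the same shape as the eventual OpenSSL fix, not the actual patch (the real one also accounts for padding bytes), and the function name and layout are simplified:

```c
#include <stdint.h>
#include <string.h>
#include <stddef.h>

/* Simplified heartbeat record: 1-byte type, 2-byte big-endian payload
 * length, then the payload. Returns the echoed payload length, or -1
 * if the record fails validation (the fix silently discards such
 * records rather than answering them). */
int echo_heartbeat(const uint8_t *record, size_t record_len,
                   uint8_t *out, size_t out_cap) {
    if (record_len < 3)
        return -1;  /* too short to even hold type + length */
    uint16_t payload_len = (uint16_t)((record[1] << 8) | record[2]);
    /* Source bounds check: the claimed payload must actually fit
     * inside the bytes we received off the wire. */
    if ((size_t)payload_len + 3 > record_len)
        return -1;
    /* Destination bounds check: it must also fit in our buffer. */
    if (payload_len > out_cap)
        return -1;
    memcpy(out, record + 3, payload_len);
    return (int)payload_len;
}
```

A malicious record claiming a 64 KB payload while delivering four bytes fails the first check and never reaches the memcpy().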

However, that's still not the point. Programmers make mistakes - and this bug was at least a bit more subtle than the usual one where the bad hat sends an over-length string.

The problem is with the oft-made claim that open-source security software is extra-safe because the code is public and has been seen by many eyeballs. That claim is dead. Possibly crypto experts have been all over the actual encryption/decryption algorithms in OpenSSL like flies on shit; clearly, however, none of them looked at the boring heartbeat stuff. That shouldn't be the death of open source, though. Windows is proprietary, and look at the sheer terror caused by the prospect of running Windows XP for one day after the security patches stop...

The date that it was added to the OpenSSL codebase is very close to the time when the leaked NSA documents claim that they had a 'major breakthrough' in decrypting SSL. I would imagine that they are not responsible for introducing it, but do have people doing very careful code review and fuzzing on all changes to common crypto libraries, so I wouldn't be surprised if they'd known about it (and been exploiting it) since it was originally released.

Doctors are human. We hold them accountable for their mistakes. Engineers are human. We hold them accountable for their mistakes. Indeed, we hold just about everybody accountable for their on-the-job mistakes and the consequences of their mistakes result in everything from terminations to criminal proceedings.

So, when should programmers be held accountable for their mistakes, and how should we respond as a society?