I read on Gizmodo this morning that the team has removed 90,000 lines of unused code in the last week - if that's true, then OpenSSL has been appallingly managed for something on which so much of the Internet relies for security.

Well, that is Gizmodo for you. They removed support for every platform other than OpenBSD, plus of course some older tech (SSLv2 etc). If you take a library and cut out 90% of the other-platform support, you can easily cut out tons of code.

"Last year, the foundation took in less than $1 million from donations and consulting contracts." While I know the big companies should have given more, cash is clearly not the problem here.

Would you happen to have details on their income?
I ask because, from what I read, the president of the OpenSSL Software Foundation, Steve Marquess, claims it takes in less than $2,000 a year in outright donations and otherwise sells commercial software support contracts.
In fact he goes on to say, 'The media have noted that in the five years since it was created OSF has never taken in over $1 million in gross revenues annually.'

'It is nowhere near enough to properly sustain the manpower levels needed to support such a complex and critical software product. While OpenSSL does “belong to the people” it is neither realistic nor appropriate to expect that a few hundred, or even a few thousand, individuals provide all the financial support. The ones who should be contributing real resources are the commercial companies[5] and governments[6] who use OpenSSL extensively and take it for granted.'

I hate to pander to popular prejudice here, but what this does do is poke some very big holes in the utopian dream of open source software.

Whatever the reasons for underfunding and poor engineering, crap management is absolutely endemic in open source software. Mob rule and anarchy don't work very well, as this incident shows.

I've been banging on for years that bad management, or more to the point no real management at all, is the single biggest problem facing open source software, for dozens of reasons, and nobody gets it.

Quote:

I read on Gizmodo this morning that the team has removed 90,000 lines of unused code in the last week - if that's true, then OpenSSL has been appallingly managed for something on which so much of the Internet relies for security.

90k lines of code is nothing in such a project; I wouldn't read too much into it. For example, Google dropped 9M lines of code from Chrome/WebKit when they forked it to Blink, by removing code for architectures that WebKit supported but Chrome didn't need - so the LibreSSL team could simply be removing code for other platforms, seeing as OpenSSL is compilable on almost anything.

Those 90k lines aren't unneeded by OpenSSL in general; most of them are just not needed by LibreSSL running ONLY on OpenBSD. OpenSSL runs on Linux. LibreSSL doesn't. OpenSSL runs on Windows. LibreSSL doesn't. Removing 90k lines to strip the code base of Linux or Windows compatibility is not a source code optimization, nor does it have anything to do with security.

Leaving in legacy code because 'well you know - busy' doesn't cut it when you're charging commercial clients for the code or to support the code.

Charging for open source code? Since when did that happen?
Charging to support the code, I think, may be a grey area.

Yes, they offer support contracts, but the money from those goes entirely to the people directly providing the technical support services and to current active OpenSSL team members.

IMO the OpenSSL Software Foundation has been severely underfunded. Whether that is down to bad management when it came to acquiring funding, or to a lack of support from the larger community, is difficult to know. Looking on the OpenSSL web site at who has helped fund the project shows a very small list of just four companies; I personally would have expected that list to be filled with some notable names.

Let's not forget that one of the 'supported' platforms the OpenBSD team removed was big-endian x86.

Note: the x86 family is little-endian.
Note: it's not actually POSSIBLE to make a processor that is both big-endian and x86-compatible.

It takes a certain kind of special to implement support for an imaginary mirror-universe version of one of the most ubiquitous processor architectures in the world and insist there's actually a reason for this to exist.
Whatever their programmers were smoking, I want some of it.

What's the issue with only supporting OpenBSD at the moment? On their website, they state:

Quote:

Originally Posted by libressl.org

Multi OS support will happen once we have:

Flensed, refactored, rewritten, and fixed enough of the code so we have stable baseline that we trust and can be maintained/improved.

The right Portability team in place.

A Stable Commitment of Funding to support an increased development and porting effort.

Surely it's better to strip back to one system, make sure that's as stable and bug-free as possible, then extend to other systems? LibreSSL is, after all, part of the OpenBSD project, so it makes sense that they would support that first.

Unless, of course, you're worried about further forks by other teams to support their own preferred OS, leading to a whole different mish-mash of OpenSSL interpretations?

What surprises me is that giant companies like Google and Facebook that apparently use OpenSSL to secure their services weren't doing their own audits. If you are that big and well resourced, and are relying for critical security functionality on an external project, shouldn't you be putting some effort into ascertaining that the external project is actually providing you with security?

Ideally the big guns would collaborate on this, or perhaps put the resources into ensuring the OpenSSL foundation was up to the job, but in lieu of either of those things surely they should at least be doing some rigorous internal testing and code audits?

A distribution contains a lot of packages, and that's a lot of lines of code. One of the perceived 'benefits' of open source is that an organisation or individual can take advantage of that prior work (they don't have to reinvent the wheel) and reduce their costs. As more machines use that software, the consequences of mistakes or poor design decisions in building it have a greater effect. The fact that this problem was discovered at all (even after two years, which is better than never) shows that companies are (gradually) realising that they have responsibilities to contribute to and maintain that body of code (or pay someone else to do so).