Nope, and this is a pretty good reason to bash Linux... we could have had a decent alternative to Windows if all those man-hours had been used on something proper, rather than what began as a Minix offspring >_<

Gerry Weinberg said it best: Things are the way they are because they got to be that way.

Torvalds didn't set out to change the world, or prove anything, when he wrote his first Minix-based kernel. He wrote the system he could write. And everything followed from that.

Anybody could have changed the roadmap of Linux at any time by just going back and rewriting that kernel "properly."

But they didn't. They took what they were given and built upon it.

Linux isn't really a designed system. You can't design in a group because a true design is the result of one person's vision and thinking. GNU/Linux is an evolving and emergent OS. It uses a Darwinian model of successive refinement and improvement. Those parts of it which work the best (or are hardest to replace) are what survives.

Not terribly efficient or pretty. There's always bloodshed where Darwin's "survival of the fittest" holds sway.

I also think it would depend on what you would want the OS to do. But it does seem that so much of the innovation in the last ten years -- mobile, embedded, server, appliance -- starts with the kernel and peels off whatever is not needed. No need to reinvent the wheel, so to speak. Especially if it scales.

I try to be open-minded about a lot of the criticism directed at Linux. But it's getting increasingly hard to remain civil while listening to people bash a working system without providing anything concrete to show how it could be done better. All I usually hear is how it "sucks" based on some pet theory or a design paradigm that's found in an academic paper - and never anywhere else.

Real world software design and implementation is a very different beast than some pie-in-the-sky whitepaper or PhD thesis.

Look no further than Codd's relational database model for one example. Most (and possibly every) relational database that has ever been coded deviates to a greater or lesser degree from Codd's rules. But while these products may not have the mathematical "purity" of relational theory, they do work. And even Codd reluctantly admitted, in his more candid moments, that the additional complexity of implementing a fully "correct" relational database product would likely outweigh any benefits to be gained by doing so.

The general rule I was taught about software development budgets ran something like this:

For any non-trivial software development effort:

The first 90% of the project will consume 90% of the available time and budget.

The final 10% of the project will consume an additional 90% of the original allocated time and budget.

And while it's probably not a good idea to generalize, I have rarely seen major software projects where that wasn't the case. Especially if they were system-level development projects.

So what I would politely like to ask all the Linux trolls out there is this:

Next time you feel the need to bash the work of others, could you at least have the decency to show us some code you've written (and are willing to contribute) that does anything better than what's currently being used?

Because until you do, it's a little hard for those of us in the Linux world to see such comments as being anything other than "brag and bounce."

And we can scoop that substance off a stable floor anytime we have a craving.

Submitting a correction or a replacement piece of code garners more credibility, and yields more benefit to everyone than sniping ever will.

Just out of curiosity, what exactly are today's operating systems lacking that major innovation is called for? I'd agree there's always room for improvements in clarity, efficiency, and speed. But on a fundamental level, what needs to be changed? And as long as we're sticking with a von Neumann architecture, what really can be changed?

If we had true parallel processing it might be a different story. But building transputers for desktop and general servers doesn't look to be in the cards any time soon despite the fact we have known how to build them for something like 40 years. Danny Hillis's brilliant Connection Machine is the only parallel system I'm aware of that actually got some traction. But even with all the excitement and press it got, it still only saw limited deployment on some extremely specialized projects.

Which raises the question: How often is parallelism really called for?

Right now it looks like the old-fashioned "VN" architecture tricked out with some fancy hypervisor to provide virtual machine environments and limited (as in semi-faked) parallelism is where it's gonna go. And that's mainly because it's good enough for what we need it for.

And as long as the chips keep on getting faster (and less expensive) - does it really matter? Hardware development costs are cheap when compared to software development expenses. Software costs don't benefit from economies of scale the way hardware does. Nor does prior product experience help that much in holding costs down. Software 'reuse' continues to be an elusive goal despite two decades of OO programming practices. Most system software - and virtually all "breakthrough" applications - are written from scratch because it's still more efficient to do it that way.

And most times it's more prudent to run your "old but working software" on a faster machine than it is to try to improve the code beyond a certain point.

The technological potential we have today is so much greater than it was in the 70s and 80s when UNIX and Windows were designed. And back in those early days, almost every electronic device had its own custom OS, written specifically for that device. And history shows that we don't use something because it's the best. In fact, like you said, we often resist changing to the best and just stick with what we know works.

I'm not in a position to say what we need or what would be better, but I do believe that we're not going to find anything if we're not even looking anymore.

I just recently watched a video by Douglas Crockford that briefly (about an hour and 45 minutes) summarized the history of computing. Time and again he demonstrated how old conventions were cherished and fought for while better methods were outright rejected for the better part of 20 years.

Even to this day we're arbitrarily and artificially constrained by hardware limitations that no longer apply. Ever heard of the 80-character limit in programming? That's a holdover from when you would write your code on punch cards and feed them through the machine. Each card only had space for 80 characters. We haven't had that physical, hardware limitation for decades, but the tradition gets passed down as "good coding practice" in software. And bash/terminal/command prompts and even old editors like vi still use 80 characters as a limit.
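You can see the punch-card width still baked into standard tooling today. Here's a quick sketch (assuming a POSIX shell with fold(1) available, which defaults to wrapping at 80 columns):

```shell
# fold(1) wraps lines at 80 columns by default -- the width of a punch card.
# Build a single 100-character line, then watch it split at column 80.
line=$(printf 'x%.0s' $(seq 1 100))   # 100 x's, no newline
echo "$line" | fold | wc -l           # 2 lines: 80 chars, then the 20-char remainder
```

Many terminal emulators likewise still open at 80 columns by default, which is why so many style guides keep recommending it.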

I just think it's time for some fresh ideas as to what an OS can be. I'm not saying I have any good ideas myself, but it'd be great to have some people working on it.

Half the people that are responsible for what we're using didn't know that much about computer technology when they went out and changed the world.

Probably their greatest advantage was they didn't know the "correct" way to do things. And as a result, they weren't hampered by the "fact" that what they wanted to accomplish was "impossible."

The point you made when you said: "I'm not in a position to say what we need or what would be better, but I do believe that we're not going to find anything if we're not even looking anymore." is an absolutely valid argument. One the computer world could benefit from if they remembered that a little better than they have these last 10 years...

So I hope you don't think I was trying to be confrontational, or attempting to put you down in any way, with my previous question. I was genuinely curious as to what you had in mind. That, and maybe a bit of hope you thought of something that had the potential to kickstart a whole new approach to OS design.

There's a story that's told about the early days of atomic energy research. Seems that when ol' Father of the Atomic Bomb Bob Oppenheimer was teaching advanced physics at UC Berkeley, he'd sometimes throw a complex problem up on the board towards the end of the class. He'd then invite his students to hang around and try to solve it. Some of the students would usually end up sticking around to try their luck, tackling the problem from various angles - but always without success.

As the hours went by, the group would slowly dwindle in number. Eventually only a small cadre of the absolute top students remained. They would continue trying (and dismissing) everything they came up with until finally whoever was the acknowledged "top dog" at the gathering would shake his head and say something like: OK Oppie! We give up. What's the correct answer?

Oppenheimer would beam at them like a proud parent, and then say: "Nobody knows."

When someone would invariably ask what was the point of doing such an exercise if nobody knew the correct answer, he'd reply: Because it's an important problem that needs to be solved. And I'm sure somebody will solve it eventually. It's just that it might have been us here today.

So who knows? Maybe you'll have the breakthrough insight all the "pros" are missing.