Posted
by
timothy
on Sunday May 18, 2008 @12:12AM
from the point-five-year-plan dept.

Nic Doye writes "Dag Wieers responds to Mark Shuttleworth's recent request to ask major Enterprise Linux distributions to synchronise releases, claiming that it 'is no more than a wish to benefit from a lot of work that Novell and Red Hat are already doing in the Enterprise space.' He's confessing to playing Devil's Advocate here, but it is an interesting view from someone with a large amount of experience in the Red Hat/Fedora/CentOS space."

claiming that it 'is no more than a wish to benefit from a lot of work that Novell and Red Hat are already doing in the Enterprise space.'

Red Hat has not provided a consumer desktop distribution in over 5 years. It used to be that most newcomers were introduced to Linux via Red Hat. I would wager that today most newcomers are introduced to Linux via Ubuntu. When those people who are introduced to Ubuntu have an opportunity to influence decisions in the enterprise, I would expect that many (or most, depending on the environment) are recommending RHEL because of the tremendous brand recognition within the IT world. (I know that Red Hat is not the only game in town, but they are far more prevalent in the enterprise than any other distro.) After all, "it's all Linux."

So, I would say that Red Hat has already benefited from Ubuntu's runaway popularity in the space that Red Hat vacated 5 years ago. What's wrong with a little reciprocity?

AFAIK, the biggest reason Red Hat changed the name to Fedora was to eliminate brand confusion with RHEL.

It's not a good business decision to have two similarly labelled products out, especially with software when that usually indicates that one is crippleware. Long after the switch to Fedora there were still stores selling Red Hat 9 because they were confused by the whole Fedora/Red Hat/Red Hat Enterprise Linux thing.

That's what I don't understand about the name change... unless RedHat intentionally wanted to re-brand Fedora as inferior. They couldn't block 'freeloaders', so make the *free* version seem inferior and suddenly 'poor' people would rather pirate RHEL, download CentOS or go to another distro. Give people more credit, especially those trying Linux for the first time.

That's what I don't understand about the name change... unless RedHat intentionally wanted to re-brand Fedora as inferior.

Red Hat said that they are abandoning the desktop market, as it is not profitable. Fedora is not Red Hat, and Fedora is not being abandoned. Fedora is a bleeding-edge testbed for what will be in the next RHEL. That's why there are over 100 MB of updates every week. Just don't run yum update for a week and see it!
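For the curious, there is a way to see how much has piled up without actually applying anything. A minimal sketch, assuming a stock Fedora box with yum available (the grep pattern is just a rough heuristic for package lines):

```shell
# List pending updates without installing anything.
# yum exits with status 100 when updates are available, 0 when there are none.
yum check-update

# Rough count of packages with pending updates after a week away
# (-q suppresses the header noise; package lines start with the name):
yum -q check-update | grep -c '^[[:alnum:]]'

# Apply them only when you are actually ready:
# yum update
```

On a bleeding-edge distro like Fedora the count after a week can be startling, which is exactly the point the parent is making.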

The problem with abandoning the desktop, in my opinion, is that many new Linux users are first exposed to Ubuntu. When they go to install a server they will then use either Ubuntu Server or Debian. RPM will be foreign to them.

This "abandoning the desktop" talk really annoys me. "Desktop" is just a buzzword, all the while Red Hat maintains and writes half of GNOME and desktop software. Where do you think NetworkManager came from, or PulseAudio, the suspend features in GDM, the UI? Who funds and built freedesktop.org? Just because they don't have a sticker that says "desktop Linux" doesn't mean they abandoned it. Look up Red Hat's contributions http://fedoraproject.org/wiki/RedHatContributions [fedoraproject.org] sometime.

It's quite simple really: they don't want fresh OSS software to be associated with the Red Hat brand. Fedora will have bugs and will be considered "unstable" by many who are looking for no noticeable bugs in their OS. If Fedora were called Red Hat Desktop, people would go around saying "I tried to install Red Hat and the install failed"; they won't differentiate Red Hat Desktop from Red Hat Server in mindshare, and Red Hat would lose its brand as a stable, serious company. This way I get my fast-moving OS and I know what it is, yet newbies won't start branding Red Hat as a P.O.S. because it didn't install on their eMachine.

So Ubuntu desktop rules the roost because Red Hat forgot its roots? Would anyone go as far as to say that Fedora, according to Red Hat, is neither 'stable' nor 'supported'? I just found this; I haven't been on the RH site for a number of years: http://www.redhat.com/rhel/desktop/ [redhat.com]

Soo... quantaman, what is wrong with Fedora? Seems like an excuse to proprietise an OS without actually taking the code away from those that wrote it. If they just wanted to sell services, surely they'd just offer Fedora w/ paid support?

Does that not go against the whole 'windows in the home, windows in the workplace' rhetoric that keeps getting thrown about, though? I'm not saying you're wrong, but Ubuntu enterprise is hardly losing money. I'm just kinda nostalgic and sad, I guess, that Ubuntu is basking in the former 'Linux' glory of the Red Hat of yesteryear.

I do believe in different codebases for each application - different strokes for different folks and all that but going so far as to completely distance yourself from the free community j

claiming that it 'is no more than a wish to benefit from a lot of work that Novell and Red Hat are already doing in the Enterprise space.'

Red Hat has not provided a consumer desktop distribution in over 5 years. It used to be that most newcomers were introduced to Linux via Red Hat. I would wager that today most newcomers are introduced to Linux via Ubuntu. When those people who are introduced to Ubuntu have an opportunity to influence decisions in the enterprise, I would expect that many (or most, depending on the environment) are recommending RHEL because of the tremendous brand recognition within the IT world. (I know that Red Hat is not the only game in town, but they are far more prevalent in the enterprise than any other distro.) After all, "it's all Linux."

So, I would say that Red Hat has already benefited from Ubuntu's runaway popularity in the space that Red Hat vacated 5 years ago. What's wrong with a little reciprocity?

Insightful is deserved. Own the desktop at home, and you will drag Linux into the enterprise. Something Red Hat and Novell have missed completely. If they continue to do so, many might just drag in Ubuntu... I would and will.

If anything, they should put out a home distro cheap and capitalize on Vista's shortcomings.

Is there something I'm missing completely here, or are the comments above complete non-sequiturs?

Neither distro you mention, IMHO, is targeting home users in the way that Ubuntu is. You don't see friendly smiling people holding hands, one or two clicks to download, plain English on the front page and so on, to the degree that Ubuntu's homepage has it. You don't get offered free discs (I got 5 once, left them on the coffee-room table and after two months half the department was using Ubuntu).

Opensuse.org: Nice front page, three options - I clicked download - then I look at a complex table and it fails the WifeTest(TM) dismally.

Fedoraproject.org: When I did my WifeTest(TM), she went to fedora.com, then fedora.org (nice pictures of Mario but no distro). Then we found the site and again, she doesn't know what a freakin i386 is. "I have a laptop, does it say laptop?", she says.

Ubuntu.com: she guessed the right domain, clicked download after looking at the screen for a few minutes, then figured "I must have a standard computer" and started downloading. WifeTest(TM) said she would have bought or requested free CDs except she knew I could burn an ISO for her.

They are good, I agree with you - no worse than Ubuntu, probably. But marketing is everything when like is against like.

try this lot [microsoft.com].
Their distro still seems fairly popular, though.

Tell me about it; I've been trying to get this one working for ages. Firstly, my friend tells me it's only available on BitTorrent, instead of just downloading it from their site. Which is a bit weird, but whatever.

Where I'm really having trouble is with the package manager. How do I add a software repository in that add/remove programs thing? It doesn't seem to mention what type of packages are compatible with it either. I'm guessing RPM or DEB, but which is it? Maybe someone could enlighten me?

To be honest, I'm about to give up on this Windows thing, it's just not ready for the mainstream.

Then we found the site and again, she doesn't know what a freakin i386 is.

Fedora is a fairly geeky distro. I use it and like it. However, when my non-tech sister wanted to try Linux, I got her Ubuntu. I still have to help her a little, but for the most part she can handle it herself, which I wouldn't expect with Fedora. Different distros designed for different people. Fedora's a geeky test bed, Ubuntu's for Windows refugees. Gentoo, of course, is for gamers. Pick the one that's right for you, whic

I have Fedora on one system because it handles one scenario more easily than Ubuntu: on x86_64, having to install third-party 32-bit software. Other than that, the system is frustrating:

-Their 'releases' seem to mean little. They don't stick to the major revisions of software (a Fedora 8 box updated the kernel to 2.6.24 and Pidgin to 2.4, for example). As a result, third-party drivers can exhibit different glitches or not work at all even during a routine update. Pidgin changed its UI and initially started crashing a lot for me when they went to 2.4. No matter what Fedora path you take, you are subjected to the bleeding edge across the board, not just in the areas you are intrinsically interested in.

-They have no interest in helping users have a convenient time with binary software, i.e. it's annoying to install Flash or the NVIDIA or ATI binary drivers. It's one thing if the OSS alternatives are remotely comparable, but they simply are not at this point. ath5k when first adopted was nowhere near good enough for common usage. The nv driver is a waste of paying the NVIDIA premium. Ditto for the open-source ATI driver until those efforts see fruition. And the open-source implementation of Flash is getting closer, but is still far removed from a viable alternative.

All in all, Fedora feels to an extent like crippleware and a rolling beta. Knowing explicitly that as a user you are little more than a free tester for RedHat's for-profit endeavor is annoying. If I were interested in a specific major increase of a package such that I didn't want to wait a few months for the next distro rev, I'd download it myself.

Ubuntu's releases are not perfect (the Hardy scheduler annoyance is a good example), but the complaints are far less severe and I know when an update might require work. I'm too lazy to have to deal with a major change at a random time. It's the reason why I stopped using Gentoo after a couple of years.

Sorry to rant, but the implication that Fedora is 'geeky' and Ubuntu is not rubbed me the wrong way.

Sorry to rant, but the implication that Fedora is 'geeky' and Ubuntu is not rubbed me the wrong way.

Rant away! That's part of what Slashdot is for!

The reason I call Fedora "geeky" is that it is, as you say, a rolling beta for RHEL. It's not stable, it's not supposed to be and it never will be. Although its marketing people don't like to admit it, Fedora is bleeding edge. That means it's going to take more work from the users to be productive than it would in a distro that's not changing as fast. I see it as a distro for geeks who like playing with their systems and want to have the newest versions of everything, whether they're really ready or not.

As far as getting mp3 support, and other things like that, I agree with you, but I understand their POV. They want to put out a distro that's free of patent, license or other legal encumbrances, and let the user add those difficult programs on their own. I'd rather they were less stiff about it, but they have strict principles and I'm not going to complain about their sticking to them.

Last, I say that Ubuntu isn't geeky because to me, at least, it's designed for people escaping from Windows. It's easy to install, it brings across your Windows Documents if you ask it, you don't need to remember a root password, and for the most part, It Just Works. Some things that are easy to do in other distros seem to be impossible, such as booting into a fully working system at init 3, but that's probably because the average Ubuntu user will never need to do that except in an emergency, so init 3 is set up for repair only. I know that a Windows user with no understanding of Linux can install and run Ubuntu because I've seen it done. I'd not ask that same person to try it with Fedora!

How's that wife test when she pops in her favorite game CD? Does Ubuntu get "confusing" again? I guess Ubuntu abandoned the desktop market 'cause it's not idiot-proof, just like Fedora and SUSE.

Point I'm making is a nice webpage doesn't fix everything for desktop users. How about the Ubuntu devs start contributing the kind of code SUSE and Red Hat do to GNOME and KDE respectively? Does everyone go around saying Ubuntu abandoned the server market 'cause they have hardly any kernel contributions? Fedora is fo

Point I'm making is a nice webpage doesn't fix everything for desktop users.

Yes, that is true, except she doesn't play games (except the default ones that come with GNOME) but browses the internet, chats, plays music, does email and writes her thesis. Yes, she hates the chat facility, and so we go to the MacBook and use iChat, which totally rules over the competition.

And I do agree, Linux generally has holes when it comes to things that matter for end users.

The WifeTest(TM) does show that Ubuntu is a royal pain for some things (but all the others are too, as it happens), mainly

I know this is just anecdotal evidence, but my girlfriend recently got a M1530 from Dell, which came preinstalled with Vista. She decided she didn't like Vista and wanted to try Ubuntu (since she sees me using it and was curious). She downloaded the ISO, grabbed one of my blank CDs, burned it, put it in the drive, installed it through their Windows-based setup (not wubi), and was set. The only involvement that I had in this (indeed, this was also the first time I knew she was going to try Ubuntu) was when sh

Insightful is deserved. Own the desktop at home, and you will drag Linux into the enterprise. Something Red Hat and Novell have missed completely. If they continue to do so, many might just drag in Ubuntu... I would and will.

If anything, they should put out a home distro cheap and capitalize on Vista's shortcomings.

Let me know when you get Ubuntu hardware certified and supported with someone like Dell, HP, IBM, Sun, etc. Oh, and certified and supported software like the
Red Hat Software Catalog Browse by Company [redhat.com]. Until then, Red Hat is probably going to stay on top.

Red Hat has not provided a consumer desktop distribution in over 5 years

Only if you don't count Fedora Core, the free version of Red Hat that is still worked on.

When those people who are introduced to Ubuntu have an opportunity to influence decisions in the enterprise

You'll find a lot of items that are in Fedora Core make it into RHEL, which in some ways makes it into the enterprise. But with the way business works, do you think that end users (consumers) are really going to have a say in what gets put in / taken out of an enterprise-level operating system?

popularity in the space that Red Hat vacated 5 years ago

They never left; they just made two distinct products so people wouldn't have brand confusion.

They never left; they just made two distinct products so people wouldn't have brand confusion.

But what they did just caused more brand confusion. The very fact that you have to make the above statement on Slashdot is proof of that.

They could have separated the products without abandoning the name Red Hat, like calling it "Red Hat Free Desktop" or something. Totally removing their name from the product gives a very clear signal that they want to distance themselves from the product.

Yeah. Or they could have just, you know, made the "free" version exactly the same, software-wise, as the "paid" version; with the difference being that the free version doesn't include the printed manuals and the telephone support hotline.

RTFA. (Not like I do all that often, but in this case Dag is a pretty wise head in the arena, and the linked article is pretty short, with a perty picture in the middle to make it all clear.)

It's probably the best high-level description of just what is involved in long-term support for OS distributions/releases, particularly during the overlap periods of multiple distinct releases of any one distribution.

Fedora is great; I've used it on my desktop since it was called Red Hat 6.0... but god, keep it away fr

I think that's an excellent point. I'll take it one step further and suggest looking at how long it took Red Hat to reach the point they're at in the enterprise, then look at how quickly Ubuntu is moving into that same space. Along with that, Ubuntu has the end-user version, which, as you've pointed out, pulls in many people who might later be making enterprise decisions. It's quite possible that in the not too distant future, Red Hat may find they're choking on the dust from Ubuntu. If that happens, that

You seem to be confusing popularity with 'a business that pays our salaries and makes money for our stockholders'. It's a different model: Red Hat is in a good position to incorporate new features from Ubuntu developers into their Fedora, then their RHEL releases, in a manageable and tested way. This helps avoid exactly the OpenSSL/OpenSSH key craziness that just happened to Debian and Ubuntu, because someone got careless migrating the code and commented out an important piece. And avoiding that craziness is e

It all reads to me like Canonical has realised just what is involved in real LTS and is begging Red Hat to provide a common source base for them to leach ^h^h^h^h contribute to, where there is less differentiation between distributions.

Certainly, Ubuntu has gone from zero to hero in a fairly short time, but reading between the lines here, I think the more likely end game is that a future Ubuntu will hitch its wagon to the Red Hat sources rather tha

A consumer desktop? That's what it takes to be a contributor?
Let's take a look at RH's opensourcing of jboss, or check the kernel commit list for @redhat.com email addresses. What about the environmental tools spawned from RHEL, such as func, cobbler, and others?
Then let's look at what folks like Ubuntu have given back. Sure it's a useful and flashy desktop. What project have they opensourced recently? Where's their contribution back to the community, other than their product?

Dag Wieers is known to just about every user of Red Hat Enterprise Linux and CentOS, because he and a few other people provide a ton of 3rd-party packages that make life more bearable. See: http://dag.wieers.com/rpm/packages.php

He's also one of the people behind rpmforge, which tries to make a unified repo of 3rd party add-on packages. Previously there were a number of incompatible (dependencies and so forth) repositories like atrpms. Dag's work benefits all of us who use RHEL on a regular basis.

I'm assuming that Shuttleworth proposed that every enterprise distro synchronize the release versions of certain core packages like glibc, mysql, gcc, etc, so that it will be easier for vendors to target linux distros with their software releases. In theory it's a good idea, but not everyone has the same idea of what's important and what the right version to release is.

He's also one of the people behind rpmforge, which tries to make a unified repo of 3rd party add-on packages. Previously there were a number of incompatible (dependencies and so forth) repositories like atrpms. Dag's work benefits all of us who use RHEL on a regular basis.

You forgot to mention that the whole reason that there is an rpmforge is that Dag and co. refuse to operate under EPEL / Fedora's rule: Don't introduce packages that are already in the main repository. As a result, Dag's archive and rpmforge will conflict with the base distribution or EPEL on some packages. Once in a while, I'll grab a spec from Dag and rebuild packages for RHEL/CentOS, but as a matter of policy I don't allow rpmforge repositories to be added to any of my systems. His work does make my life easier. Technically. From time to time. However, suggesting that there are no longer incompatible repositories gives him too much credit, I think.

Well... as one of the lead developers for CentOS, let me tell you that Dag is MUCH more CentOS-friendly than EPEL.
Users are free to choose which repositories to use... BUT... don't confuse Red Hat's corporate interest with good policy. EPEL does not put conflicting packages in EPEL because Red Hat will not allow it, and not for any other reason. This isn't bad; CentOS would not exist without Red Hat... but you mischaracterize this issue. Also, RPMforge and ATrpms existed for years before EPEL started, and in fact the reason Dag and others are not members is because EPEL demanded that all the current groups in this space stop what they were doing and instead do what Fedora determined was the proper course.
Also... there is a package called yum-priorities that allows you to prevent having core packages updated, if you want to take that approach.
The CentOS Project supports Dag (and ATrpms as well) in forming a new 3rd-party repo called rpmrepo.
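For reference, the yum-priorities approach mentioned above looks roughly like this. This is a sketch, assuming a CentOS 5 box with the yum-priorities plugin installed; the repo names match the stock files but the priority values are illustrative:

```ini
# /etc/yum/pluginconf.d/priorities.conf -- enable the plugin
[main]
enabled = 1

# /etc/yum.repos.d/CentOS-Base.repo -- protect core packages
[base]
priority=1

# /etc/yum.repos.d/rpmforge.repo -- third-party repo gets a lower
# priority (higher number), so it cannot replace a package that the
# base repo also ships
[rpmforge]
priority=10
```

Lower numbers win, so with this config a conflicting rpmforge package is simply ignored in favor of the base distribution's version.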

What you fail to mention is that RPMforge predates Fedora and EPEL by a few years. Between 2002 and 2007 (EPEL) I attracted millions of users with my RPM packages, packages that existed *before* Fedora came into play. Repositories did not exist back then in the RHEL/Fedora world as they exist today.

When Fedora started I was very interested in helping out (read those lists), but nobody within Fedora cared about the millions of Fedora/CentOS/RHEL users I provide packages to, and Fedora Extras did not want to support the RHEL/CentOS users at its conception.

Only in 2007 did they start to care about RHEL/CentOS users, mostly because Fedora itself is using CentOS for its infrastructure. By that time the Fedora packages were already incompatible with RPMforge packages.

So tell me, what did *I* do wrong here, except caring for my userbase where Fedora didn't?

If the Fedora project wants compatibility, why are they expecting the work to be done by two individuals? I certainly cannot spend that extra effort.

(And really, not reading the article is fair enough, but to just flap keys saying "too lazy to read, but here's what I think..."? Someone really needs to hurry up and invent that thing that lets me stab people in the face through the internet [bash.org].)

'is no more than a wish to benefit from a lot of work that Novell and Red Hat are already doing in the Enterprise space.'

odd, it was my understanding that GPL'ed software was supposed to be used, not just by a few. I do understand his concern that Canonical and others should be contributing more useful software to the code base that is available, but whining every time some distro uses the code that is available, adds to it and becomes popular is very, very unproductive.

odd, it was my understanding that GPL'ed software was supposed to be used, not just by a few.

No, it's supposed to be free, as in you can do what you want with the code. If Novell (or whoever) wants to distribute new code to the masses on the seventeenth day after the first new moon of the year, that's up to them. Why should they want to follow some other team's release schedule?

Red Hat, Novell, Debian Foundation, and Canonical should not be constraining each other to that extent, or we'll just wind up with a big bureaucratic mess.

RedHat does work on all levels of the GNU/Linux stack - kernel, compiler, c-library, gui libraries, apps. That means that if RedHat wants a feature (say SELinux) they can coordinate across projects rather than waiting for the right stuff to show up in repositories.

And don't kid yourselves, this is a huge competitive advantage that Ubuntu doesn't have.

Yes. Shuttleworth would benefit from synchronized releases. If there wasn't some advantage for his project, he wouldn't have suggested it. What he's suggesting is that everyone else would benefit too.

Sure, Red Hat puts a lot of effort into hardware-support backports. But if Ubuntu, Debian, Novell and Red Hat all standardized on the same kernel releases for their six-month release cycles, then hardware vendors would have one platform to target instead of four. That might very well increase vendor cooperation - even to a sufficient extent that Red Hat would get better hardware support than they have now with less investment.

From what I've seen, hardware vendors only target Novell and Red Hat right now, and Ubuntu and Debian are afterthoughts. And frankly, the hardware vendors don't do a very good job of targeting those distros anyway. I'm in a huge enterprise shop and we're always scratching our heads, trying to figure out how to make the latest hardware work in a supported way now when the SW vendors are saying "Yeah, that's available in the kernel now, but it'll be a while before we officially release & support it." We ask the HW vendors about official support from the distro, and they say "Isn't this supposed to be open source? Can't you just build a new kernel that supports this, with these drivers we'll give you?" They don't seem to understand that enterprise shops don't get support from the major distros for custom kernels. Then Sun jumps in every once in a while and says they're going to release their own distro that follows their own (x86) hardware releases, just like their SPARC line, but then they fall behind in releasing hardware because it's waiting for the distro... and so it goes. GAH!

We have to figure out how to tame the chaos. Enterprises are shying away from Linux now because of the churn. All the value that is gained by using cheap x86 hardware is lost in the Engineering churn. I think vendors just talking to each other would solve half the problem. I don't know what the rest of the solution is.

"we're always scratching our heads, trying to figure out how to make the latest hardware work in a supported way now when the SW vendors are saying "Yeah, that's available in the kernel now, but it'll be a while before we officially release & support it.""

This is a kernel architecture deficiency; it shouldn't be necessary to recompile a kernel just to use new hardware, ever.

No, this is not a monolithic vs micro kernel argument at all, this is about upgrading drivers and what that requires in Linux. The Linux kernel already has the architecture for adding drivers at runtime as modules, just like OS X, just like Windows, etc. The difference is, on Linux you can't install new drivers easily if at all without backporting large amounts of code, like Red Hat apparently does for their customers.

Your only options are to try to compile new driver code against the running kernel's headers, which doesn't usually work because whole subsystems have changed or are entirely missing, or to rip out the entire kernel for a new one, which doesn't happen unless you do it yourself by compiling mainline from source, something IT shops aren't likely to do.
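For context, building an out-of-tree driver against the running kernel's headers typically looks like this. A sketch only: "mydriver" is a hypothetical module name, and it assumes the matching kernel-devel/headers package is installed (recipe lines must be indented with tabs):

```makefile
# Kbuild-style Makefile for an out-of-tree module.
# obj-m tells Kbuild to build mydriver.ko from mydriver.c
obj-m := mydriver.o

# Build tree of the *running* kernel -- this is exactly where version
# mismatches bite when in-kernel APIs have changed underneath the driver.
KDIR := /lib/modules/$(shell uname -r)/build

all:
	$(MAKE) -C $(KDIR) M=$(CURDIR) modules

clean:
	$(MAKE) -C $(KDIR) M=$(CURDIR) clean
```

When the in-kernel interfaces the driver was written against have drifted (renamed fields, reworked subsystems), this is the step that fails, and bridging that gap is the backporting work the parent post describes distro vendors doing.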

Look at the example I quoted: they are saying new drivers got added to the newest kernel, but because of the way the kernel works, large amounts of developer time are needed to get new drivers working on existing systems.

This is quite obviously a problem, but the kernel devs seem opposed to the idea of a stable module ABI; there is even a file in the source tree which says something like "you think you want a stable module ABI, but you really don't" - it's like a Jedi mind trick. I understand perfectly well the implications of supporting a stable module ABI, but it's necessary in some cases.

The danger is that if the kernel ABI was stable, then the hardware manufacturers would think they were able to get away with releasing drivers only as binary blobs, without Source Code. This of course is highly undesirable. It also raises the nightmare possibility that repairing a deeply-embedded, totally-overlooked yet potentially fatal bug could cause major breakage. (XP SP2, and Vista UAC, I'm looking at you.)

If you want a stable ABI and binary-only drivers, then fork one of the BSDs. Hell, you can even cage the Source Code up and release the whole kernel binary-only. Recompiling something occasionally is a price I'm quite willing to pay for software freedom.

The danger is that if the kernel ABI was stable, then the hardware manufacturers would think they were able to get away with releasing drivers only as binary blobs, without Source Code. This of course is highly undesirable.

Not really; it's exactly what happens under Windows, and I hear their hardware support is good. But I do agree that a stable ABI is a terrible idea; IIRC the kernel guys have explicitly said this will never happen.

Not really; it's exactly what happens under Windows, and I hear their hardware support is good.

Well, that's what Microsoft will tell you. But then, Microsoft actively persuade hardware manufacturers not to mention that their products work fine with other OSs.

Try finding Vista drivers for a 10-year-old scanner that works perfectly under Linux (despite only ever having been shipped with a driver for '98 and a crappy one at that) and then tell me with a straight face that Windows has the best hardware support.

"The danger is that if the kernel ABI was stable, then the hardware manufacturers would think they were able to get away with releasing drivers only as binary blobs, without Source Code."

Wow. So, you do realize the LICENSE the code is under prohibits this from happening, right? As in, they are already not supposed to do this, and distributing the kernel with binary-only drivers breaks the license.

Are you saying the license isn't enough, that its necessary to also cripple the driver interface so that anyone

We have to figure out how to tame the chaos. Enterprises are shying away from Linux now because of the churn. All the value that is gained by using cheap x86 hardware is lost in the Engineering churn. I think vendors just talking to each other would solve half the problem. I don't know what the rest of the solution is.

Not quite sure of that. A Fortune 500 company I know has ceased new orders for Microsoft and is investing in a Linux desktop. It is at the tender stage, where if the CIO gets a massive pricing cut the program could be nixed and not unixed.

Microsoft is under severe pressure to get its pricing down and quality up. If they falter much more, knowing Linux will be the next must-have skill. And for those that know Linux, getting Ubuntu, Red Hat and SUSE working together is much easier than an NT-to-AD migration, plain and simple.

Just push OpenOffice and Firefox to the desktops first: nice and immediate MS Office savings and a nice prep for the conversion. And if the MS salesperson says "Linux what?", you say "the OS we are using to replace MS Windows." That gets a pretty hefty discount if you can show you mean business. Your company wins either way.

I'm in a huge enterprise shop and we're always scratching our heads, trying to figure out how to make the latest hardware work in a supported way now when the SW vendors are saying "Yeah, that's available in the kernel now, but it'll be a while before we officially release & support it." We ask the HW vendors about official support from the distro, and they say "Isn't this supposed to be open source? Can't you just build a new kernel that supports this, with these drivers we'll give you?"
[...]
We have to figure out how to tame the chaos. Enterprises are shying away from Linux now because of the churn. All the value that is gained by using cheap x86 hardware is lost in the Engineering churn.

Then you're doing it wrong. I manage a fairly large Enterprise environment, currently 600+ servers, about 1/3 of which are Linux, and we don't have chaos. We've run Linux in our Enterprise since about 1999, so we're not new to this. And we're currently consolidating another 500+ servers from other parts of the Enterprise, most of which are Linux. We haven't had the problems that you describe. Why? Because we work with our vendors. We don't just buy any hardware, or any config, and hope it will run Linux. Instead, we have a process to order hardware, and we do our homework first. When we purchase hardware that needs to run Linux, we specify to the vendor "Must be certified for RHEL5" or similar. So the vendor will only give us a quote for hardware that we know will work in our environment.

And do you know what happens when we do that up front? Things work.

This is easy because IBM and Dell and all the other (major) hardware vendors know that Enterprise IT shops like yours and mine run Linux. So they work hard to ensure Linux works with the hardware they sell. And at least with IBM and Dell (we use them a lot) they will certify their hardware for several key Linux distros. RHEL is a major distro with a lot of third-party software support (Oracle, WebLogic, PeopleSoft,...) so it's often certified first.

Heck, at least in the case of IBM and Dell (and I'm sure with other vendors) you can get your Linux support directly from them. One support center if you have problems with the hardware or operating system. And with their third-party relationships, you can often call the same support center for problems with storage (EMC,..), certain software, networks, etc. (Disclaimer: while this is available to us, we prefer to use Red Hat to support our OS, and the hardware vendor to support our hardware components. This is mainly because it makes purchasing licenses simpler. To get Linux support from IBM or Dell, you need to order your RHEL entitlements from IBM or Dell. As a University, it's actually easier for us to order entitlements separately from Red Hat than to do it as a single purchase through IBM or Dell - alas, that's how our purchasing department works.)

And no, we aren't lagging behind in the latest hardware. When the latest blades came out from IBM, they supported Linux. When the latest multi-core systems came out from Dell, they supported Linux. Everything works great from the moment we take it out of the box. We've never "scratch[ed] our heads, trying to figure out how to make the latest hardware work in a supported way" and we don't compile custom kernels. If that's how you support your Enterprise, you need to re-think what you're doing.

It would be a very reasonable effort to develop a standardized "driver API" for drivers and whatnot. Linus could easily issue an API that is standard, year by year. E.g.: 2007 API, 2008 API, 2009 API, etc. It doesn't have to be by calendar year, but it does have to be CONSISTENT. And makers of hardware could easily write drivers to this API, binary or source, and release these in yum/apt repos, so that any distro could do a quick check for the hardware, and instantly know what repo to go to with a simple lookup.
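A minimal sketch of the lookup this comment proposes, assuming the yearly-API scheme it describes; the PCI IDs, repo URLs, and the table itself are all invented for illustration and correspond to no real vendor or tool:

```python
# Hypothetical sketch: map (driver-API year, PCI vendor:device ID) to the
# vendor's package repo. All IDs and URLs below are made up for illustration.
DRIVER_REPOS = {
    ("2008", "10de:0421"): "https://example.com/vendor-a/api-2008/",
    ("2008", "8086:4222"): "https://example.com/vendor-b/api-2008/",
    ("2007", "10de:0421"): "https://example.com/vendor-a/api-2007/",
}

def find_driver_repo(api_year, pci_id):
    """Return the repo URL for a device, preferring the newest driver-API
    year that is not newer than the distro's own API year."""
    candidates = sorted({year for year, _ in DRIVER_REPOS}, reverse=True)
    for year in candidates:
        if year <= api_year and (year, pci_id) in DRIVER_REPOS:
            return DRIVER_REPOS[(year, pci_id)]
    return None  # no vendor repo known for this device

# A distro built against the 2008 API finds the matching repo instantly:
print(find_driver_repo("2008", "10de:0421"))
# → https://example.com/vendor-a/api-2008/
```

The point of the table keyed by API year is exactly the "consistency" the comment asks for: a distro only has to know which year's API it ships, and the same lookup works for every vendor.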

Nice idea, with only two major problems: 1) API development can't be done for most of the year (depending on how often they put out a kernel); 2) all API changes will be made together, meaning they're a lot more likely to break stuff, and fixing them will be much more difficult too.

There are kernel branches which are stable (bug fixes only), but most people just want the latest and greatest kernel instead, so that's what the proprietary drivers target.

If all the distros standardised on the same kernel, there's a very great danger that hardware manufacturers might turn to releasing binary-only drivers. Then we will all lose out, because we will no longer have absolute control over our own computers.

Maybe the Linux kernel 3.0 will have some cool feature that makes binary-only drivers technically impossible, or maybe we'll see a decompiler soon. Or maybe even, just maybe, the existing law which already forbids binary-only drivers will be enforced. But I [...]

Shhh, Stallman, real-world people are talking here. Not all companies are going to publish open source drivers.

Or maybe even, just maybe the existing law which already forbids binary-only drivers will be enforced.

Then we will all lose out, because we will no longer have absolute control over our own computers.

No, actually, forbidding binary drivers means I lose control over my computer, as I no longer have the choice to install them and I no longer have the choice to use hardware that is not supported by open source drivers. I'm free to do what I want on my computer, and there is no 'law' that prevents me from compiling a kernel with a non-GPL driver in it.

No, actually, forbidding binary drivers means I lose control over my computer, as I no longer have the choice to install them and I no longer have the choice to use hardware that is not supported by open source drivers.

But there won't be such a thing as "hardware that is not supported by open source drivers", because it will be compulsory for manufacturers to release driver Source Code for all their products.

If they decide to "take their ball and go home" (by just stopping selling their products; but that really is a case of cortar el pene para agravar los cojones, roughly "cutting off your nose to spite your face", and unlikely to happen in practice).

But there won't be such a thing as "hardware that is not supported by open source drivers", because it will be compulsory for manufacturers to release driver Source Code for all their products.

What, you're going to make it illegal for a company to make closed-source drivers?

If they decide to "take their ball and go home" (by just stopping selling their products;

No, they could just stop supporting Linux.

but that really is a case of cortar el pene para agravar los cojones, roughly "cutting off your nose to spite your face", and unlikely to happen in practice)

Erm, no. If forced to choose between revealing their closed specs (and possibly infringing on 3rd-party copyright) or losing 1% of their users, most companies will just stop supporting Linux; so it's more like clipping your nails to keep your hand safe.

There are landfill sites full of computer hardware which worked fine until Vista came out, but then no Vista-compatible driver was released.

They all worked with XP though, and some didn't work with Linux at all, so what exactly is your point?

The same goes for independent software projects. By far the largest problem across Linux distributions is integration testing. Basically, quite a few things only work properly if you handpick specific versions of components. Introduce a little variation (like package management systems do) and you are looking at a unique configuration of packages that has never been tested in that exact configuration before. Feature interaction and other package interdependencies can be really tricky to test against.

The current situation of major distributions hand-picking their own versions of packages and introducing distribution-specific patches to them only adds to this problem. And then of course independent software developers further add to the problem by only testing on specific configurations of specific distributions. And we all know what a typical developer's workstation looks like. Few projects have the resources to organize broader integration testing.
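A back-of-the-envelope calculation shows how fast that variation explodes; the package and version counts below are invented for illustration, the point is the multiplicative growth:

```python
# If an application depends on 20 packages and each package floats between
# 3 versions across distributions, the number of distinct configurations a
# user might actually be running is 3^20, far more than anyone could ever
# integration-test.
n_packages = 20
versions_each = 3
configs = versions_each ** n_packages
print(configs)  # → 3486784401
```

Even if only a handful of those packages actually vary, the number of untested combinations dwarfs what any project's QA can cover, which is the comment's point about "unique configurations".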

What Shuttleworth suggests is that merely synchronizing on package versions and release schedules would broaden the scope of integration testing and reduce the amount of mostly non-differentiating and needless variation. Effectively it would unify the integration testing work already done across distributions and projects and raise the level of quality across the whole community.

It's hard to see how this can be a bad thing.

A second point that Shuttleworth makes is that independent projects have their own roadmaps for stable releases. Distributions often have to deal with the fact that a nearly ready version of some component is vastly better than the year-old stable version. That creates a dilemma: ship the old stable version, or let users benefit from loads of useful fixes (that ultimately make the distribution more attractive). Firefox 3 beta 5 in Ubuntu was a good example. Probably a good decision, but obviously the combination of an OS and a browser that were both moving targets at the time cannot possibly have been tested as well as would be desirable for a browser in a major desktop OS.

Wouldn't it be great if Mozilla had known a year in advance that if they pushed out Firefox 3 in early April 2008, it would have made it into Fedora 9, Ubuntu 8.04, and the Slackware and OpenSolaris releases, each shipping the exact same versions of the critical components that Firefox depends on?

Sure, Red Hat puts a lot of effort into hardware support backports. But if Ubuntu, Debian, Novell and Red Hat all standardized on the same kernel releases for their six-month release cycles, then hardware vendors would have one platform to target instead of four. That might very well increase vendor cooperation, even to a sufficient extent that Red Hat would get better hardware support than they have now with less investment.

Red Hat/Fedora don't want that kind of software anyway. If we have a "standard", it encourages everyone to just make binaries that work on Linux instead of encouraging them to give us the source so we can compile it ourselves. See, that's the difference: Shuttleworth doesn't seem to care as much for open code as, say, Fedora, which doesn't include anything questionable. Fedora goes out of its way to make things hard for hardware vendors and software vendors so they will be encouraged to do things our way instead of u [...]

Synchronizing the major distro releases helps to distribute the testing and integration load among the enterprise-supported distros while helping upstream developers by giving them fixed integration deadlines. All of that is good for Linux, and helps to keep distros and upstream vendors doing what they're good at, which enterprise loves. Which raises the question: is Red Hat thinking that growing the enterprise Linux space is harmful to its interests?

Right now the landscape for various projects is really a mess. Everyone kind of has their own release schedule and it's different for every project - and for good reason: we're doing this on our own time and therefore why should we care about ship dates?

Well, realistically we do. If projects knew that every May and every November there'd be major distro releases, they'd probably do a good job of freezing their trees in January and July to prepare point releases aimed at being relatively stable.

In turn, there'd be a nice set of releases that Red Hat could pick from and decrease their QA. Otherwise, it's kind of scattershot what the condition of various projects' trees are in.

Synchronizing them can, though: if all your dependencies' timelines meet up, you don't have to worry as much about staged upgrades, which increases the stability of your software, reduces development time, and lets you focus more on providing additional functionality than on degrading functionality around the absence of a required version of a given package.

"If he can use that same kernel, with the same backports, fixes and regressions tests, Ubuntu LTS does not need to do anything to support the same vendor hardware. Easy, but at the expense of both Novell and Red Hat."

I don't think he needs to be playing devil's advocate. I think what he's saying makes a lot of sense.

When an enterprise buys new hardware, they want the software to "just work" on it. It would be expensive for them to do the work themselves, so they are happy to pay someone else to do it. This is the value-added service that Red Hat gives. This is what an enterprise pays for.

It would be ludicrous to give your *competitor* this service for free *before* you give it to your customer. Sure, once you do the work, others can benefit -- that's part and parcel of free software. But you are allowed (I'm going to even say *expected*) to charge for your services.

Because Canonical and Red Hat are going after the same market, it is inevitable that there will be some overlap of effort. If Canonical wishes to use the work that Red Hat does, they merely have to wait until Red Hat releases.

But what worries me more here is that Canonical seems to miss the point where *creating a working distribution* is a money making opportunity. They seem to see it as a loss leader and they will charge for "support"; where "support" means hand-holding the user. Perhaps I'm wrong. I really hope I am.

Until companies understand that providing solutions and creating capability is the service where all the money is, we're not going to see the explosive growth in Free software that I'm hoping for. I had hoped that Canonical understood this. I still hope it's true, but I'm less optimistic.

then what Shuttleworth is suggesting is the idea of seasons. If everyone can get on the same page a couple times a year, the rest of the time they can go do their migration, vacationing, rewrites, refactoring, day-jobs, etc. If it makes sense for mother nature, it might just make sense for our software ecosystem.

Here is Mark Shuttleworth's insightful response [markshuttleworth.com] when I asked him, "Why would Red Hat cooperate with Ubuntu, especially now that Ubuntu also has its sights set on the server market. Don't they consider Ubuntu a threat?"

I don't think Red Hat would see Ubuntu as a threat; we appeal to different audiences.

Have you ever noticed that competing car dealerships, fast food restaurants and other very similar businesses all set up shop next to one another? You get food courts in shopping malls, for example, where you have all the take-away food places in one area. You would think that the competition would be bad for them. But counter-intuitively, all the restaurants do better when they are all in the same place, and the same is true of car dealerships. The phenomenon is called "clustering", and it works because people first decide to go "looking for a car" and then later decide which car, or which dealership. I think Linux is the same: if we coordinate our releases, we send a very strong message to the outside world that will bring more people to Linux in the first place, making the pot bigger for everyone.

Look, I'm sure Mark Shuttleworth is a great businessman and all, but this analogy falls short of the mark.

Car dealerships and hot dog stands are physical retail outlets. A physical retail outlet benefits greatly from being located wherever the customers are. Therefore, some locations (the ones with more customers) are more desirable than others, and businesses will tend to cluster in those locations.

Since not all types of stores benefit from location to the same degree, the ones that benefit more (such [...]

From my point of view only Ubuntu would benefit from such a synchronized release schedule. Well, I guess then it's best that they change their release cycle to Red Hat's. That's not too difficult to achieve, as RH announces its schedules quite early.

Novell's SLES and SLED are released every 18 months; openSUSE is released every 8 months. Also, SLES and SLED are maintained for 7 years.

Does this mean manpower? Yes, especially for the parts of the distribution where no updates are provided anymore, e.g. where production has completely halted. This has to be maintained by Novell themselves. Just download the source if you so desire and you can copy and paste it into your own code.

How do they do it, apart from editing the code themselves? https://build.opensuse.org/ [opensuse.org]. Hey Mark, if you like, you can download it and put your distributions on it, letting the community handle the security updates. It is able to build complete distributions, so you can build them as often as you desire. Yes, it handles Ubuntu as well.

Maybe I should also write an article so that I can be quoted as "...Bogaboga supports Coordinated Linux release proposal..."

To quote someone else who posted 3 minutes before you:

Dag Wieers is known to just about every user of Red Hat Enterprise Linux and CentOS, because he and a few other people provide a ton of 3rd-party packages that make life more bearable. See: http://dag.wieers.com/rpm/packages.php

He's also one of the people behind rpmforge, which tries to make a unified repo of 3rd party add-on packages. Previously there were a number of incompatible (dependencies and so forth) repositories like atrpms. Dag's work benefits all of us who use RHEL on a regular basis...