"Is Ubuntu an operating system? Last week at EuroOSCON, Mark Shuttleworth gave the closing keynote outlining what he believes are the major struggles faced by the open-source/free-software community. During his talk, it became clear that Ubuntu is trying to achieve a radical shift in the software world. Ubuntu isn't trying to be a platform for mass-market application software: it is trying to be the primary provider of both the operating system and all the application software that a typical user would want to run on his machine. Most Linux distributions are like this, and I think it is a dangerous trend that will stifle innovation and usability."

"Most Linux distributions are like this, and I think it is a dangerous trend that will stifle innovation and usability."

Actually, most operating systems are like this, including OS X and Windows. They aim to provide everything the "typical" user would want. What would you think of an OS that does nothing when you plug in an iPod? Or a digital camera? Or one that can't open a presentation you receive?

That may be called choice for you, but for the typical user it's called a "worthless" OS. And you can still do a minimal installation of Ubuntu and then install packages as you wish (apt-get install xorg, apt-get install gnome-desktop-environment, apt-get install gnumeric, etc.).
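The minimal route above can be sketched in a couple of commands. Package names are the Dapper/Edgy-era ones mentioned in the parent comment and may differ on other releases; the echo keeps this a dry run:

```shell
# Start from a minimal Ubuntu install, then pull in only what you want.
# Drop the echo to actually run the installs.
PKGS="xorg gnome-desktop-environment gnumeric"
for pkg in $PKGS; do
    echo "sudo apt-get install $pkg"
done
```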

The software in Ubuntu, a single-CD distro, is just a starting point. There are two obvious ways for an inexperienced user to install the software of their choice from the Ubuntu and Debian repositories which, last I checked, contained about 19,000 packages. They can use the drop-dead simple "Add or Remove Software" application that appears at the bottom of the "Applications" menu on the panel. Or they can use the more powerful, and still easy to use, Synaptic, in the System->Administration menu.

Beyond that, gdebi is enabled by default. It allows third-party vendors to provide debs which the user can download to the desktop and then double-click to install in a nice, friendly, graphical fashion.
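The same gdebi that handles the double-click can also be driven from a terminal; the package file name below is hypothetical, and the command is echoed as a dry run:

```shell
# Hypothetical vendor package downloaded to the desktop:
DEB="$HOME/Desktop/vendor-app_1.0_i386.deb"
# Double-clicking launches the graphical gdebi; this is the command-line
# equivalent. gdebi resolves the package's dependencies from the repos.
echo "sudo gdebi $DEB"
```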

I would also recommend Jenny Craig to the author of the article, but that might be considered a personal attack. ;-)

"""Note that Ubuntu ships and installs proprietary drivers by default without giving users a choice."""

That is incorrect.

I have a few devices on my machines which are not supported in an OSS way, but do have binary-only drivers. Ubuntu does the best it can on these devices, installing the xorg nv driver for my NVidia card, the xorg radeon driver for my ATI Express 200M, and the rather rickety bcm43xx driver for my notebook's wireless.

In no case has Ubuntu resorted to binary-only modules on install.

Ubuntu (and its associated community) *does* make installing the proprietary bits straightforward for the users who have considered the issues and have chosen to install them.

The statements were taken directly from the Ubuntu website, under the section "Software installed by default". Mark Shuttleworth himself confirmed in a recent interview that Ubuntu installs proprietary drivers by default.

"""The statements were directly taken from the Ubuntu website under the section "software installed by default" . It has been confirmed my Mark Shuttleworth in a recent interview itself that Ubuntu installs proprietary drivers by default."""

Ok. I see where the confusion is being introduced.

Yes, there is a package called linux-restricted-modules which does get installed on some machines.

The thing is... it's not used in the configurations.

NVidia's module may be sitting on the disk. But it is not loaded because Xorg is set up to use the free xorg driver.

If the user chooses to use the proprietary driver, he can. He will have to download the other piece of the puzzle: the nvidia-glx package. The situation is the same in the ATI case.
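The "installed but not used" claim is easy to verify: look at which driver Xorg is actually configured with. This sketch parses a sample config fragment; on a real system you would inspect /etc/X11/xorg.conf and check `lsmod | grep nvidia` instead:

```shell
# Sample Device section as Ubuntu writes it for an NVidia card with the
# free driver ("nv" is the free xorg driver; "nvidia" would mean the
# proprietary module is in use):
cat > /tmp/xorg-device.sample <<'EOF'
Section "Device"
    Identifier "Generic Video Card"
    Driver     "nv"
EndSection
EOF
DRIVER=$(awk '/Driver/ { gsub(/"/, ""); print $2 }' /tmp/xorg-device.sample)
echo "$DRIVER"
```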

It doesn't clarify anything to me. Go read http://www.ubuntu.com/ubuntu/licensing and the section titled "Software installed by default". Note that it says that "binary-only" hardware drivers are installed by default even though they are marked as restricted.

So again, proprietary drivers are in a different restricted repository but installed by default. It might not be used in the configuration. I make no claims about that. It is a fact that proprietary stuff gets installed by default without a choice.

"""So again, proprietary drivers are in a different restricted repository but installed by default."""

Installed but not used. Unless you have a concrete example of proprietary software being used without the user's consent, I simply have to declare this argument as one not worth having.

Does proprietary software sitting on a disk, unused, make the user "dirty"? If a tree falls in the forest and no one is there to hear it, is there a sound? How many angels can dance on the head of a pin?

Many users have said it is configured by default. Whether or not it is, installing proprietary software without the user's consent, in a distribution that claims to support free software, is misleading.

The video card is an ATI FireGL T2.
At the moment I don't have access to the notebook, so I don't know the exact manufacturer of the wireless card, but it needs the madwifi driver.

>I have an ati express 200M in my notebook, and it most certainly gets configured with the OSS driver.

Maybe you installed the system without an internet connection? The Ubuntu CD contains only free software, but Ubuntu enables "restricted" by default in sources.list, and if you have an internet connection during installation Ubuntu will download and install non-free software if it thinks it's necessary.
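What enables this is the "restricted" component in the APT configuration. A typical Dapper-era line looks like the one below (mirror URL illustrative); removing the word "restricted" and running apt-get update keeps APT to free software:

```shell
# The stock /etc/apt/sources.list ships with "restricted" enabled
# alongside "main":
LINE='deb http://archive.ubuntu.com/ubuntu/ dapper main restricted'
case "$LINE" in
    *restricted*) ENABLED=yes ;;
    *)            ENABLED=no  ;;
esac
echo "$ENABLED"
```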

"""Maybe you have installed the system without internet connection?"""

No. Always with the network available.

I've done this with both Dapper (for which there is no 3D support for the 200M) and with Edgy (which has an OSS driver that supports 200M for 3D but is not as fast as fglrx).

While I very *strongly* prefer OSS software, I'm not a zealot about it. If Ubuntu *does* sometimes install and use proprietary software, I'm actually OK with it. It would be interesting to me to know what criteria it used to determine when to do it, though.

I have an ATI 9200 in my Ubuntu box. It was originally installed using the OSS driver and Mesa 3D; I had to install the binary drivers myself. This is using Dapper. It's been that way on every box I have installed, probably about 50 in total, so I don't know why yours would be like that, but I think it may be the exception, not the rule.

Here is an example package: it's got all the non-free binary drivers in it, and it's in the restricted section. It's not installed by default. I promise you that by default Ubuntu does not configure NVidia cards using the "nvidia" driver.

Although Ubuntu gives you the option of non-free binary drivers it doesn't force them on you.

Ubuntu is a bit desperate to attract users by adopting binary drivers, while the two big companies, Novell and Red Hat, are trying their best to force hardware companies to release open-source drivers by not including binary drivers in their products. Yes, it will benefit end users in the short term (this is the main reason Ubuntu is popular), but it is damaging to OSS.

Umm, excuse me? I don't mean to be hostile, but I think you've got this back-asswards.

*Microsoft* are the ones who are trying to push *one* piece of software on the user: Windows (from one supplier), IE (which can't be uninstalled), MS Office (which comes free on most PCs, and if it doesn't there's always MS Works - yes, that's right, ANOTHER MS office suite/program. I haven't yet seen major PC vendors preinstalling OO.org, though they may indeed do this.)

By contrast, all Ubuntu does is provide a default set of options, and in the Debian tradition, also provides alternatives. Yes, you can install Kubuntu or Xubuntu, but you can also install Ubuntu, and then install KDE and/or XFCE, and/or deinstall GNOME.

I may be in a minority, but I also think Web applications are a bad idea if they are used for mission-critical tasks like office software and databases. I want control of my data, and you're going to have to pry it from my cold dead hands.

Timeless - if you run Linux or Unix as a "normal" user (and most do), apps cannot get hold of the root account without user intervention. OTOH, since Windows has so many security holes AND most users use an administrator account, there is nothing stopping spyware from installing itself, or viruses from corrupting the whole system.

Also, users on corporate desktops (whether using Linux, UNIX, or Windows) should NOT be installing apps willy-nilly. Either administrators can do it for them, or they should have enough confidence in those users who do it to know that those users can be trusted to handle the root account responsibly.

"This attitude is not unique to Ubuntu. Although this essay is inspired by an Ubuntu keynote, it applies to all the Linux distributions. Ubuntu should not be marked for special criticism, except that I hoped their focus on users and usability would lead to better appreciation of and support for user-installed software."

He only uses Ubuntu as one example, as it is popular and because Mr. Shuttleworth made the comment that led to his blog entry.

He speaks of a "general problem" he sees in many Linux distros. So please stop the "I cannot recommend Ubuntu anymore", "I never trusted M.S.", and "He obviously never used Ubuntu, it's so cool" stuff. Some of you completely missed the point of his story. Thus: please read it again instead of posting nonsense.

The thing is that what he says is literally completely untrue of Ubuntu specifically.

It is still true to a greater extent on Fedora (which I only recently switched from) and Suse (which I recently tried release 10 of). I can't comment on other distros.

With Ubuntu, a properly constructed package from a third party (in as much as its dependencies are in the Ubuntu repos, or included in the package) can be installed by double-clicking on it. All deps will be handled automatically.
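That automatic handling works because a deb declares its needs in its control file. A minimal, hypothetical example (package name and maintainer are made up):

```shell
# Build the skeleton of a hypothetical third-party deb; the Depends line
# is what lets gdebi/APT pull in the rest automatically.
mkdir -p /tmp/vendor-app/DEBIAN
cat > /tmp/vendor-app/DEBIAN/control <<'EOF'
Package: vendor-app
Version: 1.0
Architecture: all
Maintainer: Example Vendor <packages@example.com>
Depends: libgtk2.0-0, libc6
Description: Hypothetical third-party application
EOF
# dpkg-deb --build /tmp/vendor-app   # would produce vendor-app.deb
DEPS=$(awk -F': ' '/^Depends:/ { print $2 }' /tmp/vendor-app/DEBIAN/control)
echo "$DEPS"
```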

The reason people are getting pissed off is because the guy is talking out his arse...

The distros have to pick a set of software and make it work. That's what they do. They have to, because a lot of software needs a little work to be usable. If distros didn't pick a set of software and add a little polish, things would be much worse. The distros do the dirty work that volunteer programmers don't want to do, and they can't do that for every project.

How do the mainstream packages get mainstream? Do they start out at day 1 that way?

No, users find them, download them, install them, and like them. Occasionally they talk about them, and eventually it becomes the status quo application in that arena. Often, this is because it's the only application in that arena that's in active development.

If only the most technical users are doing this, only the most technical applications will end up available in the repositories.
One of Ubuntu's best qualities seems to be that its users find small projects and use them. They contribute decent bug reports, and are probably the nicest users, btw (on average).

Ubuntu users aren't living off the repos alone...

What Mark dreams of, and what Ubuntu is actually doing seem to be radically different. Or maybe it's just been distorted in the grapevine?

Ya know, the compatibility issues really aren't bad if you:
1.) Don't use C++; it's a horribly broken language to start with, and to complain that g++ is a bad implementation is to admit not knowing that _there aren't any correct C++ implementations_.
2.) Learn how to package things on Unix systems.
3.) Pick something like autopackage for your binaries; it installs the installer system from the package you create (very nice).
4.) Put up instructions on your site on how to run the three or fewer commands necessary to install your application (software which only admins install, like Apache, doesn't need to be so limited).

Frankly, if you're not following these instructions (with some modifications for platform) on any platform, you're doing things wrong. And I bet you'll be whining when Vista comes along and your application gets broken because it was dependent on Admin functionality.
I won't cry for you; I'll cry for the users you duped.

And in the end, binary compatibility is the burden of those who won't share their source freely. What'd really be nice would be source-based install systems - not Portage or Ports, but one that's nice to use (has a GUI frontend that works well) and doesn't eat 80MB of disk space.

I seem to recall, not too long ago, people complained about distros installing too many applications that do the same thing - install Kubuntu and you have both Firefox and Konqueror, for example... and KDE comes with Kate, KEdit and KWrite...

After all the screaming about too many text editors, too many DEs, too many music players, now someone comes along and attacks Linux for LACK of CHOICE?

Distros ship everything plus the kitchen sink, and it's bad for business. But it's not because of some paradigm shift; rather, it's the same basic problem Linux has always had: the methods of installing third-party software on Linux are broken.

Those who dare to venture out of the realm of packages blessed by their distro for that particular release of their OS find themselves in a quagmire of compiling from source, dependency hell, incompatibility, etc. which dwarfs any problems on other platforms.

Sure, there is hope. Klik is nice, for example, but until all major Linux distros get together on an easy way for independents to distribute their software to Linux users, they will never get the mainstream acceptance they seek.

Edit: I agree with the author that Mac-style app bundles would rock on Linux; unfortunately most OSS developers seem to suffer from an extreme version of NIH syndrome - NIBM (Not Invented By Me).

I don't see how it's "bad for business". And distros ship plenty of third-party stuff with their OSes: VMware is available as an ebuild from Gentoo, for example.

Because it's an extra barrier to entry for independents: costs rise per distro you choose to support. Alternatively you could rely on the distro itself to package, as you point out, leaving you at the whim of a third party. Perhaps they prefer another package and work on that while delaying the release of your software (e.g. a build of Bochs available before VMware). Either way, for truly independent software companies it is bad news.

As for MacOS-style app bundles, don't they include everything the author put into the app, even libraries which might otherwise be available? Who wants six copies of the same library?

Well, on OS X you can be certain you have a broad set of basic software available, so it's not as much of a problem as you think. Not so on Linux, even with the LSB, which is practically useless. Installed libraries and versions vary wildly.
And what if the user updated some library because another package wouldn't install otherwise, and now your package won't install (aka welcome to dependency hell)? Having multiple versions of the same library installed is a frequent occurrence even on Linux.

Software repositories are a hack to work around one of the central problems in Linux, and they leave independents out in the cold.

I've only ever had dependency problems with rpm-based distros. I'm not too hot on them for that reason. I do use and like SuSE, but I keep to the software they have available. I've only got 80GB on the disk I use in that machine anyway.

Sorry, but I don't accept that "for truly independent software companies it is bad news". If you work on Windows you are at the mercy of Microsoft. Apple may be better than this, but considering they control their architecture even more tightly than MS, I doubt it. The number of software companies that have been double-crossed by MS is legion. Allowing them to get into software like office suites and so on when they already have a monopoly on OSes was a big mistake, in my opinion.

How so? The evil MS will post factum change code in Windows XP, 2000 and 98SE specifically so that your desktop app doesn't work? :-) No, that's not the message they want to send about their OS. On the contrary, they encourage development and proliferation of 3rd party software and make sure it works. And supporting all even relatively modern Linux distro versions out there is surely ten times more difficult than supporting the three or four major versions of Windows.

Lotus, Corel, the makers of Wordstar, and many many others I could list, on top of (no doubt) many I don't know about. They have all been shafted as Microsoft extended its OS monopoly into more and more apps.

Because it's an extra barrier to entry for independents: costs rise per distro you choose to support. Alternatively you could rely on the distro itself to package, as you point out, leaving you at the whim of a third party. Perhaps they prefer another package and work on that while delaying the release of your software. Either way, for truly independent software companies it is bad news

Either way, it is bad news for your credibility, as what you describe poses no problem to ISVs on Unix platforms.
But strangely enough, it is a problem on Linux.

Well on OS X you can be certain you have a broad set of basic software available, so it's not as much of a problem as you think

It's still a big problem. I'd say Linux has even more basic and not-so-basic software, with 15,000+ packages in the repositories.

Not so on Linux, even with the LSB, which is practically useless. Installed libraries and versions vary wildly

Which is not a problem at all, as major versions don't vary wildly.

And what if the user updated some library because another package wouldn't install otherwise, and now your package won't install (aka welcome to dependency hell)? Having multiple versions of the same library installed is a frequent occurrence even on Linux

The second sentence is the answer to the first question. That's multiple major versions, of course. Actually, it's not frequent at all.
Except if you use lots of proprietary stuff, which is never as well supported as FOSS stuff, and always falls behind.
When Nero came out on Linux, it was already worse than nearly anything on my PC (especially K3b). I tried to install it this year to see the difference: it was a pain, I needed very old libraries, and when I made it work, it didn't even detect anything. K3b has only gotten better since then.
Most of the time, the problem with ISV software on Linux is that it's not supported after launch. So stop the BS about installation problems; this is not the truth at all.

Software repositories are a hack to work around one of the central problems in Linux and they leave independents in the cold

That's BS. Independents are the ones that leave Linux in the cold. Look at NX software for an example of a good software vendor on Linux.
Now look at Nero, or even Macromedia, for very lousy ones. Those are the most common. Even Real didn't work without an old gcc (3.3) library (I haven't installed the latest version yet, so I don't know if it's still the case).
Meanwhile, my Loki games still work and install without problems (most use OSS instead of ALSA for sound, but it's no problem).

That's the main problem. All kinds of software need active and laborious maintenance to remain usable on Linux a few months after introduction. Keep in mind that a simple recompile in a slightly different build environment completely voids all previous testing. Paired with the still-minuscule Linux desktop market share and an indifferent (at best) attitude from system vendors, it reinforces the chicken-and-egg effect - the one that lets MS sleep well wrt client stuff.

Distros ship everything plus the kitchen sink and it's bad for business. But it's not because of some paradigm shift, rather the same basic problem Linux has always had: the methods of installing third-party software on Linux are broken

No, they are not broken. I can still install those old Loki games (using the Loki installer) on my latest Linux install, and guess what, they even work!
Packages work too.

Those who dare to venture out of the realm of packages blessed by their distro for that particular release of their OS find themselves in a quagmire of compiling from source, dependency hell, incompatibility, etc. which dwarfs any problems on other platforms

So you admit that the OS vendor's repository brings stability and support.
If you use another vendor's packages (no need to compile anything; dependency hell and incompatibility depend on the package), of course you can't blame your distro vendor, nor the repository.
What you say is pure BS. At worst, even if your ISV package causes dependency hell, it will still be better than on Windows, for example, as at least you will be able to uninstall the package correctly.

Sure there is hope. Klik is nice for example, but until all major Linux distros get together on an easy way for independents to distribute their software to Linux users they will never get the mainstream acceptance they seek

And yet Google and Adobe, even Netscape, manage to make software that installs on Linux. So do those that use the Loki installer.

Edit: I agree with the author Mac-style app bundles would rock on Linux, unfortunately most OSS developers seem to suffer from an extreme version of NIH syndrome - NIBM (Not Invented By Me)

I rather think that people like you complain about things they are clueless about - like saying Mac-style bundles are a very good thing with no problems, or that they would rock on Linux. No, they would not rock at all, especially when every app has to load its own copy of the very same code because bundles duplicate libraries (which defeats the purpose of shared libraries).
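The duplication point can be made concrete with `ldd`: on a normal Linux system every dynamically linked binary maps the one shared system copy of common libraries, which is exactly what a self-contained bundle gives up. Shown as a dry run, since the exact output paths vary by system:

```shell
# On a live system this lists the shared libraries /bin/ls maps - one
# system-wide copy of libc serves every binary. A Mac-style bundle would
# instead ship (and load) its own private copies.
CMD="ldd /bin/ls"
echo "$CMD   # typically shows a shared libc.so.6 mapping"
```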

"Ubuntu isn't trying to be a platform for mass-market application software: it is trying to be the primary provider of both the operating system and all the application software that a typical user would want to run on his machine. Most Linux distributions are like this, and I think it is a dangerous trend that will stifle innovation and usability."

I just had to say that the base premise is just wrong here. Why? Because any company that wants to get into the software support industry (better if it's an OEM) can just take Ubuntu (or any other finished or unfinished distro), replace the apps they don't like with their own, call it something else, and compete on the merits of their configuration, brand, and support.

BTW (OT rant warning): brands shouldn't necessarily be free - remember, OSS is about Software (that's the second "S" in OSS), not marketing or branding. If the Mozilla Foundation wants their logo and web browser name to come as a package, then so be it. Ubuntu and other distros should frankly take advantage of the high visibility and recognizability of the Firefox brand anyway. It's really a no-brainer.

I like Ubuntu, but I agree with the article's sentiments. I don't really see it as wholly new to Ubuntu; I mean, Apple and Microsoft do the same, except the nature of their environment means it makes more sense to sell the apps individually. But they still want all their stuff on your system: IE, Office, and so on.

In the Linux world it really comes down to reliance on repositories. Some see it as great, but I see it as a kind of nanny state, and insular. At the same time it's pragmatic, but still not the best situation, IMHO.

That's why I see the need for a generic package format as the most important thing for popular Linux adoption.

Well, Ubuntu is NOT an OS in the strictest meaning of the word. As a matter of fact, Ubuntu is less of an OS than Windows is, but for the layman it IS an OS. Anyway, I couldn't care less about what Ubuntu intends to do with the packages they distribute. Those guys have to reorder their priorities.

For the 2+ years the distro has been around, there have been very slow improvements in the general appearance of the OS, and anybody who has ever tried a Linux distro knows that no DE is on par with what comes out of the camps of Apple and even MS. Not that KDE and Gnome are bad, but they still need some polishing in order to appeal to the corporate and home user markets. However, with the exception of the first 6 months of its existence, Ubuntu hasn't really done much work. When you put this into perspective with what both the Novell and Fedora teams are doing, the only thing Ubuntu has going for it is the Debian repository - and that isn't even their idea, or any noticeable improvement over what is already in Debian.

So will Ubuntu ever achieve its goal of world Linux domination? I highly doubt it with the current business strategy. Yes, SUSE's repository and community isn't quite there yet, and yes, RH and Fedora can drive you crazy installing certain applications without recompiling, but at the end of the day they all work, and unlike Ubuntu, both Novell and RH have a solid industry reputation that eventually results in wider acceptance in the corporate world.

As a matter of fact, Ubuntu is less of an OS than Windows is, but for the layman it IS an OS

Which is pure nonsense, as it's more than an OS - more than what Windows is. Logic problem?

For the 2+ years that the distro has been around, there have been very slow improvements in the general appearance of the OS

Which means its look is stable and you recognise Ubuntu easily ?

anybody who has ever tried a Linux distro knows that no DE is on par with what comes out of the camps of Apple and even MS

I agree entirely. I know my Linux DEs are far better.

Not that KDE and Gnome are bad, but they still need some polishing in order to appeal to the corporate and home user markets

Nonsense. BTW, call me back when Windows gives me back my session as I left it when I logged off, like Gnome and KDE (and XFCE) have done for me for a looooong time.
Come on, it's a basic feature! I would dare say it's a must-have feature.

However, with the exception of the first 6 months of its existence, Ubuntu hasn't really done much work

Let me guess: you never came close to the things called internationalisation and localisation, did you?
Or perhaps you think everyone is a native English speaker?

I happen to disagree with Mark Shuttleworth and agree with this guy. Software repositories do not scale, and what happens when the desktop becomes more popular and people want to install all kinds of apps? Ubuntu barely has the resources to package up a lot of software now, and software like Bacula has been behind the times for a very long time. If you want the latest version, you have no option but to get unofficial packages or compile it yourself. Not exactly easy installation. With Winbacula, all I do is install, and many configuration procedures can be automated as part of the installer.

The application developers are in the best position to know when their software is final and works, and if you can provide a good infrastructure in your operating system for making testing, packaging and installing painless, then so much the better.

Users being able to install the up to date software that they want is a problem to be solved, not avoided.

Software repositories scale well enough, thanks.
What doesn't scale, given what you say, is updates. Try to at least understand what you're saying.
When the desktop becomes more popular, the software authors make packages for them automatically (if they can - I don't think the Xvid authors can without risking lawsuits, for example).

If you want the latest version then you have no option but to get unofficial packages or compile it yourself. Not exactly easy installation

True enough.
Not exactly the distro's fault. You could complain to the distro or to the software maker.

With Winbacula all I do is install, and many configuration procedures can be automated as part of the installer

So obviously, it's entirely the Bacula authors' fault, not the distro's fault at all.

The application developers are in the best position to know when their software is final and works, and if you can provide a good infrastructure in your operating system for making testing, packaging and installing painless, then so much the better

So why don't they do that? And why do you blame the distro?

Users being able to install the up to date software that they want is a problem to be solved, not avoided

No, they don't. Even if a new piece of software comes out, the only way you'll be able to get it is when the packagers get around to packaging a version for the specific version of your distribution. At the current pace, that's sometime never. The time it has taken to update Bacula is a perfect example which proves just how wrong you are. It's extra work which just isn't necessary.

What does a software developer distributing a proprietary piece of software like AutoCAD (one can dream) do? Package it up for a dozen different repositories and distributions, submit it, and watch while it takes six months or more to get through testing and into the official repository? Dream on.

What doesn't scale is updates, given what you say.

Errrrr. No.

Try to at least understand what you're saying.

Well, it seems you're trying to find a way of saying that a repository system is the best way of installing the millions of pieces of software out there. You're failing - badly - as has everyone else who's tried.

Not exactly the distro's fault. You could complain to the distro or to the software maker.

No, it is the distro's fault. There are no mechanisms and tools in place for creating a universal package that can be distributed and installed.

It is condescending in the extreme that you, or Mark Shuttleworth, should suggest that software makers are going to package up for every distribution and repository out there, or package only for Ubuntu. Hint: they aren't doing it, and they're never going to.

So obviously, it's entirely the Bacula authors fault, not the distro's fault at all.

No, it's the distro makers' fault. Windows provides the mechanisms needed for installing and configuring a piece of software in a rational manner. Ubuntu and Linux distributions don't do that, and there is no way people packaging software are going to be able to put in the work needed to package what is really the same software for umpteen different distributions and then watch it go through a repository system.

Not...going...to...happen, and isn't.

So why don't they do that? And why do you blame the distro?

Because providing tools that allow you to package up software, and install and configure it in a rational manner is up to the distributor and operating system.

So why don't you start complaining to the Bacula authors?

Because it's not their fault.

It's so straightforward it's unreal, but I suppose there are a lot of people out there who want to defend software repositories as the second coming of software installation. Deluded souls...

No, they don't. Even if a new piece of software comes out, the only way you'll be able to get it is when the packagers get around to packaging a version for the specific version of your distribution. At the current pace, that's sometime never. The time it has taken to update Bacula is a perfect example which proves just how wrong you are. It's extra work which just isn't necessary

Excuse me, but when did you prove it doesn't scale? Like I said, the repository will accept your package just fine. Scaling does not mean "magic", in the sense that every app that appears somewhere is automatically updated in the repository. You're talking about updates to the repository, not the scaling of the repository.
If you complain to Bacula and the repository managers, you will get the package in faster.
The fact that Bacula wasn't updated in the repository doesn't show anything about whether it scales or not. I could point you to the big ones (KDE and Gnome), which are updated nearly instantly, because, you see, the KDE and Gnome devs make the effort to provide their product to distros. So it scales pretty well, which shows you're wrong, and that Bacula is the problem.

What does a software developer distributing a proprietary piece of software like AutoCAD (one can dream) do? Package it up for a dozen different repositories and distributions, submit it, and watch while it takes six months or more to get through testing and into the official repository? Dream on.

KDE and Gnome manage to get their huge software base into the most popular distros, and they're not even as cohesive as the AutoCAD vendor.
What BS are you saying? That professionals like the AutoCAD makers can't provide their products to customers, while the Gnome and KDE devs can?
Come on!
Popular FOSS programs are updated in less than a week; how come ISVs can't do the same?

Well, it seems you're trying to find a way of saying that a repository system is the best way of installing the millions of pieces of software out there. You're failing - badly - as everyone else who's tried has.

Perhaps it's not the best way, but it works, leaving your OS consistent and working.

No, it is the distro's fault. There are no mechanisms and tools in place for creating a universal package that can be distributed and installed.

BS! There are for EVERY distro out there. If you mean something installable on any OS, there is no software like that on any OS, sorry.

It is condescending in the extreme that you, or Mark Shuttleworth, should suggest that a software maker is going to package up for every distribution and repository out there, or package only for Ubuntu. Hint: they aren't doing it, and they're never going to.

Some amateurs and professionals have done it before, though.
Nobody asked for that; you have a problem understanding that it's only YOU who says that. You can't seem to get off your own straw man.
You only need to support the one or two most popular distro package systems.

No, it's the distro makers' fault. Windows provides the mechanisms needed for installing and configuring a piece of software in a rational manner

That's completely false. No Windows app can install itself without the vendor packaging the software, so stop your BS, please.
There are even several different package makers (4 or 5) for Windows. It's no different from a distro package; they're even worse.
If you look at Windows installers, no two look the same, and some are even completely custom (like for games or drivers).
Obviously, you have never made a package for Windows or for a Linux distro. You seem really clueless on the matter.

Ubuntu and Linux distributions don't do that, and there is no way the people packaging up software are going to be able to put in the work needed to package up what is really the same software for umpteen different distributions and then watch it go through a repository system

You don't even know what you're talking about. A package manager is exactly what you're describing. Apparently, you don't understand the two words package and manager. Package managers go hand in hand with repositories, but you're not forced to put your app in a repository. A package is a standalone equivalent of your setup.exe on Windows. In fact, it's FAR BETTER: it deals with dependencies if you wish, has automatic installation/uninstallation that actually works, pre- and post-installation actions, and so on.
Again:
- there is no need to package for umpteen different distros
- you need to support 2 package systems at most
- you can use the load of installers available (Loki's being the oldest, Autopackage, ...)
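To ground the claim that a package is a standalone setup.exe equivalent, here is a sketch of the DEBIAN/control metadata inside a hypothetical binary .deb; the package name, version and dependencies are invented for illustration, not taken from any real package. The Depends: field is what lets the package manager resolve and install prerequisites automatically, and maintainer scripts such as postinst cover the pre- and post-installation actions mentioned above.

```text
Package: example-backup-client
Version: 1.38.11-1
Architecture: amd64
Depends: libc6 (>= 2.3.6), libssl0.9.8
Maintainer: Example Packager <packager@example.com>
Description: Hypothetical network backup client
 Built as a standalone .deb that never needs to pass through a
 repository; the package manager still handles its dependencies.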

Because providing tools that allow you to package up software, and install and configure it in a rational manner, is up to the distributor and operating system

Already done: RPM, DEB, and there are others too. It's tiring already.

It's so straightforward it's unreal, but I suppose there seem to be a lot of people out there who want to defend software repositories as the second coming of software installation. Deluded souls

Perhaps we defend them against people like you, who don't even know what is available and who spread lies.

Excuse me, but when did you prove it doesn't scale? Like I said, the repository will accept your package just fine.

It doesn't scale because it's entirely dependent on the developer resources available to package it up for each repository and each distribution, whereas if you just provide a generic package it's done for everyone.

I could give you the big ones (KDE and Gnome), which are updated nearly instantly, because, you see, the KDE and Gnome devs make the effort to provide their product to the distros.

You really have no clue at all. Gnome and KDE developers do no packaging whatsoever for distros. Canonical, Suse and other distros have entire teams of people who package up Gnome and KDE for their respective distros and repositories.

What are Canonical and all these other distros going to do? Employ teams of people to package up specific fringe, but popular, software and maintain it? Are they going to package up all the potential proprietary software as well? It's just not viable, or scalable.

BS ! There are for EVERY distro out there.

So if I create a package for Fedora it will install on Ubuntu and Suse? Errrr. No.

You only need to support the one or two most popular distro package systems.

Well, no. You don't just port to a package management system; you're also porting to different distributions, or even different versions of the same distribution.

If that's the case then they're going to port to Suse and Red Hat and leave Ubuntu out in the cold, aren't they?

There are even several different package makers (4 or 5) for Windows. It's no different from a distro package.

Except that you can create a package quickly and easily for Windows, with a GUI front-end for configuration, and it will install. When there are four or five package managers, and multiple distributions, you have to port to those four or five.

If you look at Windows installers, no two look the same, and some are even completely custom

So what? It installs.

You don't even know what you're talking about. A package manager is exactly what you're describing. Apparently, you don't understand the two words package and manager. Package managers go hand in hand with repositories...

Except that has nothing to do with what you replied to, namely that application developers are not going to package their software up for umpteen different distros, package management systems and repositories.

there is no need to package for umpteen different distros

Oh, so you can take a package designed for Red Hat and install it on Suse or Ubuntu, and all these problems don't exist? Errrr, no you can't.

you need to support 2 package systems at most

Suse's RPMs are different to Red Hat's RPMs which are different to Debian's DEBs that are different to Ubuntu's DEBs.

Plus, you can't wrap your head around the fact that you're not just supporting different package systems, but making packages for different distros and different versions of the same distro, with different binary-compatibility requirements and different dependencies.

Why on Earth do you think third parties create statically linked installation packages that are the size of Brazil?

It's a mess.
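As one concrete illustration of the mess described above (using the classic naming example, purely for illustration): the same web server is packaged as httpd on Red Hat/Fedora but as apache2 on Suse, so a single dependency line in an RPM .spec file cannot serve both distros without distro-specific conditionals.

```text
# hypothetical .spec fragment for a Red Hat/Fedora build:
Requires: httpd >= 2.0

# the equivalent line for a Suse build of the same software:
Requires: apache2 >= 2.0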

you can use the load of installers available (Loki's being the oldest, Autopackage, ...)

There isn't centralised support for those within any distribution, which is kind of Mark Shuttleworth's point. He doesn't want anything to do with them.

Already done: RPM, DEB, and there are others too. It's tiring already.

Errrr, that's kind of the point. There are too many different package management systems, and even the RPM and DEB systems are different between Suse, Red Hat, Debian and Ubuntu. What is required is for distributors to run with something like Autopackage for anything beyond the core software of their system, so that developers can make one package and have it available for everyone.

Perhaps against people like you that don't even know what is available and spread lies.

Well, the latest version of Bacula certainly isn't available for Dapper, and that isn't a lie. The application developers just don't have the resources, and telling them to port to Ubuntu and every other distro out there isn't going to help.

Slackware, Debian, Red Hat, Suse and Mandrake have contributed to the Linux community by reshaping how a Linux system works. They can be called operating systems.
What is Ubuntu's (or Knoppix's, DSL's, PCLOS's, etc.) contribution to the Linux community? None, compared to Slack, RH or Debian. Has Ubuntu ever developed a single piece of software? NO. Do they have any special programs for HW detection? NO. It is just a better-configured Linux distro... that's it.

"What is Ubuntus (or knoppix, DSL PCLOS etc etc) contribution to linux community? none compared to slack or RH or debian. has Ubuntu ever developed a single software?? NO. have they any special programs for HW detection?"

Uh, they are developing Upstart right now, which is for system startup and dynamic device detection and initialization.