A long, long time ago, packaging was an exciting idea. There were disputes over style and process, there was innovation. There were reasons to prefer .deb over .rpm over emerge and its binary packages…

Today, these differences are just a hindrance. The fact that there are so many divergent packaging systems in the free software world (and I include the various *bsd’s) is a waste of time and energy. We want to focus the collective brainpower of the community on features and bugs, not on packaging. I would like to see the LSB renamed to the FSB – the “Free Software Base”, and get buy-in from the *BSD’s, and then I’d like to see us define distribution-neutral packaging that suits both the source-heads and the distro-heads. Then there’d be sufficient rationale for the relevant upstreams to include that packaging framework in their revision control repositories, and distro patches would become far more exchangeable.

Ubuntu isn’t built on secret sauce in the packaging. We don’t think our patches should be hoarded – we mail them all to Debian daily anyway, and publish them as best we can on the web the instant they are uploaded, often before.

Packaging is also one area where we can definitively improve on the real user experience for most people who treat computers as a job not a passion. It’s a strategic tool in the battle between proprietary and open approaches. I often think that the proprietary software world’s way of distributing software is one of its biggest weaknesses – an Achilles Heel that we should be exploiting to the full extent possible. I’m often asked why Linux can’t make it easy to “write something like Microsoft Installer, or Installshield”. That’s the wrong rabbithole, Alice. Linux can make it so when you dream of databases, PostgreSQL or MySQL are “just there” and “just work”. That’s a much nicer experience – we should make the most of it.

This entry was posted on Wednesday, November 1st, 2006 at 10:00 am and is filed under free software. You can follow any responses to this entry through the RSS 2.0 feed. Both comments and pings are currently closed.

130 Responses to “#12: Consistent Packaging”

People are too hasty to install packages that are not always stable. And the worst thing is, the folks who write the packages will tell you whether a particular item is stable or not. People need to stick to the stable versions, especially when doing a complete conversion from Windows to Linux. My department wanted to switch everything over to Ubuntu and had decided to go with Edgy. I told the sysadmin this was not such a good idea and that Edgy is called Edgy for a reason. The same thing happened when everyone started to switch over to Dapper: they all expected stability during the beta period and would come onto the boards, annoyed that particular packages were not working.

I am inclined to agree with you. A friend of mine at work knows that I do a podcast about Ubuntu, and I tout open source operating systems all the time. He is of course a Microsoft user and he states that he always will be. When asked why, he simply says that it is difficult to install software on Linux-based distros. I get the usual statements like “I have to also download all the dependencies” or “why can’t it be like Windows, with one-click installations?”. Yes, I have told him about the Synaptic package manager and tools like it, but he remains unconvinced. Standards in packaging will help users of all operating systems see that Linux-based distros are not just a bunch of wayward revolutionists.

Mark, you know what?
I am really enjoying your blog space – I have been learning a lot of new things from what you and others post! If I am learning then I am growing, and that’s all the good stuff!
So what I am really saying is that I appreciate it!


I’ve been thinking about this problem all day. I think that we could create a Universal Package somewhat similar to Apple’s Universal Binary.

Essentially: Build a version of the package for each system (distribution/version combination) you want to support. Go into these native packages and remove any compression (like the two files in a deb). Designate one of the native packages as the “base version” and take binary diffs between it and each of the others. Now package up the base version and all the diffs into a .up file, compress it, and ship it out.

To install a Universal Package, you decompress it, extract the base version and the diff for your system, apply the latter to the former, re-apply whatever compression you removed to create the diffs, and voila–you have a native package appropriate for your OS. Just install it.
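The build-and-install recipe above can be sketched in a few lines of Python. This is a toy illustration only: the `.up` container, the diff encoding and every name here are invented for the example, and a real implementation would use a proper binary-diff tool (bsdiff or similar) rather than `difflib`, which is far too slow for real packages.

```python
# Toy sketch of the "Universal Package" idea: a base payload plus
# per-distro diffs, packed and compressed together.  All formats and
# names here are hypothetical.
import difflib, json, lzma

def make_diff(base: bytes, target: bytes) -> list:
    """Encode target as copy/insert operations against base."""
    ops = []
    sm = difflib.SequenceMatcher(None, base, target, autojunk=False)
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":
            ops.append(["copy", i1, i2])               # reuse bytes from base
        elif tag in ("replace", "insert"):
            ops.append(["data", target[j1:j2].hex()])  # literal new bytes
        # "delete": base bytes are simply not copied
    return ops

def apply_diff(base: bytes, ops: list) -> bytes:
    out = bytearray()
    for op in ops:
        if op[0] == "copy":
            out += base[op[1]:op[2]]
        else:
            out += bytes.fromhex(op[1])
    return bytes(out)

def build_up(base: bytes, targets: dict) -> bytes:
    """Pack the base payload plus per-system diffs, then compress once."""
    payload = {"base": base.hex(),
               "diffs": {name: make_diff(base, data)
                         for name, data in targets.items()}}
    return lzma.compress(json.dumps(payload).encode())

def extract_up(up: bytes, system: str) -> bytes:
    """Recover the native payload for one system from the .up container."""
    payload = json.loads(lzma.decompress(up))
    return apply_diff(bytes.fromhex(payload["base"]),
                      payload["diffs"][system])
```

Because identical runs of bytes are stored once in the base and merely referenced by the diffs, the container stays close to the size of a single native package when the variants differ only in paths and metadata, which is exactly the claim made above.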

I think this will produce fairly reasonable sizes because a lot of the stuff in a package doesn’t really vary between distributions. Resources and documentation may change locations, but their contents are pretty much unchanged; heavily algorithmic code shouldn’t require much alteration either. And really, how big are the differences between a Debian and an Ubuntu binary, especially considering that so many Debian binaries run perfectly on Ubuntu?

Incidentally, you could use this to create multi-platform packages, although the diffs would be much larger in such cases. (You might want to have a sort of multi-level system with this where amd64-ubuntu-6.10 packages are created by starting with the i386-debian-sarge base, applying the amd64-debian-sarge diff, and then applying the amd64-ubuntu-6.10 diff.)

(Hmm…it’s a shame that Summer of Code is now six months away–this might make a good project for it…)

The one thing about a “universal” repository: Dependencies will have to be “real” and not just set to the latest versions of a package. This is actually a problem with Ubuntu as well.

When running Dapper, I saw a package or two in Edgy that I wanted to install. I downloaded the .debs from the Edgy repository and tried installing them on Dapper. It said that I didn’t meet a lot of dependencies that were just version bumps of packages I already had installed. Now, if those new package versions were absolutely necessary, that’s one thing – but I could have just compiled the package from source and installed it on my system, which suggests it wasn’t new versions I needed at all.
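The kind of check being described – parsing a package’s Depends line and seeing which constraints an installed set fails – might be sketched like this. It is an illustrative toy: the version comparison is a naive numeric compare, not dpkg’s real algorithm (epochs, `~`, and letter components are ignored), and all names in the example are made up.

```python
# Toy dependency checker: which clauses of a Depends line does an
# installed {name: version} map fail to satisfy?
import re

def parse_depends(line):
    """Split 'libc6 (>= 2.4), foo' into (name, operator, version) tuples."""
    deps = []
    for clause in line.split(","):
        m = re.match(r"\s*([\w.+-]+)\s*"
                     r"(?:\(\s*(>=|<=|=|>>|<<)\s*([\w.:~+-]+)\s*\))?", clause)
        if m:
            deps.append((m.group(1), m.group(2), m.group(3)))
    return deps

def ver_key(v):
    """Naive version key: just the numeric components, in order."""
    return [int(x) for x in re.findall(r"\d+", v)]

def unmet(depends_line, installed):
    """Return the dependency clauses the installed set does not satisfy."""
    failures = []
    for name, op, ver in parse_depends(depends_line):
        have = installed.get(name)
        if have is None:
            failures.append((name, op, ver))
            continue
        if op is None:
            continue  # unversioned dependency: presence is enough
        cmp = (ver_key(have) > ver_key(ver)) - (ver_key(have) < ver_key(ver))
        ok = {">=": cmp >= 0, "<=": cmp <= 0, "=": cmp == 0,
              ">>": cmp > 0, "<<": cmp < 0}[op]
        if not ok:
            failures.append((name, op, ver))
    return failures
```

Run against the scenario in the comment, a clause like `libc6 (>= 2.4)` fails when only `2.3.6` is installed even though the binary might run fine – which is exactly the “dependencies must be real, not just the latest versions” complaint.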

[…] Installation and removal of software: After a couple of months of using Ubuntu I am still not entirely sure how exactly software gets installed. Using Synaptic is by far the easiest way I have ever installed anything, but what if what I want isn’t available through Synaptic? Same goes for removing something. Just way too much hassle. Example: Blog entry about removing Amarok. Mark Shuttleworth is apparently also not blind to some of the faults, as his post Consistent Packaging suggests. […]

This is a good question… but everyone knows the answer: we have to standardize the packaging system, urgently! Too many people here have talked about unresolved dependencies and about the Windows and Mac way of installing an application… That is the idea! We have to talk with all the distro “bosses” and reach a consensus…

Linux is a good tool for me; I know the Linux system and have used it for about four years… But some new users don’t want to know what a Debian package or an RPM package is. I really rule out the option of compiling applications… Linux will only confront Microsoft when it becomes easier to use an application – to install it, and then show it to a friend who uses Linux (and that Linux need not be the same distro, right?). That is the wish of a user…

A great mirror full of packages is not enough on its own… The internet is a good way to install and share applications, I know… but it takes time to download them before installing them. It would be better if you had an “installer” on a pen drive, a DVD/CD(-R/RW), or even a floppy (as slow as downloading from the internet, I know…). I want this… Many people here want this… And many more around the world want this…

Use GNOME, KDE, Xfce, EDE? It does not matter… People have different preferences for how they do, view and organize their things and documents. The important thing is that anyone on Linux could do the same things as easily as on a Windows or Mac system…

Am I a dreamer?! I think almost everything you’ve said is right, Mark! Ubuntu has grown because of your “freedom thing” way!

It might be a good idea to unify the package formats (which are an essential part of most Linux distros). But, regarding the comments, here are some things that you might want to consider before developing the next great package format 🙂

– though Desktop users may want to have the latest bleeding-edge packages, companies (esp. software developer companies) want a stable version (like Dapper). Commercial Linux software is often built for a specific combination of commercial third-party libraries, commercial in-house-developed libs, and the distro (most customers just say: “it has to run on RHEL4”, so it is built and tested only on RHEL4)
– a distro is more than just lots of independent packages; even though the binary compatibility might be given, it is also important that the packages work together for things like start menu entries, not overwriting their files, interprocess communication…
– take into account that many sysadmins not only use rpm/apt for package management, but also build their own packages (at least at our company they do 🙂 and so it’s not the simple solution to “make apt/rpm/yum just a frontend to the new unified package system”; you also need to provide ways to convert old packages and old knowledge to the new system.
– if you use software packages like Apple’s .dmg, you end up with many versions of the same library on your system. In Ubuntu, a security hole in imlib means that I have to update _one_ package; under Windows and Apple, the libs are strewn all over the disk (in .dmgs or in program folders). Updating all those library versions is difficult.
– for the people favouring Apple’s easy software installation (“just one package for 10.4, and one for 10.3 – where’s the problem?”): have a look at Opera’s Linux packages! There are a lot more Linux distros than Mac OS X versions – so there are a lot more Opera packages for Linux; but indeed they offer packages for a great many distros (you just select Debian -> Sarge -> Download and get a .deb for Sarge; or Ubuntu -> Edgy -> Download, and get a .deb for Edgy; same for .rpm and many other formats). By the way, there’s a repo at deb.opera.com, so it integrates nicely into Synaptic etc.
– basically, if you don’t want to provide so many different packages, you have to reduce the number of different distros 🙂 so that you end up with “Linux 2007”, which is then in line with “Windows 2007” and “Mac OS X 2007”. And then you’d have your unified Linux, and no more choice… Ouch…
– For satisfying bleeding-edge users, it might be nice to have easily installable source packages, so that you can take the latest source release and install it as easily as a .deb. I’m not sure, but doesn’t Gentoo do it like this? Also, if you want the latest version, just use “Ubuntu Unstable” (i.e. the development version), but don’t complain that it’s unstable. If you manage to build a stable (!), bug-free distro with an 8-hours-from-CVS-to-desktop release speed, you can apply for a Nobel Prize in Computer Science 😉
– before I forget it: InstallShield etc. suck big, compared to .deb and .rpm 😀
– a question for all the Linux fans here: what’s your goal when you promote Linux to friends/relatives/co-workers? What do you hope for with that? IMHO the great thing about Linux is the freedom to hack it, to combine different parts, and generally not being hindered by the rigid framework that Windows and Mac OS X have. So I guess even if Linux succeeded in gaining more than 50% desktop market share, it would by then have lost its free structure and be Just Another OS.

So, about the packaging: I think it would be nice to have some unification, esp. if it could mean that upstream devs would do the packaging (for _one_ format). But the current system _does_ have its advantages, and it would be important to preserve these advantages in a new system.

I had a Citroen GS car many years ago. Had all kinds of groundbreaking technology including hydropneumatic suspension and great aerodynamics. It was soon evident that the designers had done a great job on all the difficult stuff, but the more mundane things had not troubled the attention of the designers. As a result, some routine maintenance jobs were near impossible. “A great car for the skilled mechanic” would have been an accurate but probably rather unsuccessful advertising slogan.
Cars have moved on from the days when every driver had to be skilled at repairs because they broke down on most trips. Now some cars don’t even let their owners go near the motor, but that doesn’t matter because they don’t need to.
The majority of computer users don’t expect to need to go ‘under the hood’. Windows ‘simplicity’ may be illusory, but the illusion is sustained well enough. Most users expect it to break occasionally, but will then get someone else to fix it.
What this suggests to me is that the first Linux distro to crack the packaging issue will be the one that gets a huge user base. It probably won’t appeal to the mechanics, but for the masses who just want to use a computer to get the work done, getting from A to B without dirty hands is the key thing.

The lecturer responds “It is the language we speak now. We have millions of people working on it, to make it the next language. All the world will be speaking it soon”

The young man says “How can I know what you are saying if you don’t tell me in English first?”

The lecturer says “But if I don’t tell you in Dutch how will you learn it?”

But, says the young man. “I am not an idiot with languages; I have been speaking one for the last 20yrs. I only need you to tell me what you mean in mine, so that I can understand the new one”

To which the lecturer says “Turn Left. Go down the hall and take the first door on the right. Go down three flights of stairs and it is the third door on the right”

This, in essence, is what is holding Linux back: the inability to explain in simple language what things mean. The programs are there, and competition between programmers is already there. What people need is English teachers. Let me give you an example.

To set up dual monitors on your Windows machine.

On the Desktop (the first screen that opens when you start your computer) you will see an icon called “My Computer”
Right click on the “My Computer” icon and click on Properties.
On the window that now opens you will see (along the top) tabs. Click on the one that is marked Settings. Etc; etc; etc.

Now that is English. Why why why is that so hard? Let’s have some visuals, screenshots or even Flash.

You will get more people to make the transition with good support than with good programs. A strange comment, but true. Please promote the people who can teach, not systems engineers who think everyone knows what a .tar is. The program does not have to be simple (Windows proved that), but the understanding has to be.

My rant over, as is my shot at Linux (for now) for the frustration of the week has got to me.


Excellent point, Mark. Here’s the real question, though: what are we going to do about it? Is Ubuntu going to help with autopackage and move in that direction? How do you get hundreds of distros to switch over to one packaging standard, especially after so much work has gone into getting their distros fully integrated with their package managers? Take Fedora and yum (pup, pirut and anaconda now use yum), or even Ubuntu with gdebi and synaptic. I think smart is one way to go: if all distros adopt smart, then at least we are all resolving dependencies the same way, and we are one step closer to the desired goal. At the same time we can also try to get distros to name packages the same way. From there it would just be a matter of merging the packaging standards of both rpm and deb. Of course there are flaws in this method, but at this point it is only a suggestion. The important thing is that we keep pushing towards this goal. I believe this is one of the main reasons many companies shy away from releasing Linux versions of their software or drivers. While it may not be the only reason, it is a pretty important factor. Let’s hope this ‘issue’ gets resolved. Mark, I wish you the best of luck in this venture. I’ll help if I can.
Good day
AhmedG. (aka Ihavenoname)

I’m sorry to say it, but it’ll never happen. If the Ubuntu team makes some changes, other distros will ignore them (distros almost always ignore each other). If you took a part from every distro, you would be able to make the ultimate one. Another thing is dependencies: if a program requires library X, and distro Y has it but distro Z doesn’t, distro Y will probably try to make the package without that dependency, and everything starts from the beginning again…

Hi Mark,
I’m a happy Ubuntu user.
I had an idea: keep the standard package format (deb) but make it distributed and updated via RSS.
To make it an actual system we need to:
– define a standard package foundation; I propose every package depend on “ubuntu-desktop.deb” et al.
– have the community certify software, to prevent security trouble by avoiding spyware and other malware
– distribute the software with all its exotic libraries, like a metapackage (libraries are usually very light)
– restrict what dpkg can do: it should execute some standard tasks, like creating new files and updating others, and not modify iptables if that is not what it was created for.
This will work if developers and their communities want to provide packages directly. It should also improve the commercial appeal of Ubuntu, because in this way we can offer a standard system for commercial software too.
These are just ideas; excuse me if they are not what you are looking for.
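A client for the RSS idea above might look something like this sketch. The `pkg:` feed elements and every URL here are invented for illustration – no such feed standard exists – and a real client would use dpkg’s version comparison rather than a simple inequality.

```python
# Toy client for a hypothetical per-package RSS feed: each <item> is an
# upload, carrying the .deb as an enclosure plus invented pkg:* metadata.
import xml.etree.ElementTree as ET

FEED = """<?xml version="1.0"?>
<rss version="2.0" xmlns:pkg="http://example.org/pkg-feed">
  <channel>
    <title>example-app updates</title>
    <item>
      <title>example-app 1.2-1 released</title>
      <enclosure url="http://example.org/example-app_1.2-1_i386.deb"
                 type="application/x-debian-package" length="102400"/>
      <pkg:name>example-app</pkg:name>
      <pkg:version>1.2-1</pkg:version>
    </item>
  </channel>
</rss>"""

NS = {"pkg": "http://example.org/pkg-feed"}

def newer_than(feed_xml, installed_version):
    """Yield (name, version, url) for feed items that differ from
    the installed version (a real client would compare properly)."""
    root = ET.fromstring(feed_xml)
    for item in root.iter("item"):
        ver = item.findtext("pkg:version", namespaces=NS)
        if ver != installed_version:
            yield (item.findtext("pkg:name", namespaces=NS),
                   ver,
                   item.find("enclosure").attrib["url"])
```

The appeal of the scheme is that the polling, certification and download steps all ride on existing web infrastructure; the hard parts the comment lists – community certification and restricting what dpkg may do on install – are policy problems that a feed format alone does not solve.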

[…] This sort of conversation makes me very cynical about the relevance of the OSDL and LSB. It is easy to pick on Ubuntu here but in practice I know this kind of thinking is endemic to the distributions … it would have been exactly the same at a Fedora conference. It was a shame that Mark Shuttleworth didn’t show up – his blog entry is the only reason I went along, as I’ve had similar conversations via email before so knew what to expect. Unfortunately it sounded like he had a lot of pushback on those plans and they are now being watered down to simply being “it’d be nice if source compiled the same on every distro”, which is basically where we are today, sort of, except because there are no ISVs there’s no real incentive to keep it that way and sometimes it breaks. […]

It would be great if the LSB put out a new release every six months, and also defined the FHS more concretely – specifying where KDE, GNOME etc. have to be installed. With that combination, distributors would be able to release a new distro every six months under a new LSB version. Then you would only need LSB-compatible packages for the different LSB versions, and they would run on all distributions compatible with those versions.

That would decrease the number of packages needed for the different distributions. If they used only the deb or rpm format, there would be only 3 different packages over a period of 1.5 years, and they would run on all LSB-compatible Linux distributions.

Thanks to Mark for opening up this discussion, and thanks to Steve for the link to http://zero-install.sourceforge.net/ – I will be installing this on my Ubuntu Edgy system very soon and doing my best to contribute to the project’s success, because it looks to me as if the concepts behind the zero-install project are the Right Way Forward.

It’s even easier to use than the Windows app installation process, because you don’t need separate download-installer, run-installer and run-application steps; having found a zero-install app on the web, you just run it.

Everything else – downloading the code, downloading dependencies, applying updates – happens automagically. You don’t need admin rights just to install an app, you don’t need to mess with a repository list, and distro developers don’t need to maintain repositories. Software developers are in charge of specifying their own dependencies, and any number of these can coexist, even if they provide similar services; if multiple apps share dependencies, what they have in common only gets downloaded once.
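The shared-dependency behaviour described above can be illustrated with a toy content-addressed cache: implementations are stored once, keyed by the digest their feed declares, so two apps sharing a dependency trigger only one download. This class is an invented sketch, not Zero Install’s actual code.

```python
# Toy content-addressed cache: the core trick behind "what apps have
# in common only gets downloaded once".
import hashlib

class DigestCache:
    def __init__(self):
        self.store = {}       # digest -> implementation bytes
        self.downloads = 0    # for demonstration: count real fetches

    def ensure(self, digest, download):
        """Return the implementation for `digest`, downloading it only
        if it is not already cached.  `download()` returns bytes."""
        if digest not in self.store:
            data = download()
            # verify what we fetched matches what the feed promised
            assert hashlib.sha1(data).hexdigest() == digest
            self.store[digest] = data
            self.downloads += 1
        return self.store[digest]
```

Because the key is a digest of the content rather than a name or version, no admin rights or central repository are needed: any number of apps can declare the same dependency, and unrelated versions coexist under different digests.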

As a site admin, all I’d need to do to set up my workstations would be to provide a base OS with a bunch of links to zero-install apps, and a web proxy on my LAN. What’s not to like?

I use Ubuntu because it’s nicely integrated, and gets more so with each release; every six months, more things Just Work. It strikes me that Ubuntu getting behind the zero-install project and making it the standard way for Ubuntu users to acquire apps may well be a good way to make that experience scale.

As far as I can see there’s nothing stopping Synaptic and zero-install from coexisting in the same distro. As releases go by, though, I can also see no fundamental reason why zero-install couldn’t progressively take over from apt. Comparison here:

The Ubuntu way to persuade other distros to get on board seems to me to be by doing something compelling enough that what the other guys are doing becomes obviously deficient by comparison. At first glance, zero-install strikes me as exactly that kind of advance.

I am pursuing a B.Tech at Amity, but I am not happy here. I am looking for an online computer-language test on C and C++, for the money factor only; whether it is at national or international level doesn’t matter to me, so I am waiting for your reply…

[…] I was skimming through some of the older posts on Mark Shuttleworth’s blog when I came across this one that struck a chord. While I do like the whole Debian apt packaging system, for the most part, […]