Posted
by
timothy
on Wednesday November 24, 2010 @09:00AM
from the gathering-no-moss dept.

formfeed writes "The Register claims that 'Ubuntu is moving away from its established six-month cycle and potentially to a future where software updates land on a daily basis.'
While this sounds like a sudden change, it is apparently more of a long-term thought. The Register quotes Shuttleworth:
'"Today we have a six-month release cycle," Shuttleworth said. "In an internet-oriented world, we need to be able to release something every day. That's an area we will put a lot of work into in the next five years. The small steps we are putting in to the Software Center today, they will go further and faster than people might have envisioned in the past."' But given that many of Shuttleworth's thoughts became decisions later on, it might be interesting to see, where this one leads. Interestingly enough, five years is about the time when Ubuntu will run out of letters."

Right now all new features appear simultaneously when I update the system to the new Ubuntu version. This leads to fear & uncertainty every time, and also means I have to learn all the changes at once.

New features would be a lot more interesting than "some programmer added a buffer length check and thus MAY have fixed a security flaw but really nobody has proven that it is possible for an over-length string to get to that point in the code."

My update manager refuses to work ever since I updated to 10.10. It either keeps telling me to authenticate or it keeps telling me it is waiting for apt-get to exit. I use apt-get install for every update now until someone can update update manager. It is interesting that when I updated linux-image-2.6.35-23-generic it told me that an additional 139M bytes of disk would be used, and when I updated linux-generic it told me that an additional 169M bytes of disk space would be used, so a total of 308M bytes of disk space.
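For anyone stuck the same way, the plain apt-get sequence that routes around update-manager is just the standard one (it doesn't fix update-manager itself, it only avoids it):

    sudo apt-get update          # refresh package lists
    sudo apt-get upgrade         # apply updates that don't add or remove packages
    sudo apt-get dist-upgrade    # also allow new kernels and dependency changes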

The update manager would bring you new versions of software daily, if they were putting new versions in the repositories daily. In reality, the Ubuntu maintainers only put bugfixes and security updates in the repositories, never new versions of software with new features.

Of course there are benefits to using the software provided by the distribution, such as the automatic updates, but you are in no way limited to using only that. I'm mainly using CentOS 5 right now and it is far from bleeding-edge. If I want the latest version of something I go onto their website and see if they have a package; if they don't, I download the source code and build it myself.

Lately, there has been a gradual shift in Linux hardware support where distros are limiting support for older hardware. I understand why they are doing it, but by doing what Ubuntu is [thinking about] doing, it could literally result in a situation where one day your computer is supported and the next day, it's not. That's not a good thing.

Seriously, I was running Debian Stable happily for months. They release an update that breaks my FireWire (obviously a security update, since it's STABLE), then they release one that breaks my sound. I didn't feel like futzing with drivers and stuff; that's why I stayed on Stable. I went to Kubuntu. 8 months later I figured I would try Debian again. Still broke. I don't know what they were thinking, but stable isn't.

Similar. I'd run Debian testing for years as the "best compromise" but the latest testing has given me so many problems it's more like an old unstable. Moved to 10.04. Finally, an Ubuntu where everything just worked. You can imagine that I don't welcome this news.

Of course, the wise user collects live CDs for recovery/repair/reinstallation and so is more prepared to sort out problems than most Windows victims (who can, BTW, collect PE-ish live CDs for rescue/recovery/emergency internet surfing too).

Yeah, but in the cases where your hardware would no longer be supported, it's going to be fairly old equipment (5-10 years) which, for the most part, people are using as servers, firewalls, routers, etc. I wouldn't be turning on daily updates on a device like that. Worst case, it's a computer you built for a relative that they use just to check their mail. But again, I wouldn't turn on updates unless I were there to monitor it.

In 5 years this may be a quite reasonable approach. It might not. It all depends.

Actually, it depends on a lot of things. Debian testing is usually quite good, but I generally figure that at least once during every development cycle it will hose my system. It's a rolling release, though possibly a little bit less tested than he's thinking about.

FWIW, I've used Ubuntu, and it's not bad, but now I'm using Debian testing again. (There was a period a couple of months ago when it broke my system, so I figured a change was in order.)

You go from one release cycle style to another. Periodic releases to constant releases. And then back.

Each style has its advantages, but in the end you just end up changing for change's sake and no real benefit is gleaned one way or the other. It's a lot like reorganizing resources in a company. You move some people here, you transfer some people there, you change from a horizontal hierarchy to a more vertical one. Then in 18 months you change it back.

In the end, the guys on the ground doing all the nitty gritty work do the same job they've always done and the company keeps chugging along.

That being said, it's usually a case of management losing touch with the guys on the ground that causes this kind of shakeup. I wouldn't be surprised if Shuttleworth is a bit disappointed in how the business is going and is looking to change the sales story for Ubuntu: from the "stable and great" OS it is now to the "cutting edge and always up to date" OS it could be with constant drops.

While I understand that you want the foundation to be fairly stable, that in itself creates a slew of problems. Foremost is that stuff like Firefox, OpenOffice and other user-end apps won't get upgraded to newer versions until the next rollover.

My personal dream would be a distribution where the user end is getting upgraded often and fast while stuff under the hood gets overhauled less often.

A suggestion would be major overhauls of the backend stuff once every two years, while user applications are kept on the newest stable versions. That way developers of backend stuff get ample time to iron out bugs, while users won't have to upgrade the whole desktop just to get a new version of an app.

While I understand that you want the foundation to be fairly stable, that in itself creates a slew of problems. Foremost is that stuff like Firefox, OpenOffice and other user-end apps won't get upgraded to newer versions until the next rollover.

I'm pretty sure I've had Firefox version upgrades in both CentOS and Ubuntu without an OS version change, so that doesn't seem to be a problem.

It used to be a problem. Mozilla has a short support window for older versions of Firefox, and Ubuntu developers have had to try to backport fixes to older versions rather than do a major version upgrade. They have recently stopped this policy and are upgrading all older distros to the most recent Firefox.

I like it better the way it is now. There are PPAs out there for most apps if you want newer versions. This gives each user control over which apps they consider non-essential enough to risk an upgrade for.

Yes, but PPAs can become a royal PITA when it comes time for the next upgrade. Any system that I've heavily used PPAs on has always had some issues during the upgrade procedure - most I had to reinstall from scratch to work out the quirks. Sticking with all standard repositories has never caused a problem.
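If you do stick with PPAs, one thing that can save a scratch reinstall is purging them before running the release upgrade. Assuming the ppa-purge package is available in your release (it's in universe on recent Ubuntus), something like:

    sudo apt-get install ppa-purge
    sudo ppa-purge ppa:some-team/some-app    # hypothetical PPA name; repeat for each PPA

downgrades the PPA's packages back to the stock archive versions, so the upgrader doesn't trip over them.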

For some software like Banshee, I take that risk, but I don't do it for anything but a select few packages.

Personally, I like the "limited rolling" idea. Let Ubuntu release updated versions of the user-facing applications as they become available, while the base system stays on a slower cycle.

So, essentially, moving to the model that Windows and OSX have had for years - a very limited set of 'system' apps that get updates with each OS release, and application software that gets updated whenever. I think that has big advantages.

My personal dream would be a distribution where the user end is getting upgraded often and fast while stuff under the hood gets overhauled less often.

Another thing I would love to see is a distribution where I can decide when I want to upgrade (or downgrade), not some random other guy. Far too often I have run into situations where a new piece of software contained a bug that didn't exist a version earlier, but with standard apt-get there really isn't a proper way to downgrade, let alone install two different versions of the software side by side.

Using source as a workaround or hunting down an old .deb package of course helps, but I would much prefer proper support for this in the package manager.
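(To be fair, when the older version still exists in an enabled repository, apt-get can be pointed at it directly; the package name and version below are only placeholders:

    apt-cache policy somepackage                 # list the versions apt can currently see
    sudo apt-get install somepackage=1.2.3-1     # pin the install to that specific version

But the moment the old version drops out of the archive, you're back to hunting down .debs by hand, which is exactly my complaint.)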

The first thing they teach you on d-l when you complain about a Sid issue is, "did you install apt-listbugs?" If you're not smart enough to read the buglist before you install, you shouldn't run Sid - you should run Stable.

Of course if you're the very first guy to install it you can still hit issues. I keep a local repo of my "last known good" debs so I can rollback if Sid breaks something. The equals key in aptitude is your friend.
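For the curious, none of that setup is exotic; roughly (the package name below is only an example):

    sudo apt-get install apt-listbugs          # warns about grave/serious bugs before installing
    sudo aptitude hold some-fragile-package    # command-line equivalent of '=' in the aptitude TUI

and the "last known good" repo is nothing more than the .debs copied out of /var/cache/apt/archives into a local directory, indexed with dpkg-scanpackages and listed in sources.list with a deb file:// line.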

The biggest problem I've had with Arch is that changes are only tested on a few common packages before being sent out. If you use an obscure package routinely, it might not be tested decently with any given change, especially to libraries.

Once a change breaks something, you're left trying to install multiple versions, locking versions, modifying the source, or other such deep magic. Very quickly, the whole system gets to be too big a hassle to deal with.

I'm sure I'll get some flak for saying this, but this is one place where I think that Gentoo is slightly better than Arch: you have more power to (easily) put together a set of packages that works for you.

That's really only the case if you're installing packages from the AUR that are unmaintained or have a lazy maintainer. With the recent Python 3 switch, everything not in the AUR (since that is user-maintained) was updated to point to the appropriate python executable.

OR, the issue is that you're performing selective upgrades, in which case of course you're going to run into library issues. ANY rolling release OS is meant to be fully upgraded whenever an upgrade is performed; otherwise you risk breaking things.

Once a change breaks something, you're left trying to install multiple versions, locking versions, modifying the source, or other such deep magic. Very quickly, the whole system gets to be too big a hassle to deal with.

Maybe the answer is something akin to restore points in Windows: noting, prior to an upgrade, what system files will be overwritten, backing them up (and the dpkg database) and then installing the update. If there is a screwup, then you can roll back to some point in the past.
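A crude version of that can already be done by hand before a big upgrade; a rough sketch, using the stock Debian/Ubuntu paths:

    dpkg --get-selections > ~/pre-upgrade-selections.txt    # record what was installed
    sudo cp -a /var/lib/dpkg /root/dpkg-db-backup            # copy the dpkg database
    sudo cp -a /var/cache/apt/archives /root/deb-backup      # keep the .debs for rollback

It's not a real restore point (config files and anything outside dpkg's view aren't covered), but it does leave you the old package versions and a list of what to roll back to.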

Well, that's easy. I don't remember the directory path right now (I'm at work), but maybe it's something like /var/cache/pacman/somethingorother; all the old package files are right there. I've had to go back there a couple times, but that's about the extent of the headaches I've ever had.
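(Checked later: it's /var/cache/pacman/pkg on a default install, and rolling back is just reinstalling the cached file; the filename below is made up:

    sudo pacman -U /var/cache/pacman/pkg/somepkg-1.2.3-1-x86_64.pkg.tar.xz

assuming you haven't run pacman -Scc and emptied the cache in the meantime.)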

What is this fascination with change for change's sake? What could possibly be so important that it has to come out each and every day? Then to use a silly statement like "in an internet-oriented world" to justify what sounds more like the capricious nature of a marketing executive than solid rationale for updating software... meh!

As a recent (4 years) user of Linux I find the current release process more reliable than wondering what will happen the next day. Like another poster, I've had a "stable" release break on me.

What is this fascination with change for change's sake? What could possibly be so important that it has to come out each and every day?

The problem is that if you wait six months between upgrades then that means you spend 12 hours downloading and installing hundreds of megabytes of changes and then it crashes part-way through and your system is hosed. I've reached the point where I'm reluctant to upgrade any of my Ubuntu machines to a new release because of all the problems I've had in the past.

If they can release the updates in smaller batches which make fewer changes, then that would reduce the odds of a system not working and taking six hours to fix.

The problem is that if you wait six months between upgrades then that means you spend 12 hours downloading and installing hundreds of megabytes of changes and then it crashes part-way through and your system is hosed. I've reached the point where I'm reluctant to upgrade any of my Ubuntu machines to a new release because of all the problems I've had in the past.

This is the reason I have switched most of my desktops back to Debian. Ubuntu's upgrades are not of very good quality. Have a bridged network interface configured? The upgrade process will run into problems, just to name one annoying example (in today's world of desktop virtualization, bridged network interfaces are not so obscure IMHO and are often automatically configured by your virtualization solution).

The problem is that if you wait six months between upgrades then that means you spend 12 hours downloading and installing hundreds of megabytes of changes and then it crashes part-way through and your system is hosed.

It sounds an awful lot like you're installing new versions as in-place "upgrades". I've never had that work successfully, starting from RH 6.something or so around 1999. Your much better bet is to download the ISO, then install the new version in a fresh partition. Mount all your data like normal (you do have your data on a separate partition, no?), then give the new version a spin. If it hoses something, you've still got your old version on its own partition, and switching back is as easy as rebooting.

Keeping things in separate partitions and mounting as appropriate is one of the key advantages (for me, anyway) of Unix-style filesystems. An example partition list:

20GB partition - OS 1

20GB partition - OS 2

20GB partition - OS 3

20GB partition - OS 4

160GB partition - data

Leftovers - swap, etc.

Create and use more or fewer OS partitions as you find useful. I have Windows XP on one (not used on the bare metal since shortly after buying the computer), Ubuntu 9.10 in the next (thinking about wiping this and replacing with 10.10), 10.04 in the third, and I keep the fourth around to play -- check out Fedora, Arch, Mandriva (when they were still viable), etc. In each OS, I just mount my data partition as appropriate -- generally just as /data, and then symlinked from the appropriate /home/[username]/data locations. (You could just keep all /home/[username] directories in your data partition, but I tend to find that this causes config file conflicts, so I just keep the equivalent of "My Documents" in the data partition.)

This way, "upgrading" is as simple as a full install in a fresh partition. This completely avoids the problem you (and I and many others) have run into: wasting time downloading and installing hundreds of megabytes of changes and then it crashes part-way through and your system is hosed.Install after a clean wipe -- avoid that "not quite fresh" feeling!

This is one of the biggest features I miss about Gentoo. I like running the latest and greatest but stability is also nice. Hopefully this would mean things like same-day Firefox stable releases in Ubuntu and others. Not sure how it would handle big things like GNOME, X, or glibc, but if they can do it well that would be amazing.

Linux is already enough of a moving target when it comes to application development and getting some kind of consistent environment; now it will be increasingly hard (at least on Ubuntu). I can envision vendors spending more time updating their build environments than actually implementing their products.

What distros need to do is have periodic "releases" of the core build and libraries, with applications built on top released as they become ready. Then things like KDE and glibc remain stable, while we get to use the latest Firefox or OpenOffice once they're tested to work in the core environment.

You can choose whatever dependencies you want. If the system doesn't have them, they will be pulled down along with your application. Version changes in libs are usually minor and won't affect your program (except to make it more stable via bug fixes).

Major version changes are usually installed side-by-side with the older version until the old version becomes deprecated and removed. Even then, there is nothing stopping you from shipping or requiring the older version. But if there are bugs in the older lib, it's on you to deal with them.

If you're talking about needing to "be able to release something every day" and you're talking about Ubuntu then the first few days are simple to sort, and I'm sure you could continue a pattern that would keep 90% of Ubuntu users happy:

Day 1: Lighten purple in default background
Day 2: Darken orange in default background
Day 3: Darken purple (but not enough to be back to original shade of purple)
Day 4: Put orange back to what it was
Day 5: Make all window buttons red instead of red and grey
Day 6: Put all window buttons back on the right side of the window
Day 7: Add a clock widget that uses bold
Day 8: Bundle a load of random pictures, slap Ubuntu logos on them and call them "The Ubuntu Desktop Pack" (see Gnome-Look.org for examples)
Day 9: Change the cursor theme so that it turns into an Ubuntu logo when hovering over a title bar - that's a feature, right?
...

So, which one - if any - would be an LTS release? I am really bothered about this. I am so far from the bleeding edge that I only want to move from one LTS release to the next, never mind upgrading six-monthly!
And how about software developed for a particular release? "Tested on Ubuntu release Nov 24 2010, 11 PM GMT+5"?

My experience of Ubuntu Desktop LTS releases has been that "supported" consists of Canonical saying "Yeah, well, the good news is that it ain't getting any worse". And then they roll out an update that shags your drivers.

I know, I know, you get what you pay for, roll your own if you need it to be stable or up to date. But I don't think Canonical is really committed to LTS releases, and I'm pretty sure that individual developers aren't. Interest in N+1 tails off even before release, as all the cool kids move on to the next one.

Since *nix tends to be case-sensitive, they can re-use the first 26 names without collisions, and it will still be in version comparison order. Then I expect to see "0-day 0liphant" and so forth. By the time we get to the plus, minus, and equals, Canonical will have sponsored the naming of 3 newly discovered species such that they can finish the cycle. At 2 per year, that gets them to (2004 + 32) = 2036. That's enough time for John Titor to come back from the future to fix the 2038 bug once and for all, along with the Ubuntu naming conventions hopefully.

Considering that it is possible to push out small incremental updates via the daily update system this seems like a good idea to me.

I look forward to the days where I'll never have to install an upgrade again...

Gentoo has worked for me this way for years, and I presume many other distros can do this too. In fact, I rarely even use the install CD for new machines; if I already have a Gentoo box of the same architecture, I just copy the root filesystem and continue from there. Of course, a new kernel (for drivers) and other adjustments are necessary, but rarely a full install.

This is what I have been waiting for. After my initial excitement about (K)ubuntu release updates to get all the hardware running and supported, I am now at a state where everything is fine. The ongoing new 6-month releases are more of an annoyance than a great feature. Having to upgrade completely every 6 months just to get access to the latest software releases does not seem like a worthwhile effort. Sure, you can say that's what the LTS releases are for. But while the LTS releases do enjoy long-term support, the application versions in them quickly fall behind.

Maintainability: It's hard to maintain a system that has billions of different iterations, and "billions" might be an understatement; your typical system has one to two thousand packages installed. If half of them get a nontrivial update every month (which is a pretty conservative estimate), that's still 125 million (500^3) combinations each quarter. The easiest solution to this isn't a very user-friendly one: make internal 'milestone' markers and force any upgrade that would push beyond them to go through the milestone first.

Really, I am a grown-up. I do not expect perfection. I do not hold update counts against developers; in fact, quite the opposite: I shout huzzah whenever problems get fixed.

But here's what has happened in the past few months. There's something I want to do, and I fire up the software, and I'm greeted with an "update available" modal box which has to be addressed, probably with a "later" because at that moment I hadn't planned to be sys-admin dude.

Recently I tried to upgrade my Ubuntu system from 8.04 to 10.04 (LTS to LTS) by using the bundled distribution upgrade manager. The first upgrade, to 8.10, rendered my graphics card and video card useless. Since the 8.10 version was no longer supported, I had to jump through a bunch of hoops to get good drivers installed so the system was usable again. 8.10 to 9.04 went smoothly. When I upgraded from 9.04 to 9.10, upon system reboot, I was greeted with a message that one of my I/O modules had a problem.

Next time either a) try updating LTS to LTS or b) simply install the new distribution over the existing install. With a dedicated home partition, the latter is incredibly easy, and it offers a nice middle ground between a clean install (losing all your settings) and an upgrade (keeping all those crufty packages you installed but didn't use). Even without a dedicated home partition, it's possible, just make sure you don't format the drive you install to (and maybe manually rm everything except /home).
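If you want the fresh install without re-ticking every package by hand, one trick (plain dpkg/apt tooling, nothing exotic) is to carry the selection list across:

    dpkg --get-selections > ~/my-packages.txt         # on the old install
    sudo dpkg --set-selections < ~/my-packages.txt    # on the new install
    sudo apt-get dselect-upgrade                       # pull everything on the list in

Just be aware it will faithfully reinstall the cruft too, unless you prune the list first.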

3. Offer optional upgrades to the major application packages, drivers etc. as they become available and where possible, and keep interdependencies to a minimum - i.e. compile them against the original distro + any vital security patches, not the latest everything (statically link them if you have to - RAM is cheap now).

The problem with the current system comes for the less technical users who want to (or are sensibly advised to) stick with the packages in the official repositories. Currently, you may find that the only "official" way to get the latest office software is to upgrade your whole fricking operating system. It's like having to take the back axle off your car in order to replace the radio.

Remember this is Linux - if we /.ers want to compile our own kernel, install the latest Firefox beta from a source tarball, reformat the drive as ext6 or scour the interwebs for a suitable .deb of the very latest LibreOffice then there's nothing stopping us. Or, we can switch to a more bleeding edge distro. However, that might work for us, but it won't work for others - and even I don't want to install a new kernel just to run the latest word processor unless it really, really needs it.

The problem is particularly bad with Ubuntu: it can't be "the Linux for the rest of us" and bleeding edge, because "the rest of us" don't want to be obliged to upgrade our whole OS every 6 months just to get the latest OpenOffice.

...it's understandable with commercial software where the company depends on bringing in the upgrade fees, but why should Free Software care?

Use Red Hat/CentOS if that's what you want. (If Canonical is seriously thinking about this, that's probably why.) Most people want updates to the Kernel/Gnome/base libraries/etc... and they want to choose when to upgrade so they can do it when they have time to address issues. Regularly testing rolling packages together seems like a way to let people just apply security updates until they're ready to "upgrade" to the latest rolling package set.

I agree with the 3 points, which I summarize as "be more like Windows release cycles and Windows Updates" -- seems to work well. On the point of Joe Non-technical User, though, I disagree, respectfully, since I don't want to take away from your great post. There's geeks who can do the tarball/compile magic... and then there's everyone else, on the nontechnical side.

In the Windows world, non-techs don't care what version of MS Office they have because the versions are extremely interoperable.

If Linux is maturing as a desktop OS then there shouldn't be a need for 6 monthly, let alone daily, updates.

Unless one wants to use a new version of an application that relies on a new version of a language interpreter (e.g. PHP, Python, Perl) or a new version of a library (e.g. SDL_ttf, which added hinting and kerning controls in 2.0.10 but Ubuntu 10.10 is stuck on 2.0.9).

The problem with the current system comes for the less technical users who want to (or are sensibly advised to) stick with the packages in the official repositories. Currently, you may find that the only "official" way to get the latest office software is to upgrade your whole fricking operating system.

There is supposed to be a backports repository. What luck have people had with that?
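For anyone who hasn't tried it: enabling it is one line in /etc/apt/sources.list (the release name here is only an example) plus an explicit -t when installing, since backported packages aren't pulled in by default:

    deb http://archive.ubuntu.com/ubuntu maverick-backports main restricted universe multiverse

    sudo apt-get update
    sudo apt-get -t maverick-backports install somepackage    # hypothetical package name

What I'm really wondering is whether the coverage is broad enough to be useful.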

It's like having to take the back axle off your car in order to replace the radio.

Why am I getting flashbacks of removing the battery in an N-Gage gaming phone to change the game card?

First thing I'd do is look for the off button when installing a release with this feature. Twice I've had an in-place upgrade hose my Ubuntu system - and usually it results in quirky bugs if it doesn't entirely blow up. That's enough of that nonsense - every couple years I do a full fresh install and copy over all the important files from my old install.

It seems like Ubuntu is going the way of Firefox, Pidgin, and other open source software - making unilateral changes the users haven't necessarily requested or possibly downright don't want. Pidgin's auto-resizing text boxes and Firefox's magic navigation bar are easily on par with moving my window manager's close/min/max button positions.

Lesson for open source: people are often happy with how something already is - put an option in settings to reset it to the old default when you make major cosmetic changes to your software. I wouldn't be using XYZ software if it wasn't already working for me. Thanks

You might be surprised by how many people are leaving Ubuntu over its terrible release cycles. "Want a new version of Pidgin? You can wait 6 months." "Oh, the 6 month release broke your [sound/video/anything], we'll fix it in 6 months."

Arch has used a rolling release forever and while things occasionally break (less often than Ubuntu, ironically), the fix usually consists of waiting a couple days for them to push a fixed version.

Release early, release often... In my opinion it's worked great for the kernel folks since v2.6 was released. Does anyone remember the hell that was upgrading from 2.2 to 2.4, and again to 2.6?

The more things that change at once, the greater the pain will be; it's as simple as that. Holding back all changes and releasing them all at once with a major version upgrade causes as much pain as possible, and people are reluctant to actually upgrade, so testing is limited.

Stability and a predictable environment are more important to me and my work than having the latest bells and whistles.

Just because Windows is constantly slipstreaming updates doesn't mean Linux needs to do it too. If something is really, really, really important, by all means tell me about it, but let me make the decision whether to upgrade or not.

Upgrades for the sake of upgrades are not the answer. My main development box is Slackware 10.2, albeit with upgrades to many development-related packages (kernel, toolchain, and the like).

Right now they're using a system where the two words in the name always start with the same letter -- Feisty Fawn, Jaunty Jackalope, etc. Once they run out of letters to use in this manner, they only have to start using mismatched names. For example, the first version after Zaphod's Zipper could be April Bees. In this manner they should have enough letters to last several lifetimes at the least, even if they skip over combinations that don't produce anything you'd want to slap on a product.

Canonical engineering director Rick Spencer has replied [blogspot.com] to this story. He says:

Ubuntu is not changing to a rolling release. We are confident that our customers, partners, and the FLOSS ecosystem are well served by our current release cadence. What the article was probably referring to was the possibility of making it easier for developers to use cutting edge versions of certain software packages on Ubuntu. This is a wide-ranging project that we will continue to pursue through our normal planning processes.

It is not just forcing everyone onto testing. It also changes the focus of debate. Now, instead of users moaning for 6 months about not wanting their tabs moved from the right side to the left, they will get up one day, run update, and boom, the tabs will move. Then users can argue about not liking it, but it will have already been done.

It's not like this hasn't been done before by other distros. In fact, "rolling updates" was, to me, an advantage of Gentoo. I used to run RH and even RHEL, but I had two major problems with them. a) Too much effort to get sources to compile. For example, I wanted to try the latest wx out from a coding perspective. I banged my head against it for hours and couldn't get it working. Maybe it works now, but with the RH level I had, it wouldn't for me. And b) upgrades from one release to the next were woeful.

For a lot of things Sid is still way out of date. I don't do much Debian work anymore, but when I did I remember coming across tons of out-of-date packages and having to either roll my own Debian package or wait a while. This is certainly the case for lesser-used packages.

That said, I wish to add a question - what exactly is the difference between, say, Ubuntu 10.04 and 10.10? What exactly is a 'latest upgrade'? When they change the way things look and a few policies (such as the default media players)?

10.10 uses the 2.6.35 kernel by default instead of the 2.6.32 kernel and it has better driver support than 10.04 (more stuff just works).

95% of the changes between releases are completely cosmetic but there is the odd thing that makes it worthwhile, for me anyway.

I kinda got tired of the way Ubuntu updated things and stopped using it. They just *move* stuff every 6 months and it was always something that annoyed the hell out of me. On top of that, the kernel updates were pretty regular, which means I needed to reboot. I was rebooting Ubuntu a couple of years ago more than I ever reboot Windows now. Is that still the case? Seems like I was getting kernel updates weekly or bi-weekly.

Agreed. My old creative webcam *FINALLY* works. Now, if I could get Kopete to recognize it I'd be set.

Side rant: Why is it that Kopete can't seem to keep up with Pidgin on the little items when it beats the blue hell out of Pidgin in the big-time stuff like webcam (when it works!), theme integration, identity grouping and the like? I mean little things like 'sort by recent activity', 'last seen', and 'psychic mode'. It's the little things like that which keep me from going back to Kopete full time.