
MonsterTrimble writes "Ubuntu 10.04 LTS Beta 2 is experiencing a major memory leak due to patches for X.org. 'An X.Org Server update that was pushed into the Lucid repository last week has resulted in the system being slower and slower as it is left on, until it reaches a point where the system is no longer usable. ... In order to make the Ubuntu 10.04 LTS deadline, the developers are looking at just reverting three of the patches, which brings the GLX version back to 1.2. Ubuntu developers are now desperate for people willing to test out this updated X.Org Server package so they can determine by this Friday whether to ship it with Ubuntu 10.04 LTS or doing an early SRU (Stable Release Update). Right now this X.Org Server that's being tested is living in the ubuntu-x-swat PPA.'"

How come this wasn't caught when they were profiling? Notice I said "when" - the X.org people aren't seriously deploying patches to such a crucial app without profiling first, are they?

Because this isn't a patch or bug from the "X.org people". It's a patch Ubuntu applied to X.org for GLX 1.4 support or something like that. So the question should be: why aren't the Ubuntu people profiling before releasing patches?

I have profiled lots of things with Valgrind, and what you say is true. However, to the best of my knowledge, the xorg-server is still single-threaded. Mind you, I haven't been in the source in a long time, so maybe the docs are out of date?
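For what it's worth, hunting a leak in a single-threaded server like this is the classic Valgrind memcheck use case. A rough sketch of the invocation (the binary path, display number, and VT are illustrative, not taken from the wiki or TFA):

```
# Run the server under memcheck on a spare display; painfully slow,
# but fine for reproducing a steady leak over a few minutes.
valgrind --leak-check=full --log-file=valgrind-xorg.log \
    /usr/bin/Xorg :1 vt8
```

Since the server is single-threaded, the leak report isn't muddied by allocator races, which makes this kind of run unusually readable.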

I never understood backporting. Sure, it gives the illusion of stability, but you're relying on a much smaller set of developers (those for your OS), who may or may not understand the upstream code well enough to make smart decisions, gluing code in and calling the result a "stable" release version.

Considering how much of the development actually comes from Red Hat, it's likely that at least they understand the upstream code well.

One reason I will always pick Debian stable over running Red Hat / CentOS with its three-year-old versions of software that have "backported" fixes.

RHEL/CentOS software is only three years old if you're running a three year old version of RHEL/CentOS. The upcoming RHEL 6 will include newer software.

Red Hat probably spends more time testing their backported fixes than the upstream developers spend testing the original code.

Except for the June 2006 "Dapper Drake" release. I believe it was their first LTS release. They should delay this LTS release too. Who the heck wants a buggy, memory-leaking X.org version, or an outdated version of GLX? Some advice, Ubuntu devs: Wait. Get the bug fixed. Get it right, then release. The world won't end if Ubuntu is two months late.

I suppose some may argue that this calls into question the wisdom of Ubuntu's release schedule. On the one hand, having a rigid release schedule means that they are always scrambling to get everything in place on time. With testing times more constrained, more bugs may creep into the release.

On the other hand, the pressure of a schedule can get people fixing problems sooner than they would otherwise have. Ubuntu is under a time constraint, so they are asking for help with testing, and they are putting pr

The Ubuntu Wiki has details on this issue at the GEMLeak [ubuntu.com] entry. It provides instructions on how to upgrade to (and remove) the candidate packages in the PPA. This comment is worthy of note for those already on Lucid:

This does not affect cards using proprietary drivers or not using DRI2. Intel will always be affected since DRI2 is used with and without KMS, ATI uses DRI1 without KMS.
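The wiki's upgrade-and-rollback steps boil down to adding the PPA, upgrading, and purging the PPA if things go wrong. A rough sketch (the exact PPA path is on the wiki page; the one below is a guess, and `ppa-purge` assumes that package is installed):

```
# Add the candidate X stack from the team PPA (path illustrative)
sudo add-apt-repository ppa:ubuntu-x-swat/x-updates
sudo apt-get update && sudo apt-get upgrade

# To back out: ppa-purge downgrades the PPA's packages
# back to the versions in the main archive
sudo ppa-purge ppa:ubuntu-x-swat/x-updates
```

The nice part of the `ppa-purge` route is that testers can try the fix without stranding themselves on unsupported packages at release time.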

I understand that fixed release dates are useful for planning, but I think Ubuntu has put too much emphasis on them. Software should not be released until it is ready.

The idea of releasing it on schedule, with this big bug in it, and then issuing a quick fix when it is ready (one of the options discussed) is silly and rather deceptive. If what they have on April 30th is only beta quality then don't call it a release just so you can say that you stuck to your schedule.

Slips happen in real life. Vendors fuck up. Planes get grounded. The paperwork takes longer than you thought. You're just plain out of Iridium. The inspector wants Euros and you only have...never mind, the point is, Things Happen.

If they don't slip the date, then Ubuntu can never be trusted as a product ever again. What bugs will be in the next release, with a planned quick fix "right away"? I've always said that if your best friend, whom you would trust with your life, says, "I

If they don't slip the date, then Ubuntu can never be trusted as a product ever again.

That's a bit harsh, don't you think? If it is clearly a regression caused by a particular patch, they could release without that patch and just note in the release notes that a particular feature that was promised had to be dropped at the last minute.

Or perhaps you view the errata section in your daily newspaper as proof that it can never be trusted as a reliable news source again. :-)

"If they don't slip the date, then Ubuntu can never be trusted as a product ever again."

No problem: this very bug is a clear indication that the Ubuntu people can't be trusted on software engineering to start with, so there's no difference if they trash their record a bit more on the schedule as well.

Just think a bit about it.

Why is the release schedule in danger? Because of a serious bug discovered at a late date.

Why was such an obvious and serious bug discovered at such a late date? Because it's due to a...

Actually they are speculating that they may release on schedule, without the bug or the enhanced features that the patch which contains the bug provides, and then later issuing an update which includes the extra functionality once the bug has been fixed then properly tested and verified.

Ubuntu has chosen a fixed release; it is a tactic, one of many for dealing with the reality of running a Linux distro.

Others do rolling releases, which means they can release a new version of any package when it is ready, but it also means you are near-constantly updating; if you don't, you risk missing out on a change that turns out to be essential (going from 6 to 8 might miss an essential config change from 7).

Ubuntu, however, now faces a near impossible choice of which version to go for. If they wait, other packages wil

The two choices are scope-based releases or time-based releases. Scope-based releases allow for long delays, which sap confidence and morale. Time-based releases have been shown to be an effective tool for improving the quality and morale of large, complex open-source projects.

You talk as if Canonical could have every team dance to their schedule the way Microsoft can with its teams, but they can't. There's always some semi-important package, be it the kernel or X or GNOME or KDE or OpenOffice or Firefox or all the server packages and so on, that doesn't align well with their schedule and has some version that is sorta but not really ready for release, and if you keep waiting you end up like Debian, with delays longer than Ubuntu has between (non-LTS) releases. Every six mon

If you want the latest bleeding edge packages, you risk a lot of instability and potential for breakage/bugs. The new packages need testing, especially when they are all combined together as a distro does, ideally on as much hardware as possible. This is the position that Debian-unstable and Ubuntu are in -- they sacrifice stability for being up-to-date.

If you want a rock-solid system, you tend to use older packages that are more mature and have gone through an extensive stabilisation period.

Ever since I upgraded Ubuntu to 9.04 last spring, my X.org has been crashing anywhere from startup to a couple of days' uptime. There are no signs of trouble in the syslog, or any other logs; no signs of trouble anywhere until it freezes (cursor and screen freeze, but background processes like wget piped to madplay for streaming usually continue). I know it's X.org because if I disable (only) X.org and leave the console-only setup running, it doesn't freeze even after a few days.
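A crash-at-random-uptime like that is different from the steady growth TFA describes, and one quick way to tell them apart is to sample the server's resident set size over time. A minimal sketch (it samples the current shell via `$$` so it runs anywhere; on the affected box you would substitute `$(pidof Xorg)`):

```shell
# Sample VmRSS (resident set size, in kB) from /proc a few times.
# A leak shows up as a number that only ever climbs.
pid=$$
for i in 1 2 3; do
    awk '/^VmRSS/ {print $2}' "/proc/$pid/status"
    sleep 1
done
```

Logging those samples to a file from cron would show in a day or two whether memory growth precedes the freezes.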

Have you been able to find any signs of other users having similar problems? If not, then my experience strongly suggests that it's a problem specific to your system, either the software configuration or the hardware. Problems with a vendor tend to show up with enough users to create a good deal of Internet traffic on the matter.

For example, one system I used would crash hard intermittently -- sometimes multiple times in a day, sometimes only after several days of use. Red herring #1: For ages I thought

If you read the wiki page referenced carefully, it would seem that the general consensus is that the bug is fixed in the testing packages.
https://wiki.ubuntu.com/X/Testing/GEMLeak [ubuntu.com]
Seems a bit blown out of proportion to me.

10.04 is supposed to be a LTS release, and they are nearing their deadline. Roll back to the "stable" version of X, and push these patches forward to 10.10. Anyone who cares about having the latest and greatest will roll along with the 6 month release cycle.

9.04 was not an LTS version; 8.04 and 10.04 are [ubuntu.com]. However, had you said "8.04 is still full of bugs that nobody gives a damn about", then the rest of your comments would be correct. The biggest question in my mind is why in the world they don't use the LTS versions to at least put up a facade of focusing on stabilization. As an example of the ridiculous changes introduced into the last LTS, 8.04 introduced PulseAudio [ubuntu.com] in a very buggy form, making that LTS unsuitable for anyone who needed sound during its en

This isn't the only video problem in the Lucid Lynx betas. Since upgrading, I've been having a problem [launchpad.net] where x.org sometimes fails to start up when I boot. Presumably this is a separate problem from the one described in TFA, since you wouldn't expect to see a memory leak's effects showing up at boot time.

Jaunty and Karmic were really terrible releases, IMO. The good news for me is that sound, which broke when I upgraded to Jaunty, is now working for me again with Lucid. I'm hoping that Lucid gets nice and stable over the long lifetime it will have as an LTS release. In the past, I'd been upgrading ubuntu steadily rather than waiting for the next LTS, mainly because I wanted my apps upgraded. That was such a miserable experience that I'm planning not to do it anymore; I'll just stay with Lucid until the next LTS.

I like debian and ubuntu better than the other OSS systems I've used (Mandrake, Red Hat, FreeBSD), but this close tie-in between updating apps and updating the OS can really be a pain. The OS-level tweaking has never made my life any better. As a user, I couldn't care less about stuff like OSS versus ALSA. I would really love it if ubuntu would focus more on fixing bugs in the OS while keeping applications up to date, but not gratuitously breaking stuff in the OS just because they want to be on the cutting edge.

Another thing can be a drag about ubuntu is that they aren't very careful at all about keeping Gnome separate from the underlying OS. Anyone who uses a WM other than Gnome with ubuntu is going to run into lots of things that don't work properly, because the developers always seem to feel free to make changes without testing them on any other WM. For example, here [launchpad.net] is a bug in xsplash. It causes problems for people who aren't using Gnome. You know you're in trouble when you have functions whose names begin with "temporary_hack..." This one was not a bug in a beta, BTW, but a bug in a real release.

I've been running lucid for a few days, and I think quality control has significantly slipped in Ubuntu. Yes they are only betas, but with only 2 weeks left before release, I have seen lots of bugs still remaining. Within a couple days I found that screen-saver crashes often, several apps can't properly auto-disable PulseAudio anymore and don't work without hacks, PHP 5.3.2 segfaults, themes didn't install fully on upgrade, and (of course) the memory leak which results in Lucid using up all the RAM in my

Ubuntu developers are now desperate for people willing to test out this updated X.Org Server package so they can determine by this Friday whether to ship it with Ubuntu 10.04 LTS or doing an early SRU (Stable Release Update).

They should have thought that before antagonizing over 80% of the tester community with the windows button issue.

Yes, it IS a petty issue; the problem is that everybody said "We don't want it, please revert, pretty please" and Mark was like "Thank you, your opinions are very valuable; however, just bite it."

So I'm not surprised at all if the tester community has withdrawn. There is a growing feeling that the opinions of the community are being soundly ignored; for instance, these (public) statements from the bug tracker, which I'm going to reproduce without permission:

Jef Spaleta:

First of all I think you put too much weight behind Brainstorm as a tool to drive change inside Ubuntu. You actually shouldn't be at all surprised that Brainstorm popularity has very little influence over design decisions. It's never had influence in any technical decision making and no one in a position of authority inside Canonical or Ubuntu governance has ever claimed that it has. Neither Canonical nor the external Ubuntu governance structures make it a policy to rely heavily on, or even officially review, highly popular ideas in Brainstorm on a regular basis as part of technical decision making or public governance discussion. Were highly popular Brainstorm ideas even discussed in an organized session during the UDS in the run up to 10.04?

The track record of implemented ideas backs up my point. You look really closely at the ideas marked implemented in Brainstorm and they are at best mediocre in terms of Brainstorm popularity. None of the highly popular ideas in Brainstorm get implemented, or even discussed publicly as a matter of technical decision making or governance. Take for example the music store idea. It has a negative voting total and is marked implemented.

It's wishful thinking to suggest that Brainstorm popularity plays an important role in decision making. It doesn't. At best Brainstorm is a dumping ground for random ideas. There's no evidence that the voting process correlates with feature development or decision making at all.

The thing is, Ubuntu has dropped the ball massively with this release; there is simply nothing good about it. Worse still, it has lost contact with its user base: most decisions are now either politically or corporately motivated, or driven by the team of Cupertino rejects that Mark appointed to drive Ubuntu development.

But really, this is interesting. I'll get some marshmallows and enjoy the fireworks. The question is no longer whether Lucid is going to be an embarrassment, but whether Mark will learn anything from it. If Mark learns a lesson, it's well worth it.

I really loved ubuntu, I want to love it again, but right now, I'm just deciding whether to switch to mint or debian.

I dunno about any button issue, but a couple of versions back they pulled the very nice update widget from the system tray, and replaced it with a horrible Apple-style distracting popup and a 'notification area' which is a usability nightmare (big black popup every time my network status changes, for instance, which hovers for about 30 seconds right over my eyespace but is not clickable to say 'yes I know stop bugging me').

Lots of people flagged these 'upgrades' as bugs, explicitly requesting reinstatement of the old, working, behaviour, but Mark himself came on the bug system to say 'no, we're doing it my way'.

Yeah, it was supposed to be a "feature" that you get a notification that you have new mail, a download's complete, or whatever, but you can't click on it to open the notifying program. You have to "know" which program is the one that sent the notification, and search for it on the taskbar/window list, possibly on a different virtual desktop. Hard for newbies, and annoying for oldies.

>It's not an isolated incident. It's a pattern.

That's the main point. It's a red herring to say people are upset over a bun

I've been an open source user and developer since long before there was a Linux. And, I've been a Linux user for a long time. Used Redhat, Debian, and now Ubuntu. I've been using Ubuntu since 5 something. I like Ubuntu. It is easy to install, gets easier all the time. It works, which is really nice. And, it has very good support for things like Flash and proprietary graphics card drivers. You can complain that it doesn't have some detail covered that is critical to you, but that's OK. I've been very happy with Ubuntu.

Well, I was. I always try to test the alpha and beta releases. In the early days I could download the first alpha and it would work. It might get a little weird, but it would work. In the worst case I can remember, the computer would at least boot up to the command prompt. That is, until the 10.4 release. That just plain wouldn't boot until we got to alpha 3. It wouldn't even install. It has been awful ever since. I don't know if it is a problem with X.org, but every time I type in the search field in Firefox I get a black screen. After a few seconds the login screen comes up and I can log in. The machine did not reboot, so it looks like typing in the search field in Firefox is crashing the X server.

Now, back in the early '90s I helped get a little program called xcrashme written and distributed, and after that was around for a few years the X server was damned near bulletproof. What did they do to mess it up so badly? I went to file a bug report. It turned out to be a duplicate; it seems a lot of people have reported the problem. I haven't seen any action on it.

Then there is the little thing about the user interface in 10.4. Nobody in their right mind, at least nobody who had any respect for their users, would change something as basic as the location and order of the window buttons. But Shuttleworth has done just that. The reason? To make room for a "cool" something that will appear in a later version of Ubuntu. The only discussion involved in the decision was the coolness of the feature and the vague technical argument that it somehow reduces mouse movement, because the buttons are now on the same side of the screen as the menus. Oh yeah, like the amount of time anyone spends opening new apps is worth retraining your hands to find the new buttons. On the bug discussion list Shuttleworth would not even admit that human factors might have some validity in the discussion. Only the coolness and the bullshit argument about mouse movement were treated as worthy of consideration. Shuttleworth even posted data showing his own mouse movement. The data did not support moving the buttons, but he claimed it did. He saw what he wanted to see. After all, the new thing is so cool we should all be grateful for the inconvenience.

Why doesn't Ubuntu care about the effect the change will have on their customers? Because they have no customers. They are in it to be cool and to score techie points with other people who do not understand why proprietary software vendors actually try not to piss off their customers. If you don't believe me, ask a human factors engineer why purple is an awful background color for a GUI, and then ask what percentage of the public can read light gray text on a dark gray background. Then look at the new Ubuntu default theme. It sure is "cool". I used ssh -Y to log in from a computer with a different theme so I could select a readable theme and move the buttons back to where I'm used to having them.

The backlash from the users has been astonishing. Even more astonishing is Shuttleworth's "I'm too cool to care" attitude.

At least for now you can move the buttons back and choose another theme. What happens when he puts his uber cool new feature into the UI? I guess I am looking for a new Linux distribution.

That was bad enough... But, then I ran into OO.o Issue #956 (http://qa.openoffice.org/issues/show_bug.cgi?id=956). Have you heard about this one? It was filed May 25, 2001. For comparison current issue numbers for OO.o are now above 110,000.

Note: I am also a teacher and I also hate OO.o. It is feature rich, but bug filled.

Now, I will also say that I used to work professionally (when I was a programmer) on a proprietary office suite that you almost certainly know. It is also feature rich and bug filled. Every day there would be a prioritization of new features over bug fixes. The next version of the software requires new features (even if your product is already overly feature rich) otherwise nobody will buy it. Nobody wants to pay for b

9.10 works like a charm on my netbook, and I have 8.04 (the last LTS release) running on a few servers. The length of security patch support on the LTS releases is quite attractive for servers that don't need to be bleeding edge.

The length of security patch support on the LTS releases is quite attractive for servers that don't need to be bleeding edge.

Not compared to Debian.

I operate a number of servers running Ubuntu, due to decisions made in the past. Inertia is enough to keep us on the platform, in the sense that I don't object strongly enough to go through the pain of migrating them to another distro. The servers run well enough, I suppose, but there's nothing particularly attractive about running Ubuntu on them.

Where servers are concerned, conservatism is a virtue, and Debian Stable is my favourite brand of conservatism. I find it philosophically unappealing to be running on Testing and/or Unstable (which, effectively, is what Ubuntu is) because the benefits don't outweigh the liabilities. Happily, my servers have behaved well so far, in part because I use minimally simple configurations, I check everything that happens on them all the time and I read the changelogs before I patch.

On the desktop, however, I quite like Ubuntu. Pushing out closer to the edge in order to get better hardware support and cool features really appeals to me, because the promise of an improved user experience makes it worth enduring a few nagging issues.

That said, Lucid and Karmic have a few bugs that are really silly. One recent one is the Edit Network Connections applet which (rightly) disables the 'Apply' button when there's only partial address information, but never re-enables it. This is a really basic programming mistake, and frankly I'm amazed it was never caught. Issues with removable devices have become increasingly bothersome as well. Karmic saw intermittent problems mounting CDs as well as USB disks and flash drives.

Most -if not all- of these issues can be laid squarely at the feet of the GNOME devs, who seem to be making more and more amateur mistakes at every release. I'm starting to wonder if they have any QA & testing environment at all. But Ubuntu has made its bed by tightly aligning itself with GNOME's release schedule, so they get to share the blame.

As a poster just below observed, becoming popular makes you a target for criticism. I don't really see a problem (or a contradiction) there. While I support Ubuntu and suggest it to anyone who asks, I still think that prominence means that they should be prepared to meet a higher standard and to address such criticism effectively.

Full marks to them, by the way, for getting out ahead of this issue. If this were a proprietary OS, we'd likely have to wait for the first Service Pack before this issue was addressed. (And of course, it wouldn't be documented except for numerous blog and forum posts peppered across the Web.)

The length of security patch support on the LTS releases is quite attractive for servers that don't need to be bleeding edge.

Not compared to Debian.

Where I don't have Ubuntu running I have Debian, but while Debian Stable is arguably more stable than Ubuntu LTS and often more up-to-date, its releases do drop out of security support sooner than Ubuntu LTS releases.

As reliable as upgrading Debian is (I've bumped many machines Woody->Sarge, Sarge->Etch, Etch->Lenny and so on with no serious problems and the only minor ones being my responsibility) a remote full distro update (kernel + libc + everything else) is still something I'd prefer to do less often if possible and Ubuntu's extended support period increases the chance that a machine will be decommissioned and brought in for rebuild/repurposing before it is needed at all.

So on physical machines that are not usually local to me, I generally go for Ubuntu LTS. For servers that are local (so I can get to a physical console if there is a nasty issue) or that need to be a little more up-to-date, usually Debian Stable. For the desktop/portable, usually a recent Ubuntu. Both are good distributions, and even these days very similar from a server PoV (I've not used Debian on a desktop/laptop for a while; they may have diverged more in that arena), so the choice comes down to where we are in release cycles, how up-to-date the machine needs to be in terms of package versions, and how remote a location the machine is going to live in.

Full marks to them, by the way, for getting out ahead of this issue. If this were a proprietary OS, we'd likely have to wait for the first Service Pack before this issue was addressed. (And of course, it wouldn't be documented except for numerous blog and forum posts peppered across the Web.)

Full agreement there. Even certain other players in the Linux market might be less "good" in this respect. While Debian's mailing lists can be a brutal place to exist if you are neither omnipotent nor immortal (or at least flameproof), both their core contributors and the Ubuntu equivalents seem to give openness the value it deserves more than most do.

It's funny how, when a FOSS project gets to a certain level of popularity (Firefox, Ubuntu), there seems to be a vocal group of people who try to tear it down. Oh my god, a version of Linux that is nearly user friendly; it's not hardcore enough for me!

When I first heard about Ubuntu, I thought to myself, "Great, a user friendly Linux distro!"

When I first heard it I thought "that's the stupidest fucking name I've ever heard".

Then when I first tried it I thought "Man that is WAY too much brown and orange.".

Overall, though, if you ignore the name and change your theme to something a bit more pleasant, it's really pretty slick. If anything has a chance to get people to adopt Linux for general usage, Ubuntu is it.

Either that or LinuxMint, which is effectively "Ubuntu with the ugly removed".

Overall, though, if you ignore the name and change your theme to something a bit more pleasant, it's really pretty slick. If anything has a chance to get people to adopt Linux for general usage, Ubuntu is it.

Either that or LinuxMint, which is effectively "Ubuntu with the ugly removed".

Kubuntu is a little bit prettier with its KDE interface and still has the same polish, but I don't think anyone trying Linux for the first time would grab it over Ubuntu (it's not that well advertised, I'm sure partly so as not to confuse first-time users).

right up until you have hardware that needs something special to work, for example: "ifup eth0; mii-tool -A 10baseT-FD,10baseT-HD; $(get dhcp lease)". Skip the mii-tool step and the card drops 98% of packets; use it and it is rock solid. Nice and easy in Gentoo using the post-up function. Pain in the ass on Debian/Ubuntu.

Um, what? Debian (and by extension, Ubuntu) has had the ability to add things like this to /etc/network/interfaces forever. I've been adding routes, setting up VPN bridges, enabling firewalls, setting up UPnP routes, and doing all sorts of things with that for ages. Adding in a command to run mii-tool sounds trivial.

Did you not know about that file, or was there something about it that was undiscoverable/not doable for you?
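For reference, the parent's Gentoo hook translates to a `post-up` line in an interfaces(5) stanza along these lines (a sketch: the mii-tool flags are copied from the comment above, and the interface name is an assumption):

```
# /etc/network/interfaces (Debian/Ubuntu)
auto eth0
iface eth0 inet dhcp
    # force the NIC's advertised link modes before DHCP traffic flows
    post-up mii-tool -A 10baseT-FD,10baseT-HD eth0
```

`pre-up`, `post-up`, `pre-down`, and `post-down` hooks all work the same way, so most "run a command when the link comes up" needs fit in one line.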

My biggest issue with it is that there are essentially no updates in between releases. NONE. Now, I don't need bleeding edge, but it would be nice not to have to wait for the next release for updated software (meaning, in this case, minor bug and security fixes).

You can get closer to the bleeding edge by turning on the official backports repository, where packages heading for the next release are also backported to the current one. It tends to be most useful in select situations on an LTS release, though.
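Enabling that pocket is a one-line sources.list addition; a sketch for Lucid (the mirror URL and component list are typical defaults, so adjust to taste):

```
# /etc/apt/sources.list fragment
deb http://archive.ubuntu.com/ubuntu lucid-backports main restricted universe multiverse
```

After an `apt-get update`, backported versions become installable; on releases of this era they were not applied automatically, so you pull them in explicitly per package.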

Or better still (but slightly riskier) is to see if there's a PPA from a trustworthy source for any packages you particularly care about. Some of the newer desktop apps really benefit from this. You just add it to your list of sources and updates are pulled in along w

I did that when I was running Debian (Woody) and had nothing but problems with packages not installing because of dependencies (all binary sources; in this particular case it was the next release of KDE, 3.0 or some such, from the KDE repositories). No thanks.

I don't either. I'm only using XP now because the computer I'm using came with a fresh, legit, OEM copy of XP and I'm only paying $100 for the whole rig. I'm probably going to eventually do a binary install of FBSD-8.0, but for right now I'm just happy to have a computer (my other computer is having hardware issues).

I don't know... removing programs with Synaptic or Ubuntu's Software Center always seemed fairly straightforward to me. Heck, I'm running OO.org 3.2 on my laptop and it has Ubuntu 8.10 installed on it, which originally came with (if I remember correctly) OO.org 2.

It's not just FOSS projects, but pretty much anything that people think they can use to make themselves unique. Music, movies, books, cars... There are always groups of people who were the "first" to enjoy something, and when it becomes popular they begin to loathe it for no other reason than that it is popular.

The majority of the complaints against Ubuntu that I have seen do not deal with the popularity or the user friendliness.

Instead, they focus on things like the poor signal-to-noise ratio in support forums, and the cowboy, flying-by-the-seat-of-the-pants approach they take towards the X server. There are far too many critical Xorg bugs in most releases, and this usually stems from all the extra patches they apply to Xorg and their strict adherence to release dates.

I think most people's issue with Ubuntu is it's overhyped. It is not anywhere near idiot-proof enough nor stable enough to be the everyman's Linux, yet it is preached as the holy gospel of Linux usability. Every one of Ubuntu's failures is amplified tenfold, due to its high visibility, making it all the more embarrassing and demoralizing. It is also difficult to support due to its audience of often less technically inclined users.

I agree with your sentiment, but I think there is something to be said for small, light software, and that most OSS projects begin small and light for obvious reasons and then mature into huge, bloated pigs like Firefox. People who like small, light software are forced to continuously downgrade to newer, shittier software. This even happened to Scheme, which started out as a programming language whose spec could be printed on a handful of pages but which recently ballooned into three or four documents addin

Yes, I know it's hard to believe, but the most popular desktop Linux distribution on the planet is indeed still in use. We're still waiting on a report to confirm if anyone is still using Windows or drinking Pepsi.

Yes, I know it's hard to believe, but the most hyped desktop Linux distribution on the planet is indeed still in use.

Fixed that for you. Fedora claims about twice as many active users as Canonical does, and most people consider it to be a desktop Linux distro. Obviously differences in counting methods make it hard to tell with any certainty which one we can accurately call the most popular, but perhaps many of us can at least agree that Ubuntu is undoubtedly the most hyped Linux distro ever. I don't say that to denigrate Ubuntu; I think Canonical's marketing arm is doing fabulous work.

Still, my original question was appropriate. I’m not asking why they sent that to their developers; I never was. I’m asking why it’s being reported on Slashdot before anything really is known about the problem.

To answer my own question, and this is something I only realised after reading the other comments on this thread... it appears that this is news and it’s being reported on Slashdot primarily because of speculation that it has the potential to delay their hard-and-fast 6-month r

This is the reason why hard release schedules kill Ubuntu. The devs slipped 6.04 to 6.06 for similar reasons, and the release was great. Contrast that with the scramble to get 8.04 released on time and then look at the mess it was in when it was delivered. It wasn't stable until 8.04.1. Ubuntu needs to be more flexible. Slip a month, fix this problem, then release. No biggie.

I can understand their zeal with keeping to a consistent release schedule. One of the reasons for starting the project was that Debian releases were slowing down and unpredictable. The steady release schedule is somewhat like public transportation: nobody will use a bus or train system if it operates on an erratic timetable, no matter what the benefits are.

That said, I think you're right about letting the LTS slip a month or two here or there. But that's probably unnecessary for their other releases.

Ubuntu's problem occurred because they have a shipping deadline and a really bad bug got inserted late and detected about a week from the scheduled release. So there's not much time left for testing a fix, another if the first fails, rinse-and-repeat...

Deadlines: "The light at the end of the tunnel is an oncoming locomotive."