Posted
by
Zonk
on Wednesday April 16, 2008 @05:55PM
from the it-is-a-very-robust-bird dept.

desmondhaynes writes "Is Linux ready for the masses? Is Linux really being targeted towards the 'casual computer user'? Computerworld thinks we're getting there, talking of Linux 'going mainstream' with Ubuntu. 'If there is a single complaint that is laid at the feet of Linux time and time again, it's that the operating system is too complicated and arcane for casual computer users to tolerate. You can't ask newbies to install device drivers or recompile the kernel, naysayers argue. Of course, many of those criticisms date back to the bad old days, but Ubuntu, the user-friendly distribution sponsored by Mark Shuttleworth's Canonical Ltd., has made a mission out of dispelling such complaints entirely.'"

>and we've known it for a long time...
Not really. If that were the case, why was Ubuntu necessary? Ubuntu has definitely made Linux easier to install and use, which was definitely not the case until about two years ago.

Ubuntu (7.10) still has its own shortcomings in configuring things like Bluetooth or Wifi, which I hope will not be there in the 8.04 release.

Wifi for me works great -- never a second's trouble. Bluetooth has gotten better, but I still can't browse my Blackberry. I can detect it, exchange passkeys, and connect very easily through the GUI, but the OBEX still barfs.

If you've bought laptops that many times and tried Linux on all of them, then why haven't you just picked a laptop with supported wireless hardware at some point? I mean - Intel brand wireless that *works perfectly* is a required part of the Intel Centrino(tm) platform - it's not like it's rare or anything.

Seriously, it's like you're punching yourself in the face and complaining that it hurts. I'm not feeling much sympathy here.

Why bother with Windows when NetBSD supports any 32-bit microprocessor?

Finding wireless cards that work great under Ubuntu (or processors that work great under Windows) isn't hard. It makes a lot of sense to select the hardware that you need to run the software that you want to use.

Bad luck, I'm guessing. I got a Dell Inspiron 9300 when they first came out. It ran wireless fine off of a Gentoo live CD. Then I tried Ubuntu about a year later, which also worked fine, and I've even had it run on a SUSE live CD. My brother, with a different Dell, was unable to use wireless out of the box, though; he needed to install a fix for his card.

I think this is more a market-share issue. When Linux gains in numbers, I believe companies might start to pay attention and release Linux drivers.

No, it's getting OEMs to install it that's the trick. Once Dell asks you to pay an extra $50 for Vista instead of Hardy, we will start to see Ubuntu pick up some momentum. When there is a price difference AND an alternative for the consumer when they purchase, the choice is in their hands. Until then, 90% of consumers are just going to work with what's already on their computer.

Get back to me when you do that without broadband. My grandparents use dial-up since it's their only option.

Lots of software comes on the Ubuntu DVD and can be installed without broadband - including enough single-player card games to provide for any grandparents. Hell, if you're willing to leave the computer up overnight you can even install large software packages from the repository over dialup - I've done it myself. But dial-up is really obsolete technology at this point. Even Windows just assumes that you have broadband these days.

What we do see, however, is that devices like EEE PC are making people aware that there is a choice and that Linux is real. Here in New Zealand we can buy laptops preinstalled with Ubuntu in regular retail shops http://www.dse.co.nz/cgi-bin/dse.storefront/48067b6603694d34273fc0a87f3b067e/Product/View/XC5822 [dse.co.nz]. These have been quite popular. They are still quirky: for example setting up wireless is a bit messy (not as slick as windows) and the power management sucks a bit.

I run HH on one of these laptops, which came installed with GG. For the most part, I don't think that HH vs. GG is much of an issue for adoption. What is important is that distros like Ubuntu are very easy to use/update and that devices like the Eee PC are exposing more people to the option. Soon people will be asking for Linux preinstalled on higher-spec laptops and we'll see more choice.

I used to use linux as my primary desktop but next time you think linux is ready for the masses I want you to go to a slightly above average windows user and.....

1.) Explain why their pda will no longer sync with their calendar, mail client, or transfer files
2.) Explain why they can't just plug in more than two monitors and get them to display without editing config files
3.) Explain why they can't use that one application they NEED for work that only runs on windows.
4.) Explain why they can't play [latest high end game]
5.) Explain why [latest high end hardware] doesn't work in linux at all.
6.) Explain why their cheap no name printer doesn't work with linux out of the box.
7.) Explain why the pptp linux client is such a pain in the ass to use.

Before you go into some detailed explanation about how the evil M$ empire is preventing interoperability or how linux is so much more secure and stable remember your average user doesn't give a damn. They want to work/play and they can either do it right away or they can't, excuses and explanations don't matter.

Excuse me? Ready for the masses? Where? I've been working in IT for years and tinkering with computers for about 15 years now so I'd say I've come across my share of problems and I've mostly been able to solve them myself or with the aid of, then, AltaVista and google.

That being said, I tried to install Ubuntu a month or two ago. Well, it appears that the graphical installation is shot. Whatever I did, it wouldn't run on my now at least one year old machine. So I had to download the text install version.

I have two SATA hard drives in there. One houses XP, which I won't get rid of until Cedega actually manages to run ALL my effing games without 'minor problems'. Did you know that I was partitioning with the likes of fdisk and cfdisk back in those days? Did you know I was able to do a dual boot as a sixteen-year-old kid back in those days with an ancient version of SuSE?

Well, don't go believing I was able to partition the disks the way I wanted with Ubuntu, because Ubuntu is made for the masses and the masses obviously don't have a need for partitioning more than one drive: the drive I wanted to partition just didn't show up.

What did eventually install was Mandriva. And it worked... mostly. Except I have two monitors with different resolutions... Man, THAT was unpleasant, but after days of scouring message boards and trying to get familiar with xorg.conf I managed even that. My scanner isn't supported in linux, it seems, so there goes that idea.

Frankly, perhaps it's just me, but on every damn try I run into stupid little problems which take me hours or even days to solve. As long as that remains the case, Linux for the masses remains a myth. As long as we don't have double-clickable install files that guide us through software installation, as long as we have to set up repositories and work with dependencies that go beyond "you need Java!", Linux is definitely NOT ready for average desktop users.

And to those who'd like to mod me a troll, I'm the first person throwing a party the day I can just replace windows with linux. But at the moment I don't have the time to spend hours tinkering with my box. I need that damn piece of equipment to just work.

Oh for crying out loud, how long ago was that? You've been bitching about that for YEARS. I don't even know how you managed to brick your system; I've put GRUB on the primary MBR of dual-boot systems hundreds of times. I just did it with the new Ubuntu on a test server here in my office. No problems. Last week I told my boss's boss how to help his kid do it. No problems.

I've read and followed the new Ubuntu dual boot instructions, a blind chimp could do it, you just clicky along 'till it's done, taking the defaults. It even resized my original windows partition with no problem.

I'm bored enough to argue pedantics this morning. According to the etymology of the word, something is "bricked" when it becomes like a brick; that is a solid and un-operable item best used as a paperweight, doorstop, or building material if you are so inclined. It is very similar to a "coaster" when referring to a burned CDR or DVD (and expanded to any disc these days) that is rendered unreadable. Since a computer with a bad MBR can be recovered via user intervention, even if it requires the user to go to the local software store and buy Norton Disk Doctor or whatever is en vogue with kids these days, it most definitely does not serve the same function as a brick.

The first time I heard the term, it was used to refer to DD-WRT installation. In this case, the only way to rewrite the OS of a commodity wireless router is through the router's own internal software update mechanism, which requires an existing functional OS. It does not have the capability of booting from any external media. If you try re-writing the OS and encounter a failure which causes the device to no longer boot, then you no longer have any means to restore a functional OS to the device. At this time, there is no software in the world that will allow you to fix this problem (if there is, then I would welcome the news). Your only option is to send it back to the manufacturer and claim that it "just stopped working one day" and hope the price of doing this is less than the price of a new wireless router. Until such time, there is nothing you can do in a user serviceable manner to restore wireless routing functionality to the device.

My webcam worked easier in Ubuntu than Windows XP. My TV tuner worked easier in Ubuntu than Windows XP. And my laptop was pretty easy to get working at 1280x800. Never tried a temp sensor. You didn't ask, but my printer worked MUCH better in Ubuntu than XP (somehow, Windows glitched and printed 20 pages of gibberish during the install when I hit "print test page" and got up to get something to drink, figuring now was a good time after setting up most of XP from a fresh install... I haven't risked wasting toner to fix it and just print from Linux).

I won't argue that it's easier than Mac OS X -- never used it -- but I will say it worked for me better than Windows XP. (before ubuntudupe or anyone replies yapping at me, please note the "for me" clause there...)

There are many issues with the grub boot loader and it is well documented. It is actually quite easy to mess up your system and make it appear to be bricked.

The most common version of this issue has to do with when a new kernel update occurs during a NORMAL ubuntu update. You click the orange star, choose update, let it download and install, then reboot and voila...no more access to any hard drive. It happens to my machine every single time.

I have to get out my live CD, boot with it, mount the main HDD partition, go into /boot/grub and edit the menu.lst to change everything back.

If you have customized your menu.lst, you will also have problems the next time a kernel update happens, as the update will wipe out your customizations. So if you have modified the menu.lst file to change the order in which the menu displays your choices and which OS is the default, that will be wiped out and you could lose access to one or more of your partitions (and hence OSes). I have seen this repeatedly, and in the latest situation I had to turn off all updates so it didn't brick this retired gentleman's system.

On my system it changes the hard drive number and I have to either boot with the livecd or remember to modify the menu.lst before I reboot the computer. Total pain.
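For anyone following along, the live-CD fix described above looks roughly like this. The device name (/dev/sda1) and the (hd1,0) to (hd0,0) renumbering are just examples; check your own layout with "sudo fdisk -l" first.

```shell
# Boot the live CD, then mount the installed system's root partition:
sudo mkdir -p /mnt/root
sudo mount /dev/sda1 /mnt/root
# Put GRUB's drive numbering in menu.lst back the way it was before
# the update (substitute whatever numbers apply on your box):
sudo sed -i 's/(hd1,0)/(hd0,0)/g' /mnt/root/boot/grub/menu.lst
sudo umount /mnt/root
```

Then reboot without the live CD and the menu entries should point at the right drive again.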

Now I'm not supporting the idea that the installer bricked his unit. It didn't. I'm saying that making this sort of error and letting it stand for years without being addressed, and then tossing it back into the face of the user (who just might be a retired friend who knows little about computers), is not the way to go about marketing your product.

It worked for Windows, which just eliminates the previous MBR without asking any questions at all.

Somehow, Ubuntu is being flamed for this, even though it puts a lot more effort into playing nice with other OS's than Windows - which nobody seems to criticize here at all.

'Bricking' is when you fubar your BIOS upgrade, or touch a hot wire to some random contact on your motherboard. It means the whole thing is totally and utterly up a creek and it can't be rescued at all.

Rendering your system unbootable, however, is something else entirely. Although you may have screwed up the data on your hard drive to the point of no (or really expensive) recovery, the system as a whole (and even the drive) is still 100% usable with a little bit of work, mostly or all of it in software.

The most common version of this issue has to do with when a new kernel update occurs during a NORMAL ubuntu update. You click the orange star, choose update, let it download and install, then reboot and voila...no more access to any hard drive. It happens to my machine every single time.

Why not set the kernel-related packages to 'hold' status? Then, they won't be automatically updated (the updater will say "package X has been held back" or something to that effect).

When you're in the mood to update the kernel and perform all the liveCD booting that entails, you can manually update that package (unhold, install new version, hold again).

Meanwhile, you can still leave automatic update on to get all other updates automatically without fear that you might be in for a surprise dose of liveCD booting and MBR fixing (which I agree is very annoying if you had stuff to get done when it happens).
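If it helps, the hold trick looks something like this on the command line. The package name is a guess for the Gutsy/Hardy era; list what's actually installed with dpkg -l 'linux-image*' first.

```shell
# Mark the kernel package as held so routine updates skip it:
echo "linux-image-generic hold" | sudo dpkg --set-selections
# Verify -- the selection state should read "hold":
dpkg --get-selections linux-image-generic
# Later, when you deliberately want the new kernel:
echo "linux-image-generic install" | sudo dpkg --set-selections
```

The graphical updater respects the same selection state, so the orange-star updates keep flowing for everything else.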

If you read the comments in menu.lst, you would notice that you are supposed to customize kernel parameters etc. in the commented out sections of menu.lst. On kernel upgrade, the software management system actually looks at those commented out sections and applies your customizations on the kernel boot lines, which it maintains itself.

If you customize on the uncommented actual boot lines, then yes, the customizations will be overwritten, because the computer has no way to know which parts of the line need to be preserved.
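For the record, the commented section being talked about looks like this in a stock Ubuntu menu.lst (the UUID and the vga= option here are placeholders; update-grub copies whatever you put on the kopt line into every kernel stanza it regenerates):

```
## ## Start Default Options ##
## additional options to use with the default boot option
## e.g. kopt=root=/dev/hda1 ro
# kopt=root=UUID=xxxxxxxx-xxxx ro vga=791
## ## End Default Options ##
```

Edit the "# kopt=" line (leaving the single # in place) and your options survive every kernel update.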

January 06. Dude, there have been four major releases since then and a fifth is on the way. I didn't use Ubuntu until August 06, so I caught the one next up from you. The difference between that and what I have now? Pretty much astounding. I can't imagine how crappy the version out in January of 06 must have been.

Also, just so you know, if you don't have a floppy drive, you should have a bootable CD-ROM. Otherwise you're just asking for trouble. And it used to be standard operating procedure to have a boot disk of some sort. Windows CDs are bootable.

How the fuck do you actually brick a PC by installing an operating system? Maybe if the OS is evil and directly fucks with the flash memory on your BIOS (which GRUB does not do). I would suspect that something else went wrong, and you're dealing with proximity in time (and yes, I've had that happen too, having a hard drive crash just as I was rebooting after installing a service pack in Win2k, and spending an hour thinking the installation had fucked up).

Where's Windows NT's instructions? They amounted to "insert boot floppy in drive A and follow prompts". I don't recall any instructions on what happens if NT doesn't boot after it's installed the base system. Could I have blamed Microsoft? Probably. But because I knew what I was doing, and had seen similar failures enough times, I knew generally what the issue was.

Inexperienced users shouldn't install operating systems, unless (and this is the caveat) they're prepared for when things don't work. That is how we learn. So instead of railing on (and, it appears, miscategorizing what happened) chalk it up to experience. At least you haven't blown hardware, which I have done in the past. Your attitude bespeaks somebody who simply didn't have the basic knowledge sufficient to install any operating system.

Perhaps you could point to where newbies can easily find out how to fix the MBR when Windows screws it up. My point is that the minute you decide on an install/reinstall beyond the sort of recovery disk methods you get with a lot of brand name computers, there's a chance it can cause exactly what happened to you. Generally people who don't understand this probably shouldn't be doing any OS installs on their own, period. Everything installs 95%+ of the time fine, but even consumer-friendly products like Windows can get really fucked up, and it doesn't even take an install, I've seen failures because a service pack didn't install properly. There's a point at which someone who doesn't know enough should either not be doing it, or should be prepared to call for help (potentially having to pay $$$).

Does it still require you to edit a configuration file in any situation?
Right.
It's getting better, but it's not ready.

Umm... I was a Windows power user for a while... and on countless occasions I was forced to hand-edit the registry, as well as a number of other files.
Does that mean Windows (XP) isn't ready for the desktop?

Hardy recovers into a graphical "safe mode" with the graphical config editor up if X ever breaks. Xorg.conf actually isn't even required anymore; you can just delete the file and Hardy will work perfectly by regenerating a default config file automatically. This was actually true in Gutsy too.

Now... X doesn't generally fail like that - Ubuntu worked fine for non-technical users for years without this feature - but now your complaint isn't even a little bit valid anymore.
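A sketch of that "just delete it" recovery, in case anyone wants it (this is how I'd do it on Gutsy/Hardy; back the old file up rather than deleting it outright):

```shell
# Move the broken config aside; X will regenerate a sane default,
# or dpkg-reconfigure will write a fresh one for you:
sudo mv /etc/X11/xorg.conf /etc/X11/xorg.conf.broken
sudo dpkg-reconfigure -phigh xserver-xorg
```

Then restart X (or just reboot) and you're back to a working default.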

Quite frankly, I don't want to use the same operating system as someone who refuses to edit any configuration file...Leave Linux to the power users and the server market.

No. Leave *SOME* Linux distributions to power users and the server market. But Windows users have the right to an alternative.

The point isn't that a user refuses to edit any configuration file. The point is that the user SHOULDN'T HAVE to edit any configuration file in the first place! Not to mention recompiling packages, building your own RPMs, solving dependency problems, or having to complain about drivers not working out of the box...

Since I moved to Linux half a year ago, I've had to do a lot of stuff that the ordinary user shouldn't have to. I would love to just click here and there, and WHILE STILL having options, not have to worry about messing around with the configuration.

Tell me, why the heck are you afraid of ordinary users? Musicians, artists, graphic designers, hardcore gamers... they want something that just works. What do you have against that, and what are you afraid of? If you don't want dumbed-down distributions, don't use them and keep your own distro! Linux uses the GPL license for a reason.

I don't mind using the same operating system as an elitist zealot uses - just not the same computer.

Since I'm trying to put myself in a more regular user's position (partly to eventually move the in-laws over when their XP installation gets broken), I've got my own list of things that need work in order to be really competitive. Configuration: yes, usually autodetection and GUI config work. Sometimes they don't. The worst part is the case of X. Some distros like Ubuntu trash X.org autodetection in order to use their own, inferior solution (and Debian doesn't include xorgcfg). That's stupid. Enhance the GUI, but keep the working autodetection.

Quite frankly, I don't want to use the same operating system as someone who refuses to edit any configuration file.

Marketing Linux to the average desktop is a bad idea. Leave Linux to the power users and the server market.

Just because I'm not afraid of editing a config file doesn't mean I want to. I like that in a modern Ubuntu distro I can get everything working with a minimal amount of fuss, and don't like the parts that don't work automagically so I have to go mucking about with config files.

You know what the best part about it is, though? The "it works automagically don't worry" part and the "oops didn't work but don't worry you can fix it with text-editor-fu" part live in perfect harmony. Linux is getting better in the usability department, without sacrificing its "power user" roots. I can't see anything to complain about.

If you want to be an elitist about it, go use Slackware, or any *BSD. You can still consider yourself superior to the poor slobs whose Linux distros don't require config file editing, for whatever that's worth.

Oh, and I may be a power user, but I'm also a gamer, and I want games that run natively on Linux. Besides a tiny subset of games, that's not happening until Linux is the average desktop.

As a power user, I would love for Linux to be mainstream. The more mainstream it gets, the more likely my video drivers are to work, and the more likely I am to have some decent games to play.

As a server administrator, I would love it if all of our developers ran Linux on their desktops. It's still possible to run into surprises deploying from Windows on their workstations (read: laptops) to Linux on the server.

Quite frankly, I don't want to use the same operating system as someone who refuses to edit any configuration file.

Here's the cool part: It's not up to you.

The thing is, Linux -- or, more generally, all open source software -- is for everything and everyone. If there's anyone who can't use it, or anything it can't yet do, that's just another problem to be fixed by anyone who has the time.

And no one can stop it. You can't make it into your 31337 high-school h4x0r club anymore. It's much bigger than that, now.

Configuring a ten-button mouse is a serious edge-case scenario. If you are using this sort of issue to differentiate what a "mainstream OS" is and isn't, then you are shooting way over the target. By definition a mainstream OS is one that hits solidly in the middle of the user base's needs. To that end you'll find that out-of-the-box Ubuntu support for 98% of pointing devices is not only there and quite capable, but actually exceeds what is offered out of the box on Windows. Touchpad devices, for instance, generally just work.

I'd say that distributions like Ubuntu are exactly as user-friendly as OS X. If you use supported hardware and don't want to customize the OS in non-supported ways, everything just works.

Trying to use OS X on badly supported hardware? Needs system-file tinkering and thorough knowledge of how the system works. Trying to use Ubuntu on badly supported hardware? Needs system-file tinkering and thorough knowledge of how the system works.

The biggest difference is that Ubuntu usually isn't bundled together with 100% compatible hardware like OS X and, most of the time, Windows are.

To get an "apples to apples" comparison between operating systems you'd have to compare how easy they are to install and run on hardware that is 100% supported by the OS out of the box. Or the other way around: compare them on hardware that isn't supported out of the box. =)

I'd say that distributions like Ubuntu are exactly as user-friendly as OS X. If you use supported hardware and don't want to customize the OS in non-supported ways, everything just works.

I disagree. First let me say, I use both systems on the desktop daily (and have both in front of me right now). I also have formal education in and have worked in the field of user interface design and usability testing for disparate systems over several years.

I agree that not having the hardware vendor polish an install for their system is a huge source of usability issues. Most users never install an OS, and if you give someone a pre-configured system with OS X or Linux, you've solved a lot of their problems already.

That being the case, however, Linux still has some significant usability issues for many, many workflows and tasks. Linux is outstandingly usable for super-power users who need/want to create highly customized and specialized workflows and are not afraid of learning new interfaces. Linux is fairly usable for a very novice user who has a very limited number of tasks and workflows (Web, e-mail, word processing, playing CDs). It still has some interface issues, but it also has a few usability wins in this regard (such as at the task of keeping this core software up to date). They obviously have not, however, done the extensive usability testing Apple does, but they've hit most of the low-hanging fruit for very novice users.

Linux has a lot of usability and interface issues when it comes to in between users. People who want to add new hardware (webcam, fancy trackball, stylus, braille board, or whatever) are more likely to have usability problems and not just because of lack of drivers. People who want to install and run software for specific more advanced uses such as: video editing, audio recording/mixing, 3D and vector graphics, publishing, or most commercial software like big games and other payware, still have significant usability problems. People still have significant problems trying to perform some common, but advanced tasks: creating a restricted user account for guests, migrating an installed system to new hardware, or sending a friend some software you have installed (but which is not in the repository), or enabling more advanced user interface features.

In short I understand and agree with your point about hardware, but I disagree in general about Linux being as usable as OS X for the gamut of end user tasks. I don't think any Linux on the desktop developer invests significantly in usability testing (based upon their resulting products) and I don't think they will catch the last 20% or so of problems until they do. I don't think they've even done enough work to address some of the fairly obvious problems that you can find and correct without such testing.

Before you take on the elitist attitude, you may notice that I put "plists" in the original text. I've seen these corrupt hundreds of times, with the leading cause being the same as for many other problems: improper shutdowns. In addition, this can happen in a variety of other situations, and a quick perusal of Apple's docs confirms as much. Indeed, a search for "terminal" also reveals many cases where one has to drop to a shell in OS X. While we're on the subject, I should also note that second only to Windows,

lol.. haven't heard of that one. I wasn't actually trolling above, I'm looking forward to trying out the new version of Ubuntu, but the name is.. rather unfortunate? How can you expect anyone to get their friends to let them install 'hardy heron' on their machine?

The Linux software ecosystem is rife with applications that perform the same task as their popular proprietary counterparts. Some of them aren't quite up to par (Gimp), some are roughly equivalent (OpenOffice), and some are leagues better (Firefox). There are more and more proprietary applications being ported to Linux all the time.

If your argument is that there are specific software packages that can't run on Linux, well, the same is true for both Windows and Mac. There are many Mac applications that you simply can't buy for Windows and we all well know that the reverse is true.

Neither Mac or Windows come with a system where you can browse from a catalog of over 10,000 applications and install any one of them instantly, for free, with the click of a mouse button.

Want to buy new hardware... well, you can, if you scour the internet for days finding out if it's compatible; you can't just pop down to PC World one Saturday afternoon, pick something up, and know it'll work.

This hardware myth really needs to be put to rest. Linux supports a wider variety of hardware [lwn.net] than any other operating system on the planet. True, there can be a delay between the time that a new device is released and the time that a common Linux distribution supports it. It's also true that some hardware vendors refuse to release their hardware specifications or even cooperate in any way with open source developers but these are very much the exception these days rather than the rule. If you think Windows supports hardware any better than Linux then you have either not used Vista yet or have somehow managed to be the only person on the planet who has never fought with Windows over printer, video, or wifi driver issues at some point.

Want to install some software... sure... if you have broadband, no problem...

Ubuntu and many of its derivatives will ship you a copy of their OS on CD at no charge. No media fees, no shipping and handling. Free. Most of the software that you can install afterward is not at all too large to pull down via a dialup modem. Windows and OS X cost hundreds of dollars each. I would say that I put my money where my mouth is, except that I don't have to spend any of it on Linux at all.

oh, but it might install the software anywhere on your system... good luck learning to grep it.

Not sure what you mean here. On KDE- and GNOME-based distributions, a shortcut to every installed application gets put into the applications menu. Which, by the way, is sorted by the software's function so everything is easy to find. Contrast with Windows where each application goes into its own folder or a folder named after the company that distributed it. Install enough applications and the Start menu becomes large and unusable. Contrast also with Mac, where you have to dig down into a special (and also unsorted) Applications folder to find newly-installed apps.

Fat chance if your friend has just given you a cdrom with software on it!

Why, you don't have any friends?

Okay, unprofessional personal attack aside, Linux-using friends are more likely to give you a URL than a CD-ROM. If someone's giving you a CD-ROM with Windows or Mac software on it, there's a good chance it's warez anyway unless they're in the habit of giving away their legitimate software.

There is, admittedly, a noted lack of high-profile games natively available for Linux. However, there are some good ones [linuxgames.com] available. Recent versions of Quake and Unreal Tournament run fine natively.

But, unfortunately, it's far from perfect. Ubuntu is and has been good enough for my completely non-computer-literate roommate to use when the system is up and running. But there's no way he could have gotten the wireless working on his own (even in the 8.04 beta, I still had to download and install drivers, then muck around with the /etc/network/interfaces file to make it work).
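For anyone stuck at the same point, the manual setup in /etc/network/interfaces looks roughly like this. The interface name, ESSID, and key are made up for the example, and a WPA network needs wpa_supplicant on top of this rather than the wireless-* lines:

```
auto eth1
iface eth1 inet dhcp
    wireless-essid MyHomeNetwork
    wireless-key s:secret123
```

After editing, "sudo /etc/init.d/networking restart" brings the interface up with the new settings.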

Now hold on a second. Would your friend have been able to get wireless working in Windows if the driver didn't automatically install? It frequently doesn't, you know? I can't count the number of times I've done a clean XP install, and had it fail to install sound drivers, video drivers, ethernet controller drivers, or wireless drivers. (But it does helpfully offer to look on the internet for such drivers. How it plans to do this with no connectivity is anyone's guess.)

Every time this happens -- which is often enough to be annoying -- I have to go hunt down individual drivers from individual manufacturers' websites, since half of them seem to need to be proprietary to work at all (the generic Broadcom driver for a Dell laptop, for example, would not install, but the one from Dell's site did). Then I have to burn them to CD, take them to the afflicted machine, and load them that way.

Ironically I usually end up doing this from my Ubuntu laptop, where everything -- absolutely everything -- worked out of the box. Even on Broadcom chipsets, the only thing I've ever had trouble with in the past when it came to Linux, Ubuntu just threw a message box that said something like "Check this box to enable the restricted wireless driver," and presto.

My point, I guess, is that I've never understood why people criticize Linux because Your Mom wouldn't know what to do if something goes awry. While true, it isn't like Your Mom knows what to do when things go awry with Windows either, so what's the difference?

Vista is better about drivers, yes, but in my experience it's still behind Ubuntu. Especially when, god forbid, your driver isn't Digitally Signed and Certified by Microsoft, at which point Vista just refuses to install it. But out of the box, yeah, it handles most of my hardware pretty well. Not as well as Ubuntu has, though.

As for your other point, yes, a computer to the average person is a box with useful programs. In that light, what do you get on a fresh Windows install? Practically nothing.

(Raises hand) Umm, no, that's not actually true. I haven't had a commercial off-the-shelf copy of Windows XP supply all of the drivers for any PC that I built in I don't know how long.

You see, the latest release of Windows XP that you can buy off the shelf is SP2. That most certainly does _not_ come with all the drivers you need for a new system. At most you'll get some generic drivers from peripheral manufacturers that will be several years out of date.

I would say it's quite possible, but until Ubuntu achieves something like widespread availability pre-installed on computers for purchase, it won't matter how ready it is, because few people among the masses will have any experience with it.

Right now, with a few exceptions, it's the geeks advertising it to others. There aren't enough of us really to make an impact (and not all of us are evangelists). Ubuntu or an equally-suitable distro NEEDS to be pre-installed on a larger number of machines than we currently have. Simple.

I run a Gentoo workstation for work, where I set up things exactly the way I want them, but this is quite time consuming.

I also have a "media center" type box with Ubuntu that the family uses to get and display multimedia content. This box is almost maintenance free, no viruses, no problems. A Windows machine would have given me a lot more work and it would have turned me into a pirate :-)

Normal people don't install operating systems, they buy a machine in a box at the computer shop. While I agree that Ubuntu is the distribution that is closest to being ready for mainstream desktops, it has to get pre-installed on those machines in order to really break into the mainstream market. So far, it hasn't. Dell went with Ubuntu, but they aren't exactly pushing their Linux offerings. Asus chose Xandros for their Linux machine. HP have chosen Suse (Novell). Their machines are or will be on sale at the local computer shop. I don't think it's any coincidence that both those companies signed patent agreements with Microsoft. I imagine Microsoft's legal team can be pretty scary if 99% of your business is based on selling hardware to run their software.

Some distributions do provide that, such as Linux Mint. Also, the Dell machines that ship with Ubuntu include DVD support, and mp3 support with Ubuntu is just a mouse click or two away when you try to play your first mp3 file.

As much as I hate games, and hate to admit it, until you can go down the street to your local big-box store, buy a game, and have it "just work", it's not ready for "the masses". "The masses" want to surf porn, buy stuff from eBay, and play their silly computer games.

For actual useful work, in a company with an IT staff, Linux and BSD have been ready for a while now.

It depends on which masses you refer to. Linux covers about 90% of the Windows world, and it's definitely the most important 90%. People can and do switch desktops to Linux. Maybe not as often as you'd like, but they do it.

The problem is that the other 10% is crap like Clippy and ActiveX that no one on Linux wants to have or implement, but that makes a certain number of computer users more comfortable. Windows does so much hand-holding by default, and that's one of the things Linux users hate about it. But it's necessary for a number of people who can never remember the difference between business and friendly letters, or for people who are too afraid to even click Settings... let alone dick around with it a bit.

It doesn't help that Linux is mostly marketed by the community as being "Almost-Windows" or "Free Windows", instead of as a product that stands on its own.

People have said as a joke that OpenOffice.org or similar programs will take over once they have their own Clippy, but many a true word is said in jest.

Ubuntu gets better with each release. When I first put Dapper on my Toshiba laptop, I had to fiddle around with the boot menu to get it to work correctly, and I had to remember to do this every time a new kernel was installed, otherwise the laptop would stuff up on its next reboot. Subsequent releases didn't require this switch though.

The BIGGEST fix they've provided (and I'm sure everyone agrees with me on this) is the failsafe mode if X screws up. Who remembers about a year ago when the X server was updated and it killed the desktop? They quickly remedied the situation, but for a lot of people I imagine it either made them reinstall or switch back to Windows. Luckily I managed to downgrade my version because I hadn't cleaned out my archives in a while.

That's just a lack of realism on Linus' part, then. Anyone who's worked more than a few months in IT can tell you that not only are users stupid, they tend to be complete idiots. People REALLY DO need that much hand-holding, and while I don't like it, I can at least accept it.

The masses will accept nearly anything put in front of them which is intuitive enough, and familiar enough, for them to comprehend. Eventually, Linux will take over. When it takes over is up to the hardware manufacturers.

This has two components. When the OEMs gather up enough courage to escape Microsoft's shackles, and when the device makers decide that developing open drivers is worth their time, Linux will flourish. Until then, every year will continue to be the "year of the Linux desktop". How many of these are we up to, 12?

The two main culprits right now are Dell and Nvidia. Dell needs to release the sales numbers of their Linux desktop systems, and Nvidia needs to abandon their binary-only driver approach.

1. - "Why install Ubuntu when I can just use Windows for free?" Note that by "free," I'm referring to the presumption that it was free with the purchase of a PC, not infringing copies.

This is why IE won the browser wars. Before the integration of IE4, Web browsers either had to be installed manually or were provided by the OEM. The OEMs usually bundled Netscape. Microsoft integrated IE into Windows and changed the OEM licensing so that Netscape-bundling OEMs were punished. You could still download Netscape manually, but why would you want to? Most non-nerds don't care about the browser but rather whether or not it is there at all. It is nothing short of a miracle that Firefox campaigns have been succeeding in getting ordinary folk to install and use Firefox over IE, especially after IE7 came out.

2. - "Windows is just fine. Why bother switching?" This one is all too familiar to Mac evangelists as well as free OS advocates. This, along with ridiculous prices, is what keeps Apple in the minority. My statement about browsers applies equally to operating systems: people just don't care. They will most likely choose whatever runs what they need at the cheapest price. Ubuntu and other distributions have gone a long way in fixing this, but in order to "convert" someone you would not only need to get them to install Ubuntu but also get them to use Firefox instead of IE, OpenOffice.org instead of Microsoft Office, GIMP instead of Photoshop, Thunderbird instead of Outlook, etc. Yes, you can run most of this stuff in WINE, but the experience is so much smoother with native apps, and users will notice this quickly. Additionally, if everything they run is just run in WINE, there isn't really much of a point, from their perspective, of running Ubuntu over Windows. Windows gives them better compatibility than WINE and is already bundled by almost all OEMs. Might as well stay with Windows.

Yes, but no more so than Mandriva 2008.1. I installed it this past weekend and it is about as slick as I have seen any Linux installation thus far. Everything just "works", and works well. It is gorgeous, fast, easy to use, seamlessly knit together, simple to update, loaded with helpful admin tools, and full of packages.

It is nice to know there are many decent choices for a high quality Linux desktop experience!

In no way do I want to disparage the efforts of all the people working on various Linux distributions - especially not Ubuntu, who have probably put in more than anyone in recent times - but it seems to me that the mob that has done the most to bring Linux to the masses is Asus with their Eee PC laptop.

1) They've put it on a desirable, useful, practical, cheap ultra-portable laptop that people want for its size and neat-ness (and low cost)

2) They've made it simple to use and focused on the core applications and best parts of Linux

3) They've made it open source (well, maybe not by choice) and accessible for developers

4) They've sold millions of them, in a single stroke bringing Linux-to-the-desktop to more users than (I would guess?) ever before.

5) Probably most importantly, they've scared the living SHIT out of Microsoft who are now scurrying around trying to get a lightweight version of XP together to match it, which is almost 100% the opposite of what they're trying to do everywhere else (ie, make people buy Vista).

This is a very telling remark, mostly because it's been around for a decade.

When Linux kernel 2.0 came out, it was "ready for primetime," and the only people who said otherwise were trotting out complaints dating from the bad old days. The 2.2 kernel, same thing. 2.4, again. People who might be half-interested in trying Linux are more than a little leery partly because the community has been saying "it's finally ready for you now -- we've fixed all of those bad things you've heard" for half a generation!

Is Ubuntu ready for the consumer? Yep, I'd say so--I installed it for a friend, and he loves it. That doesn't change the fact that people are suspicious of apologies about "previous" problems.

You can't ask newbies to install device drivers or recompile the kernel

You know, I remember a time when casual computer users used to make special boot floppies with special memory configurations just to play games. End-users can cope just fine with complexity. Linux hasn't been too complicated for at least a decade.

Now you can argue that Linux is more complicated than the competition, and that users prefer the least complicated options, but that's not the same thing as saying that Linux is too complicated. "Too complicated" means that end-users would be unable to use Linux even if it were the only option. That hasn't been true for a very long time.

And come on, average end-users don't have to recompile the kernel anyway. That's a stupid stereotype that brainless pundits say reflexively. Installing device drivers? Last time I checked, other systems need users to install drivers too.

I'm typing this on an ASUS Eee PC and loving it. All my Linux-centric frustrations seem unable to happen on this tiny machine. Guess it doesn't support them. :)

Want my suggestion? Go for more generic names in the apps. In Windows, it's "add/remove programs". In Linux, the closest thing I can think of is the oddly-named "synaptic". If you tell grandma to run "synaptic" to install something, it just creates more confusion.
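To be fair, Synaptic is just a graphical front end for APT, so the underlying operation is simple -- but grandma shouldn't have to know any of this. A hypothetical wrapper showing what "install something" boils down to on the command line (the package name is only an example):

```shell
# Hypothetical helper: what "Add/Remove Programs" amounts to under
# the hood on a Debian/Ubuntu system. "vlc" below is only an example.
install_pkg() {
    sudo apt-get update          # refresh the package lists
    sudo apt-get install -y "$1" # fetch and install the named package
}
# usage: install_pkg vlc
```

Two commands, but every word in them (sudo, apt-get, update) is jargon to a newcomer, which is exactly the point about names.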

Stop prefixing things with "K" just because it's for KDE or whatever. Stop with the ultra-shortened names for full-blown applications, with 3-4 decimal points for versions.

Don't tread into trademark infringement by reusing exact names for things, but aim for something a bit more streamlined. "GIMP" is guilty of over-acronymizing (with a recursive acronym inside the acronym) and just sounds goofy. Perhaps a tiny bit of marketing, at least on the app names, would help things a bit.

No, no, no. Did OS X work perfectly on this random Dell that you tried to install Hardy on?

Seriously. When you first started using OS X, you bought a new machine that was specifically built to run that OS. Comparing that experience to trying to install Ubuntu on random hardware is absurd. If you want to compare your OS X experience to anything, compare it to a Dell with Ubuntu pre-installed.

Now is this the fault of the developers or of the hardware manufacturers? It's the hardware guys, IMHO, because there is a huge lack of decent drivers for the important hardware. I'm looking at you, ATI. Not to mention, all the 'no name' (or 'cheap') hardware out there (the kind that comes bundled in low-end machines) rarely has Linux drivers. Either the manufacturer doesn't have the resources to put out a Linux driver, or they see the Linux community as too small and insignificant to even bother.

YES!!! Hypothetical: you're new to Linux. Someone tells you to open up a "terminal". OK... You don't know what a terminal is, but it resembles that command-line thingy you've seen once or twice (and reminds you of the hackers in the movies). Now you see "sudo". wtf?? apt-get... wtf again. This kind of "just type in garble garble garble" banter just helps keep our operating system exclusive to us. While I don't think this year is "the year" (has it ever been?), Ubuntu has definitely made things easier.

It may seem easy to you when you've been learning about computers for years... it's really hard to get into the mindset of someone who doesn't know ANYTHING about computers - I'd probably say impossible in my case, because there are so many things that I just take for granted, having been learning for two decades since I was four. Computers are actually pretty complicated :P Actually, installing drivers these days is the same as installing any other piece of software, really, but it has the potential to seriously break things if it goes wrong.

I run one monitor at 1280x800, the other at 1280x1024. Admittedly it took some minor edits to xorg.conf, and a five line script to switch between single screen and dual screen, but it's certainly possible. Here [blogspot.com] is a brief tutorial.
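For the curious, the five-line switcher is roughly this shape (a sketch only; the output names LVDS and VGA and the modes are assumptions that vary by driver, so check `xrandr -q` for yours first):

```shell
#!/bin/sh
# Sketch of a single/dual screen toggle. Output names (LVDS, VGA)
# and modes are placeholders; run `xrandr -q` to find the real ones.
toggle_screens() {
    if [ "$1" = "dual" ]; then
        # extend the desktop: external monitor to the right of the panel
        xrandr --output LVDS --mode 1280x800 \
               --output VGA  --mode 1280x1024 --right-of LVDS
    else
        # back to the laptop panel alone
        xrandr --output VGA --off --output LVDS --mode 1280x800
    fi
}
# usage: toggle_screens dual   (or)   toggle_screens single
```

This needs xrandr 1.2 or later, which is why the xorg.conf edits were minor: the Virtual line just has to be big enough for both screens.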

This is one of the areas that Ubuntu has the most room for improvement. I'm hoping that Hardy will resolve some of the problems.