
An anonymous reader writes "Is Linux being held back by distributions bent on competing with Microsoft Windows? This article argues that it's a real possibility. Quoting: '... what was apparent early on during my Linux adoption was my motivation for making the switch in the first place — no longer wanting to use Windows. This is where I think the confusion begins for most new Linux adopters. As we make the switch, we must fight the inherent urge to automatically begin comparing the new desktop experience to our previous experiences with Windows. It's a completely different set of circumstances, folks. ... The fact that one platform can support a specific device while the other platform cannot (and so on) doesn't really solve the problem of getting said device working. You can see where this dysfunction of thought can become a big problem, fast."

Linux has a 90% share in supercomputers, a 50% share in servers (give or take 10%), and a pretty good share of cell phones and other mobiles, if you include Android and other semi-proprietary systems. The only place left to expand into is the desktop, where the market share is at most 5%. So, why not?

I half agree. Linux does not have to be "like Windows" to be suitable as a Desktop OS. It does however help people make the transition, and it could certainly use the market share in order to influence driver developers and video game developers to think of Linux. There is something to be said for keeping the things that make Linux lovers love it, but this is the beauty of having hundreds of distributions.

While I am more techy than most of the people I work with (hence I am sitting here reading this at work), most of the folks around me look at PCs simply as a tool. Can't teach them new tricks? Bollocks. A lot of my time is spent working with business teams who are looking to improve their way of doing business, teaching them how different programs can be used to get the information they want.

Want to find your current sales trends in a way that you haven't been able to before? Okay, well, we have the data in this thing called Datawarehouse. Our reporting team will be able to provide you a set of reports, but they take a long time to develop and check. If you want to do some quick, nasty analysis to fend off a crisis, there is a program called TOAD that will let you query your data directly. Looks difficult? Let's go through how it works and how you write a SQL query.
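For what it's worth, here is a minimal sketch of the kind of ad-hoc query being described, using Python's built-in sqlite3 in place of TOAD against an Oracle warehouse; the `sales` table and its columns are invented purely for illustration:

```python
import sqlite3

# A toy, in-memory stand-in for the data warehouse.
# Table name and columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "2010-09", 1200.0),
     ("East", "2010-10", 1500.0),
     ("West", "2010-09", 800.0),
     ("West", "2010-10", 950.0)],
)

# The kind of quick analysis a business user might run:
# total sales per region and month, most recent month first.
rows = conn.execute(
    """SELECT region, month, SUM(amount) AS total
       FROM sales
       GROUP BY region, month
       ORDER BY month DESC, region"""
).fetchall()

for region, month, total in rows:
    print(region, month, total)
```

The point of the anecdote holds: the SELECT/GROUP BY/ORDER BY pattern above is roughly all a non-technical user needs to learn to answer "what are my sales trends" without waiting on the reporting team.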

Result: In the last two years, I have introduced around 100 users who are NOT tech-savvy at all to the wonders of SQL queries. They are now at various stages of competence, but they are using new things.

My (belated) point here is that while something like Toad (or now replace with Linux) isn't something that they can just pick up and run with, if people see a benefit to it, they WILL make the effort to learn how to use it.

In my mind, Linux really needs to advertise the benefits it has to the ordinary person so that they are enticed to make the effort to learn how to use it. Having said that, the easier it makes this learning process, the less advertising it has to do.

No, it's exactly what's right. Linux is not ever going to have a "one-true distro", no matter how much you demand it.

If that means that 'ordinary' people aren't going to use it then I can't say it bothers me, not in the slightest.

Hell, 'normal' people aren't even going to install a new OS on their computer, ever. In a lot of ways that makes this discussion completely irrelevant as the people who need to be persuaded are manufacturers and distributors, not users. If the likes of Dell started to offer something like Ubuntu as a Windows alternative across a decent proportion of its range (instead of offering only a few, generally pretty poor machines) then that would help adoption I suppose.

But as I say, it's kind of irrelevant. Desktop linux is awesome for my needs and somehow development has struggled on and improved for 15 or so years.

Uh, what? If the GUI is just a fancy, specialized program for editing the various dotfiles and stuff crammed in /etc, then it does no harm to the person who actually likes messing around with bare-text config files.

- Abandonment of 99% of the distros

Why abandon them? Call Ubuntu "Linux Home", Debian "Linux Professional", and "[favorite distro here]" "Linux Ultimate". There's no need to eliminate pro-friendly distros - that's the beauty of it. You just make a new one that caters to the beginners, and let it take care of that market. The Roadrunner doesn't run the same distro as the Droid, to put it poetically.

- Acceptance of proprietary drivers when offered (normal people don't give a damn about open source philosophy)

I believe in open-source, not because it is ethically mandated, but because it produces better results. As such, I expect that, eventually, open-source drivers will be better than the proprietary ones, at which point the natural choice would be to use them. Whether the manufacturers choose to assist the open-source team is up to them.

- Provision of real, available, phone-based technical support

I fail to see how this is a negative. At the very least, we get a scapegoat to point the boss at while we go fix the actual problem.

- Real, complete documentation

Again, how the hell is that a bad thing? I have NEVER heard someone say, "This is great and all, but I really wish the documentation was shoddy, incomplete and half written in Spanish." I mean, look at OpenBSD - plenty of detailed man pages, yet it's a very pro-oriented OS.

I have seen someone mocked for buying one package when some pinhead thought another would be more appropriate for the application. It was something like, "Well, what did you expect picking that? It's like you wanted to fail." Most people here have seen PLENTY of derision of new users.

Open-source is actually quite newb-friendly. I, being a fool, started my open-source experience with OpenBSD. I couldn't figure out how to mount my USB drive - a quick email, and I got a kind response from Theo de Raadt, the "benevolent dictator" of OpenBSD, telling me what I needed to do. Despite the Weird Al song, it is completely impossible to phone Bill Gates up at home and make him do your tech support.

Why not? Because a lot of the community is poison for end users. That's why not.

You see it as poison, I see it as potential. There are things you can learn from closed-source people. Game developers know quite a lot about squeezing performance out of hardware - that would be beneficial. Windows application developers are used to following a standardized interface - that would be nice as well. There is always something to be learned from everyone.

I see nothing wrong with being a better Microsoft. Arguably, Linux is the Microsoft of the open-source world - you can't get anywhere with your project unless it runs on Linux, it's squeezed out a good chunk of the other open-source OSes, and it's pretty much mandatory for open-source admins to know Linux.

Or, remain "pure", disjointed, and niche on the desktop. Rule the world from the server. Personally I think linux should abandon the desktop. By the time they get there, technology will have made the point moot.

If we don't spread Linux to the desktop, we'll be supporting Windows clients until we do spread Linux to the desktop. Is that really what you want?

Uh, what? If the GUI is just a fancy, specialized program for editing the various dotfiles and stuff crammed in /etc, then it does no harm to the person who actually likes messing around with bare-text config files.

Programmers usually make bad GUI designers.

Usually, the interface should depend on what type of user it is targeted at. If the intended user is a professional, the interface should allow him to customize the program as much as possible. If the intended user is a regular user, the interface should be simpler and more self-explanatory. Compare a tape deck made for studio use with one made for home use. The studio one has many more functions and capabilities that a professional can use, but they would just confuse home users. The home user usually would not care about bias, EQ, tape tension, and stuff like that; they would just want to put in the tape and play/record it.

Another example would be the BIOS setup - what does "Gate A20 - Slow|Fast" mean and why would I ever want to set it to slow? But that setup is intended for those who know what they are doing and not a regular user.

Programmers make interfaces for themselves and other programmers, which means that they suck for regular users.

I believe in open-source, not because it is ethically mandated, but because it produces better results. As such, I expect that, eventually, open-source drivers will be better than the proprietary ones, at which point the natural choice would be to use them.

And if/when the open source drivers are created and are better than the proprietary drivers, I'll use them. For now it boils down to "use proprietary drivers" or "not use the device".

I, as a non-programmer, do not care about openness of the source, since I would not be able to modify and recompile the driver even if the source was available. I could get the same result by modifying the binary with a hex editor - that is, a no-longer-working program. I don't care if the source is open, closed, or the company makes electricity by burning penguins - if the end product is good and I like the price, I'll use it.

And who is to do this? Can you call Microsoft to get help with your problems, without being the IT head of a big company with big contracts? I have never heard of anyone being able to do so. Support always comes from the community: friends, family, and even the shop they bought the computer from. But not from the maker.

- Real, complete documentation

Admittedly I have never really dived into Windows documentation, but the "trouble shooting" wizards have never been helpful for me.

And if you're thinking of documentation of applications... I bet it's as bad for Windows as it is for Linux as it's the developer (person or company) that has to make it!

I had to run a control panel from the command line using sudo in order to make it keep my dual monitor preferences as recently as last year. Of course it didn't tell me that... it just reset to single monitor mode each reboot.

I'd say more (lots of fun with that distro - gave up after 6 weeks), but that's enough. It's working as designed, and broken for end users.

I had to run a control panel from the command line using sudo in order to make it keep my dual monitor preferences as recently as last year.

As recently as last year... So, one year ago (it's November). That's at least two versions back, maybe three. You should try it again. I am especially impressed with the latest, 10.10. I wasn't sure if I would like it, but I do. There are always a few bugs... Sound in particular was annoying while all the PulseAudio nonsense was being sorted out. But I haven't had a problem yet with the latest version. I don't have dual monitors, so I can't vouch for that.

Also, "has a few rough spots" does not equate to "broken for end users." I've installed Ubuntu for plenty of people, and yes, there have been occasional hiccups that I've had to help them fix, but it's completely usable. They're not going to delete their Windows partitions any time soon, but they are happy booting into Ubuntu and using it for various things.

The problem I have with comparing Linux to Windows on the desktop is that I think Windows stinks on the desktop. I may be in the minority, but I want an operating system that is lean and mean, with no zooming windows, special effects, cute audio cues, or glassy curved "kewl" surfaces. I want an operating system to run applications.

I have become frustrated with Linux on the desktop because there is a rush to beat Windows at what it is best at: bloat. The average Windows or Linux install starts with all

Linux can be anything we want it to be. It is, after all, open source and can be modified to suit many different purposes. Should Linux compete directly with Windows? That's a stupid question. Linux should do what the user wants, and if that happens to put it on a collision course with Windows, then so be it.

Agreed, but not every user has the time to spend customizing every aspect of the OS and each application. I share the author's frustration at a "Linux experience" that keeps trying to be Windows-like and ends up feeling like a cheap knockoff. Windows sucks, and most applications written for Windows suck, and everyone knows it; it's the search for a better alternative that drives most users away from Microsoft's smothering embrace out into the wild world of F/OSS in the first place. So why is it so damned

There is hardly a soul on this planet whose life is not touched by Linux in some fashion every single day. Windows has another chunk taken out of it every day; it is death by a thousand cuts. If things continue on their current path, nearly everyone is going to be running around with Linux in their pocket, and soon. I saw a guy today with a Droid in one hand and a Kindle in the other; now that brought a smile to my face.

You're so right. The desktop is moving towards being obsolete -- a work thing. Why should Linux care with the juggernaut Android crushing MS in the real world? Developers, don't even think about the desktop, focus on the phone and the coming andro-pad.

Obsolete? Yeah, maybe when smartphones start coming with a 19" screen. Maybe when net/notebooks get a keyboard that's not like typing on chicklets and add a side-tray for a mouse. Maybe when I can upgrade most of the parts in either rather than having to buy a new one.

Desktops may not be the only option anymore but they're a hell of a long way from obsolete.

I think the GP may have been using a little known literary device called hyperbole.

Besides, Linux is so pervasive now that many people who have never used a computer benefit from Linux indirectly. If they've ever hooked up to the electric grid, or used a phone, or bought anything that wasn't produced locally, they have likely used services that rely on Linux. I think it would be difficult at this point to avoid being "touched by Linux" in some way without being completely isolated from the global economy.

Linux must compete with Windows if there is ever going to be a "year of Linux on the desktop."

That would force manufacturers to release more compatible products, perhaps even contributing drivers to the kernel. It would spur the release of more commercial software, and gather more interest in the open source software that already exists as well as fostering new growth there.

Computers would be cheaper, as there wouldn't be a Windows tax, and additionally there would be more form factors available. How about ARM laptops with 30-40 hour battery life? Oh, sorry, that's not really happening now because manufacturers are afraid their customers will be confused, and they are afraid of losing their partnering bribes - I mean "incentives" with Microsoft.

Linux on the desktop, from the store, for average people, with first-party support, is extremely desirable for the future of computing. One thing that would be nice is to see some Linux games. Oh sure, you can run Wine or one of the commercial variants of Wine, but most people are just going to stick with Windows.

You do realize that switching to ARM laptops would fuck up a lot more software than the OS, right? Also, the laptop would be fucking expensive, because the ARM architecture doesn't have a shitload of manufacturers developing PC peripherals for it (there's a reason Apple switched away from PPC).

Also, a good majority of manufacturers are contributing to Linux drivers, whether it's actual drivers or just specs so someone else can write the drivers.

If the objective is to be a desktop OS that everyone can use, then yes, you are de facto competing with Windows. That doesn't mean doing everything just like Windows does, but it does mean competing.

Also if you want to compete EFFECTIVELY it does mean trying to do the things that Windows can do. That doesn't mean looking or acting precisely the same, but it means being able to handle the same kinds of tasks with the same (or better yet less) effort.

Remember that to most people computers are tools. They have various things they want to accomplish with them, and they want the tool to be easy and helpful in doing that. As such, to win them over you need to be able to accomplish their tasks, and to do so with a minimum of fuss.

Expecting people to be willing to troubleshoot and learn more about Linux is complete bullshit. It is effectively being lazy, it is saying "We can't make our shit work right or be easy to use, so we expect you to pick up the slack and learn to deal with it." That is NOT an acceptable solution, because the response from people will be "Fuck you, I'm not using it then." They don't want to become experts in computers, they just want to use them to accomplish whatever it is they are after.

It is no coincidence that as computers have gotten easier to use, more people use them. Back when computers were first invented not only were they expensive, but you practically needed an advanced degree to operate them. You had to program them in raw machine code, every program was something newly created, you had to solve electrical problems, etc, etc. There were just few people that could deal with that. As things got successively easier, more friendly, the world of computing was opened to more people.

Now it is fine to feel Linux shouldn't go the desktop route, that it should be a server/embedded OS and desktop use should be primarily incidental. However if you want it to flourish in the desktop market then that means it does have to compete with Windows and it does have to get easy to use. "Recompile your kernel," are words that must utterly vanish from any normal kind of support, source code is something a user can't be aware of needing, the command line should be for experts only, and so on.

To try and think otherwise is not only arrogant, but myopic. You only have to look at the world to realize the vast complexities of things out there, and how much we must all specialize. To decide that computers are the one special thing that everyone should want to become interested and expert in is silly.

Also if you want to compete EFFECTIVELY it does mean trying to do the things that Windows can do.

"The things Windows can do" are things that pretty much every OS+UI has been able to do for damn near twenty years. There's nothing magical there, and yes, obviously any desktop OS needs to be able to do those things. The problem is that a lot of people working on Linux distros and software seem to have the idea that "competing effectively" means copying, rather than trying to find a better way to do things.

Look, nobody will ever be as good (or bad) at being Microsoft as Microsoft is. Try to make your UI look like Windows, or your word processor look like Word, and you're not going to fool anyone. Most users aren't going to be impressed at what a great job you've done reverse-engineering Microsoft's crappy standards. They're just going to say, "Why should I go with a knockoff when the original comes free* with my computer?" Chasing anyone's tail, in any industry, is usually a losing proposition. Chasing the tail of a lame, half-blind, diarrhetic horse just means you don't get anywhere very fast and end up covered in shit.

*Yeah, I know. From a marketing perspective, the "Windows tax" makes no difference at all to the vast majority of computer buyers. Deal with it.

Right on brother! I've just spent about 18 months using Linux almost exclusively, (there are a few things that I can't run even under Wine or VirtualBox), and I'm now preparing to return to Windows. I hate Micro$oft, I love the idea of Linux and FOSS, and yet I'm going back to the evil empire. Why?

First I should explain that I'm quite capable of using the CLI to issue commands, configure stuff, etc. And I've successfully edited more config files than I really wanted to, (often piecing together bits of info from the web because I couldn't find all the relevant info in one place). The point being that I'm not a technophobe or a dufus. I'm primarily a hardware designer, but I've written some software, I've used computers heavily since DOS 3.0, and I'm a fairly sophisticated user. But, I really DON'T WANT TO SPEND MY LIFE figuring out why Wine doesn't work any more, or figuring out a workaround for the fact that the structure of CUPS doesn't allow cups-pdf to give me the opportunity to specify my own filename and destination directory on-the-fly. I don't want to waste my time launching a separate app to search for files because Nautilus doesn't have an integrated search function, only to find that the search program doesn't allow me to change file properties. I don't want to waste time installing Dolphin with all its aesthetic ugliness and K-bloat in order to have a decent file manager, only to discover that Dolphin doesn't do partial filename searches and doesn't TELL me that it can't do them. I don't want to have to chase around my system trying to find icons to reassociate with binaries because an update broke the associations somehow.

And I could go on and on in this vein, but I think I've made my point. I use my computer largely for work, and the more time I spend trying to make it functional, the less time I have for either work or recreation. A little bit of dicking around with my computer is fun and educational, (and in fact I did a lot more than 'a little bit' when I first adopted Linux), but beyond that it just gets tiresome and frustrating. I'm much more interested in doing things WITH my computer than I am in doing things TO my computer. When I first started using computers, they were fascinating in and of themselves. Now I want them to be like my car; know a little bit about how they work and how to fix them, expect to do some maintenance and repairs occasionally, but mostly just hop in and drive without a second thought. And as frustrating and far-from-perfect as I've found Windows to be, in my experience it's a lot closer to that ideal than Linux is.

Competing with Windows for customers ranges somewhere between silly and stupid. If you want more Linux on the desktop, you need to court developers and software vendors.

Linux works great as an OS. It has penetrated servers well because the server software (both new and inherited from other Unixes) is great. It has penetrated the mobile market largely because new apps were written for it and the new devices. It has penetrated embedded markets because they write everything they need anyway; the kernel and maybe the C libraries just give them a head start.

What you need to break into the desktop market with established applications from established application providers is applications as good or better. If you give gamers the chance to install games from EA, Valve, Blizzard, Bioware, and id on launch day, they will come. If you get Photoshop or some absolutely full-featured replacement for it on Linux, you'll get many of those users from Windows or Mac. If you get a true replacement for Peachtree and Quickbooks, you'll get more small businesses using Linux as their accounting desktops.

People who seem to understand network effects when it comes to social networking sites, instant messengers, P2P, etc. seem to forget all about them when it comes to desktop platforms. The more classes of application in which your platform is the leading installation target for the best apps, the more valuable your platform is. Linux has this for servers, embedded devices, and to some degree mobiles. If you want it to be a major desktop player, it needs this for desktops, too.

Personally, I use Linux on the desktop far more than Windows and I have for years. I still need some Windows or Mac systems around for the applications I just can't run well on Linux. I say "Windows or Mac" because most of the applications I can't run on Linux properly have versions for both of those platforms.

Linux doesn't even need to take developers from Windows to become much bigger on the desktop. It could become a third platform for companies supporting Win and OS X. It could become a second platform for companies doing Win or Mac. It could even replace OS X as the second platform for some software companies that do windows and Mac now. Adobe comes to mind, as they are practically at war with Apple right now anyway.

> Competing with Windows for customers ranges somewhere between silly and stupid. If you want more Linux on the desktop, you need to court developers and software vendors.

Nope. If you want more users you need preloads. 90% of people would never survive a Windows install if it didn't come preloaded by an OEM who did all the twiddling to have the hardware mostly work out of the box. Anaconda actually does a better job compared to the Windows installer as far as leaving you a working machine when it finishes. Doesn't matter because end users can't use either one and refuse to even consider the possibility.

And that isn't a matter of technical excellence, software availability, or anything competition can address. It's all about illegal monopolistic action. Microsoft signs consent decree after consent decree, and over a decade after their first one you still can't buy a desktop PC without Windows preloaded, except for a couple of bland Dell N series machines that are usually priced higher than the same machine preloaded with Windows.

The netbook revolution almost opened up the market, but Microsoft just dumped XP into the hole until they could convince the manufacturers to kill them off in favor of small notebooks running Win7. Go ahead, try to find a small, cheap, flash-drive-based netbook. All you find is three-pounders with hard drives, crappy battery life, and screens just a smidge smaller than a small notebook... and all running Windows.

You think? I think about 30% of people would never install their own OS. I think if it's easy (and it is), then about two thirds or so of people would be willing to install an OS.

I heard arguments like yours about browsers, too, but here we are looking at usage for non-preloaded browsers of around 50%.

Besides, I don't think your point retorts the OP's point. If Linux had lots of developers (and, actually, it does) then its software would become "good enough" (and, actually, it pretty much is) and then there

I think it's a mistake to pigeon hole Linux specifically for this type of question. A more pertinent question should be more about having an open source operating system alternative to Windows. There's no reason to use generic Linux for that specifically. There is definitely a reason to replace Windows with open source though.

Very interesting discussion. For a time I used Ubuntu 10.04, and finally I think there is a version for the average person. However, there is a problem. I and a bunch of other people have quite a bit of money and time sunk into Windows programs. I've heard all the arguments and have used OpenOffice myself. It is pretty good! But it doesn't have absolutely 100% compatibility with Office, and I don't have time to play around with that unless it works right with Word, Excel, etc. formats perfectly ever

Why do people keep thinking that Linux is a cheap, or free, or open, or whatever replacement for Windows? It isn't. And you can't copy Windows; that would mean you have to wait until Windows does something. http://linux.oneandoneis2.org/LNW.htm [oneandoneis2.org]

Linux should go its own way, and if that takes down Windows, it is a nice plus. Competing with Windows should not be a direction, because that is a fight you can only lose.

It competes with Windows. It replaced Windows for me, and for everyone in my family who wants computer "advice" from me. Whenever Linux does something really bad it's Windows I consider shifting to - then reconsider when I try Windows again. It's the only alternative to Windows in any business I've ever worked with or for (and that's a lot, all serious businesses, usually Fortune 500 or their ilk).

I agree that Linux should "go its own way". Linux has the zeitgeist, the momentum, the developers, the real wor

Desktops are stuck in a "desktop" paradigm, and so are going to be whatever they are now until they totally disappear sometime decades from now: Windows for most everyone, Macs for some specialties particularly in audiovisual production, and Linux for the very few in either the narrowest range of specialties or the narrowest band of all: those who use the best tool for the job at hand, regardless of what everyone else is using.

But the desktop is disappearing. "Mobile" computing is computing you don't have to notice. Especially as input leaves keyboards behind and all displays become networked and shareable, the GUI will detach from the hardware, to be put anywhere the users want it to be, including merged together. More and more people will do what they do helped by "computers", but they won't be Windows. They'll be Android, or some other Linux variant. Because Windows is like a desktop, and most work is better done without a desktop.

It won't be Linux, either. Linux will have a place in the majority of servers, and there'll be a lot of them. But the "Internet of Things" needs something smaller than Windows, smaller than Linux. It's why even the Mac ditched the old MacOS and is now closely related to Linux, in that it's mostly a (mostly) open Unix variant.

Android is closing in on a majority of smartphones. Around the time it's the majority, all phones that do more than just talk will be smartphones. It's the software and uses of smartphones, and their closely related tablets, that will be what most humans use "computers" for most of the time. Everyone in a developed economy will have their mobile device that's their key to accessing all the people, things and info in their world. Windows will be stuck on desktops, where the first small segment of humans started using them. The rest of the world, most of it, will be using the descendants of Android in ways that Windows can never approximate.

I've been using computers since the C64 as a kid. I'm geeky enough to use Slashdot. I've used Linux on and off since Slackware 7-ish (with all the version-number skipping). Dabbled with some CS classes. I've used MS-DOS through all versions of Windows, and used OS X for 4 years... So I think I at least have some geek credentials to post this.

I mostly stopped playing games, so I don't have much use for Windows. I've preferred to use OS X, but didn't want to keep my Mac. OS X is genius; it really "just" works. And I've spent far less time troubleshooting and resolving issues than I ever have with Windows or Linux. I've been trying REALLY hard to move over to a PC-based 'nix OS for a few years now, but I'm finding it a bit hard.

I think I'm of the age, have the computer knowledge, and have enough desire to switch that I'm a likely target user. You need some (somewhat) geeky people (like me :) ) for now to more readily adopt 'nixes. Depending on what you do, Granny is probably OK to check e-mail with some KDE- or GNOME-based distro. I'm also finding it easier to automate and simplify some daily tasks with the command line (I use a lot of the regex tools sed and awk, and I'm dabbling with Perl and Python - nothing fancy though). The Windows scripting and command-line tools are an utter and confusing mess; I won't touch them with a 10-foot pole. This *alone* has me as an easy convert.
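For what it's worth, the sed/awk-style one-liners mentioned above translate pretty directly into the Python the poster is dabbling with, via the `re` module; the log lines and field names here are made up purely for illustration:

```python
import re

# Hypothetical log lines of the sort you'd otherwise feed to awk or sed.
lines = [
    "2010-11-01 job=backup status=ok elapsed=42",
    "2010-11-02 job=backup status=fail elapsed=7",
    "2010-11-03 job=backup status=ok elapsed=40",
]

# awk-like: pull a numeric field out of each line and sum it.
total = sum(int(re.search(r"elapsed=(\d+)", line).group(1)) for line in lines)

# sed-like: rewrite a field in place with a regex substitution.
rewritten = [re.sub(r"status=fail", "status=FAILED", line) for line in lines]

print(total)          # 89
print(rewritten[1])   # the failed line, with the status rewritten
```

The same field-extraction and substitution habits carry over, which is probably why people comfortable with sed and awk find Perl and Python such an easy next step.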

Here are my beefs over the years which have prevented me from switching. Note that I haven't recently tried to install Slackware, Ubuntu, SUSE or FreeBSD (yes, I've tried a few), so some of this might be fixed now. Some of this might not be technically accurate either, so try to take it as a general overview. I'm not asking how to fix it; rather, these are probably some of the problems people have.

1) Drivers. Some things just don't work right out of the box. I haven't tried X.org in the last year or so, but my ATI card has been a major PITA to get working. I've seen (too) many postings on "How do I get my trackpad working" or get this working. Recompiling the kernel is somewhat challenging if you have to get to that level. Choosing the wrong option or omitting something can FOOBAR the kernel, and you have to Google till you get it right. Every kernel is a moving target.

At times you never get the same result from 2.4.15 to 2.4.16: what was working on .15, for example, might not work on .16 with the same options selected.

2) Too many choices of distros. I fully agree choice tends to be a good thing. But the init scripts, directory structure and system management tools (SUSE, RH, Ubuntu) are all different. On top of that, each app tends to work out of the box on only a few specific distros. If you want it to work on yours, you have to wait till someone puts it in the package manager. This is where Windows and OS X have a definite advantage.

3) When X crashes, or there's some problem with the xinitrc, or you're adding an extra mouse button or pretty font support, it's meant spending some time reading about how to do it. OS X kinda repairs itself, and with Windows, if all else fails you reinstall. If there's a problem with X to begin with, reinstalling just means the same problem will be there after you reinstall. There have been more than a few times when I've just said "Screw that" and gone back to using Windows.

4) There's a bit too much Windows-like emulation in the apps in KDE, GNOME and such. Apple tends to think: well, this is OK, but we should do this, this and this differently. If some of the apps are 'cool' and do things just neatly enough, it might entice people to think Linux is cool and check it out.

5) Partitioning / file management / permissions are difficult. This has gotten better over the years, I think, with the file managers in KDE, GNOME, Xfce and such. I just find that when you do ls -la on / you get a confusing directory structure.

As others have mentioned, Linux is such a configurable system that it can be like Windows if you so choose. That's the point.

Linux/GNU is one of those things (many, as a whole, I guess) that really is a jack-of-all-trades if you understand how to use it that way. It is used in virtually every form of technology these days.

I personally feel that today Linux is right where it needs to be.

I use Linux on the desktop. I have for years (pushing 8 years now). I currently run Gentoo Linux with XFCE4 as my GUI. It just works for everything that I need to use it for. I have it installed this way on two desktops (my wife's, mine) and my MSI Wind netbook. I also have it installed on my Media Center PC running some custom software I've written myself (pending open source release).

I gave up on Windows completely when Vista was released (by that I mean I've stopped supporting family PCs with anything that isn't XP -- which is virtually all of them now).

I run an install of XP under VirtualBox from time to time when I need to do some testing under IE 6 through 8. Although I think it's been a few months since I've done that.

To me Apple is in the same boat as Windows, I just don't want it. I've found what I want on my desktop and it exists here today with very little effort.

On one hand, Linux should remain true to the principles that make unix so powerful in the first place; on the other, if you're that worried about that type of thing, one of the BSDs is probably a better fit for you anyway.

However, unless Linux is user friendly enough (via available add-ons, etc) then it will never get a large enough market share for manufacturers to give a shit enough to release drivers or programming specs.

IMHO, add all the user friendly shit you like. Just ensure that it is up in user-space, where those who don't care for all the Windows-like crap can strip it out. Options are good. Being a good unix-like operating system and having a shiny Windows-like GUI *available* are not mutually exclusive options.

For users who never need/want network transparency in X, etc. (and simply want a free operating system that "just works"), it is just another vector for their machine to be compromised via unforeseen security vulnerabilities in such features. If auto configuration is done right and actually works, you shouldn't NEED to fuck around configuring things manually. Sure, you may lose nerd cred points, but those of us who have been doing that sort of shit for years most likely by now have better things to be doing than rooting around manually making something work.

User/admin time spent configuring something that the computer can and should be able to do automagically is dead, wasted time that does nothing to help anyone get their job done or solve any of the world's problems. Some people (actually most who aren't in the hard core / look at me I'm leet / unixnoob crowd) just want a tool to do a job, and unnecessary time spent rooting around trying to make the tool work is time that could be better spent actually doing something productive.

User friendliness is about being simple, not having more colors or fancy widgets - see Windows Vista as an example.

The way I see it, if Linux were to win in the consumer market, what it needs to do is not more, but less - and do those "less" things 100X better than Apple, Google or Microsoft.

The mess with X is actually being addressed, with project Wayland [freedesktop.org]. The philosophy behind Wayland is exactly simplification - most people don't need that network transparency logic, so re-factor it out and keep the c

It isn't just the distros. It is the desktop environments and all the plumbing underneath trying to shovel in the Fail as fast as they can.

Remove manual configuration. Remove features in general. Allow people who openly hate the UNIX Way to redesign core subsystems, losing important things like network transparency and human readable/understandable settings. Microsoft is ditching the registry because in the end users hated it so much they finally had to listen to them, while we are still chasing those taillights.

People who care about security hate it too. As does anyone trying to fully uninstall an uncooperative program. Things can stay hidden there essentially forever.

Besides, it's a bunch of settings that is completely unorganized, does not exist as a single file anywhere on the hard drive, and is essentially hidden from normal users. It should be hated on principle.

That plus it makes running a backup into black magic. And we have that going for us now as well, only not because of the registry (otherwise known as GConf). Tried to back up a machine that has someone logged in lately? I use rsnapshot, and I've gotta add in special exceptions lest GNOME hose you, because they just have to use features that almost no backup program is going to be expecting to find: files that you can't stat... even as root. Only the owning user can enter that directory; all others lose and go mad.

> Erm, what's wrong with "chmod og-rwx somedir/"? Any decent backup
> program should be able to deal with directories with unfriendly
> permissions.

Root is immune to normal permissions. Thus backup programs running with root privileges assume they may read any file on the system. Taking a complete backup of a filesystem is otherwise impossible unless you go the dump2fs route and manually frob the raw device file. ~/.gvfs doesn't actually need to be backed up, but having to manually exclude it is a PITA and is certain to grow more exceptions over time.
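For illustration, the manual exception being described looks something like this in rsnapshot.conf (rsnapshot requires fields to be TAB-separated; the pattern is passed through to rsync's --exclude):

```
# rsnapshot.conf excerpt -- fields must be separated by TABs, not spaces.
# Exclude the unstat-able GVFS mount point under every home directory.
exclude	.gvfs/
```

This is exactly the kind of special case the poster is complaining about having to maintain by hand.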

The breakage of the UNIX API is in the fact it blows chunks just asking what sort of thing that name is and what its permissions are. As a separate filesystem, my configuration of rsnapshot wouldn't try to back it up anyway, but it gets into trouble just trying to determine that it is a mount point.

> Gconf doing that is very rude, and it should definitely stop. Have you filed a bug?

Don't think it is GConf, but somehow tied into Nautilus's virtual file system feature. It is a FUSE filesystem mounted there. But no, there isn't a point in filing a bug report. It isn't a bug, it's a FEATURE! They are using some capability stuff beyond the normal UNIX API as a security measure. Forbidding root from even stat'ing a file is just evil in my book though. Problem is the GNOMEs know it breaks UNIX semantics and don't care, because they are mostly Windows refugees who were never properly assimilated into UNIX culture enough for them to see the value in it. Filthy Philistines! :)

Same for this Wayland heresy getting started over at Ubuntu. The Computer is the Network, the Network is the Computer. Just words to 'em, merrily breaking X and the idea of network transparency, not because it will perform better but because the ignorant fools don't realize X's network transparency isn't the cause of the performance issues they are trying to solve. But mostly because they probably don't personally use apps remotely and don't even realize that they are tossing one of the greatest ideas in computing history down the shitter.

Again, when you get a large influx of immigrants/refugees it is vitally important to ensure they assimilate BEFORE turning over important design work. That didn't happen because of this insane rush to bring about "The Year of the Linux Desktop." In the end we risk letting these hosers screw things up so badly the plumbing gets so screwed up we lose the server and embedded space as well. Those who refuse to learn UNIX will end up reinventing it... poorly.

> Same for this Wayland heresy getting started over at Ubuntu. The Computer is the Network, the Network is the Computer. Just words to 'em, merrily breaking X and the idea of network transparency, not because it will perform better but because the ignorant fools don't realize X's network transparency isn't the cause of the performance issues they are trying to solve. But mostly because they probably don't personally use apps remotely and don't even realize that they are tossing one of the greatest ideas in computing history down the shitter.

If you're going to throw such statements around, you better get your facts straight. The Wayland developers never blamed the networking protocol for the problems of X, but rather the fundamental architecture of X. In fact, Wayland has been built with networking in mind since nearly the beginning.

I do? The security model makes sense: you have coarse-grained user-oriented controls (like UNIX has) and also fine-grained NT ACL permissions. It's kind of like a file system for keeping small pieces of data.

> As does anyone trying to fully uninstall an uncooperative program. Things can stay hidden there essentially forever.

How is that exclusive to the registry? You can at least search through it all pretty easily. If a program doesn't want to be uninstalled there are better ways to stick around than using the registry.

> Besides, it's a bunch of settings that is completely unorganized, does not exist as a single file anywhere on the hard drive, and is essentially hidden from normal users. It should be hated on principle.

It's in C:\Windows\System32\config\. Yes, it is hidden from normal users, because it should be. If it's unorgan

If you think it's hidden and want access to it you can use regedit, or better yet use PowerShell, and you can navigate the registry like a filesystem:
> ls -Recurse HKLM:\SOFTWARE\Microsoft | where { $_ -match 'Explorer' }

> If you think it's hidden and want access to it you can use regedit, or better yet use powershell, and you can navigate the registry like a filesystem:
> ls -Recurse HKLM:\SOFTWARE\Microsoft | where { $_ -match 'Explorer' }

WTF is this? It seems to spit out an endless tirade of incomprehensible and meaningless shit. For instance:

0 10 FontSmoothing {Type, Text, SPIActionGet, SPIActionSet...}

It's a registry entry called "FontSmoothing", with 0 sub-entries and 10 keys (Type, Text, SPIActionGet, etc).
If you want more info about what PowerShell is returning you pipe the output to get-member, and it'll tell you what properties and methods are available. For example you could add and alter the set of keys returned, or add another where clause to limit your selection to a set of keys you're interested in.

Because it's structured and has a limited number of types you don't need to worry about the various locations or the structure of config files, and can alter and manipulate the returned output.

How is this in any way navigating "the registry like a filesystem?"

Because you navigate the filesystem in a similar way when using powershell, using ls on a registry entry like you would use it on a directory. It really shouldn't be too hard to see the similarity.

I can ls -R /etc | xargs cat and get a completely different pile of incomprehensible shit out of a Linux box, but at least it resembles English.

But neither seem to have any particular use.

If you can't think of a use for it okay, but that doesn't mean it isn't useful.
(By the way, that PowerShell is more equivalent to find /etc/Microsoft | while read f; do grep -q "Explorer" "$f" && echo "$f"; done)

Feh. If you were making a point, I've missed it. Sorry.

You said the registry was hidden on the hard drive and not accessible to normal users. My point was that it isn't hidden and is accessible. HTH

1) In theory. In practice, it's a fucking chaotic mess.
2) No it doesn't. User and system hives live in different files, and then there are a few other hives that are also mounted separately.
3) It's the ABILITY to clear those settings that is the problem. Users don't necessarily need to be exposed to every last setting, but they SHOULD have the ability to wipe all settings related to an application. With the registry, this is nigh impossible.

Your defense of the registry shows how you don't understand application and user behavior. The registry is a foul design decision, and up to XP SP2, was accessible by anything for the worst of reasons. Because of its relationship to the kernel, user space, and hardware, it was ridiculously simple to screw it up, or make it the crux of bad behavior in strange, unusual, and bizarre ways. After XP SP2 when user-space was 'redefined', it continued to be the garbage pail for every bad programming mistake ever made in Windows.

> The registry is a foul design decision, and up to XP SP2, was accessible by anything for the worst of reasons. Because of its relationship to the kernel, user space, and hardware, it was ridiculously simple to screw it up, or make it the crux of bad behavior in strange, unusual, and bizarre ways. After XP SP2 when user-space was 'redefined', it continued to be the garbage pail for every bad programming mistake ever made in Windows.

Obviously there were design features which appealed to Microsoft, since they adopted the registry. No one is disputing that there are technical merits to part of it. The point you're missing is that it creates an unnecessarily complex and obtuse burden for sysadmins, power users, and developers. The evidence for this is overwhelmingly clear and indisputable.

While running everyday tasks with admin privs is a problem, it is certainly not THE problem. For one, an Administrator cannot "do whatever they want with

> Obviously there were design features which appealed to Microsoft since they adopted the registry.

I would also like to point out that the standing Microsoft guidelines are: use config files in appropriate places (%APPDATA% for user-specific stuff, the application directory for shared things which control how the entire app operates), and leave the registry to the OS, as it should be. If you write an application in .NET, and use the stock configuration management classes (or "Settings Designer" in VS, which wraps them and generates strongly typed classes), that's precisely what you'll get, with configs being in

1) Yes, in theory it's an organized system: a centralized repository rather than a distributed clusterfuck of files. In theory replacing the registry with config files is no better IF the developer chooses to put settings in random files all over your disk.

Ah, I still have fond memories of the day some time in the 90s that NT ran scandisk after a reboot, and then put up a message along the lines of 'Ooops, I just deleted your registry. Guess you're fucked, mate'.

And in the traditional unix world there were no 'settings in random files all over your disk'; system-wide files went in /etc and user-specific config in $HOME, all in nice text files that could easily be read, modified and backed up. The registry is an utter abomination in comparison (and Gnome's registry turds are little better).

> 1) Yes in theory its an organized system. A centralized repository rather than a distributed clusterfuck of files. In theory replacing the registry with config files is no better IF the developer chooses to put settings in random files all over your disk.

> Ah, I still have fond memories of the day some time in the 90s that NT ran scandisk after a reboot, and then put up a message along the lines of 'Ooops, I just deleted your registry. Guess you're fucked, mate'.

> And in the traditional unix world there were no 'settings in random files all over your disk'; system-wide files went in /etc and user-specific config in $HOME, all in nice text files that could easily be read, modified and backed up. The registry is an utter abomination in comparison (and the Gnome's registry turds are little better).

The registry is a database file, so why can't it be backed up? The story you describe had little to do with anything fundamentally wrong with the registry and more to do with a bigger problem, data corruption, exacerbated by the gawd-awful decision to run Windows off FAT.

Traditionally, UNIX systems were munted by some admin who made a typo in a text file; that said, it was relatively painless to fix. In fact there are some extremely dangerous commands that are typos of something more innocent. At least a prop

I love Linux... but apps dumping config files willy-nilly in my home is annoying as hell. They all come up with their own half-assed directory convention in my ~/. Sometimes I wish applications were forced to put their config files in specific folders based on user/distro preference, like ~/.config/appname/, so I can retain some sanity in my home folder.
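For what it's worth, what's being wished for here largely exists as the XDG Base Directory convention, which well-behaved apps already follow. A sketch ("myapp" is a hypothetical application name):

```shell
#!/bin/sh
# Sketch of the XDG Base Directory convention the poster is asking for.
# Well-behaved apps write config under $XDG_CONFIG_HOME (defaulting to
# ~/.config) rather than dropping dotfiles straight into $HOME.
# "myapp" and its settings file are invented for illustration.
config_dir="${XDG_CONFIG_HOME:-$HOME/.config}/myapp"
mkdir -p "$config_dir"
printf 'theme=dark\n' > "$config_dir/settings.conf"
cat "$config_dir/settings.conf"
```

Adoption is the real problem: the convention only retains sanity in $HOME if every app honors it.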

The only way I know of to do this would be to have the OS run all apps in a sandbox and map folders for them. ./userconfig and ./globalconfig spring to mind

You ever hear of "last known good configuration"? That is a control set. For more info see here [microsoft.com]. Second, why in the hell would you tell ANYONE to type out a registry key anyway? That is classic "open up Bash and type" thinking, and is no more useful in Windows than trying to teach CMD to someone running OS X. Instead you say "your problem is X? Here, let me send you a reg key." They run it, and voila! Problem solved. I have run into a nasty bug with certain chipsets and the Windows "No Device found" error under sound. How long does it take me to fix? About 30 seconds. They run the "Winsndsrvr" key I send them, reboot, and they are good to go. A hell of a lot easier than telling anyone to type anything, and a hell of a lot less likely they'll fuck something up. BTW, if anyone has trouble with the "No device" sound bug under XP, just email me and I'll send you the key; it works on 2K and 2K3 as well.

As for documentation you can bitch at MSFT about a lot of things, but lack of docs ain't one of them. Go to the MSFT KB site and type in "registry" and you'll find everything from overviews to in depth articles written by Mark Russinovich, pretty much THE guy when it comes to Windows internals. I don't know how many times when I was struggling with Linux I was told dismissively to RTFM only to find TFM was a TODO later.

But ultimately that is the nice thing about today: you have an abundance of choice. Don't like Windows? There are the *BSDs, Unix, Haiku, a bazillion flavors of Linux, OS X, etc. Never before have we had so many options to choose from without having to throw away our hardware and start over. But whether you like it or not, the registry works and it works quite well, and combined with GPOs and AD it makes controlling 50 or 5000 desktops from a central location so simple I could teach my 15 year old to run an AD server inside of a month. Since you mentioned GConf I can assume you are a Linux guy, and I've noticed they rarely like anything that isn't done their way; just see all the screams at replacing X Server with Wayland as an example. If a pile of txt files works for you, hey, I'm damned glad for ya. But we Windows admins actually find the reg quite useful, and the users frankly never see it, just as they never see the CLI.

If you're going to send someone a registry file, you could equivalently send them a shell script for OS X or Linux. On the other hand, it's not always possible to send someone files (e.g. you're providing support over the phone and the user can't get online)...

As for documentation, unix configuration files typically have examples and documentation within the files themselves... The registry offers no such equivalent. Having some documentation right there is extremely useful and saves you a lot of time, and quite often when trying to fix something you may not be able to access the internet, so online documentation isn't terribly useful.

The registry doesn't uniquely facilitate controlling thousands of boxes centrally; there is no reason that text-based configuration files could not be deployed in a similar way.

> I could teach my 15 year old to run an AD server inside of a month.

This is the biggest problem, it may be very simple to manage an AD server in a basic fashion, but the end result is usually horrendously insecure. I have conducted thousands of pentests, and without exception whenever we have tested an active directory domain we have managed to get domain admin privileges (starting with just an ethernet port). You don't want people with only a month worth of experience running your network, you want people with years of experience and a high skill level otherwise you're going to have constant problems.

> You have an abundance of choice.

If only that were true, MS has worked very hard to ensure that there are various things locking people in to windows... There are plenty of people for whom choice doesn't exist. My biggest problem with microsoft is that they try to force you to use their products in this way. If we truly had choice, like we do in virtually all other markets, we would all be far better off.

> 3) all settings files SHOULD be hidden from normal users, be it the registry files, config files or whatever other settings files, if a NORMAL user has need of these to be exposed then the developers have FAILED.

Wrong, or at least I hope to the powers that be that this is wrong.

It is FAR EASIER to open a config file (with comments if it's complicated) and change what I need than to dig through a maze of tabs and menus looking for the magic option I want.

Which is why so much effort was put into the CLI tools in Windows Server 2008.

At the end of the day a GUI is fundamentally limited by the presentation logic, which tends towards a sort of middle ground, and when you have to make configuration changes that go beyond those basic assumptions (as well made as they may be by Microsoft's developers) you suddenly find the utility of the GUI rapidly diminishing.

The fact of the matter is that MS has been in the server game almost 20 years, and it's only in the last three or four that it has recognized just how important easy scriptability of OS fundamentals is (and I don't find VB/JScript with WMI extensions easy by any rational standard). Up until a few years ago, writing a Windows batch file to do something basic like add a formatted system date to a directory name was an eye-poppingly difficult task, whereas in *nix, with sh and its descendants, it's trivially easy.
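The specific example mentioned, appending a formatted date to a directory name, really is a one-liner in sh:

```shell
#!/bin/sh
# The batch-file task described above, done the sh way:
# append today's date to a directory name.
dir="backup-$(date +%Y-%m-%d)"
mkdir -p "$dir"
echo "$dir"
```

Command substitution plus date's format string does in one line what classic cmd.exe batch had to do with fragile %DATE% substring slicing that varied by locale.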

The fact is that Microsoft's long-standing presumption that a well-written GUI was the be-all and end-all of server administration was completely false, and forced an entire generation of people who had to administer to do some wildly complicated things, and their *nix counterparts just looked on in disbelief at the sheer awkwardness of the Windows platform once you hit the C:\ prompt.

A GUI is like public transport

Anyone can use it, and it will take you to the most common of destinations during normal hours with the minimum of fuss and hassle. On the other hand, you might be forced to take a slow, inefficient route, might have to travel at specific times, might have to wait around for the next train/bus, and some places just aren't reachable using public transport at all.

A CLI is like a car

A car will take you anywhere you want to go and at any time, but you have to know how to drive and you have to navigate the route yourself.

Having scripts which are an extension of the same CLI you use for general system management is a huge plus: if you're typing the same commands on a day-to-day basis, then writing scripts becomes extremely simple (at the most basic level you can just copy+paste a series of commands you use), far easier than having to use a dedicated scripting language that doesn't relate to anything else.

Most things can be accomplished on a modern unix system without using the CLI; however, there are very important reasons why people providing assistance recommend the CLI. If you're providing support via a website, having commands which can be cut+pasted is much easier than trying to explain a GUI (following an explanation takes longer than pasting a command, and descriptions of GUI elements are open to interpretation and may not even be visible if the user has a different theme). Similarly, support over the phone is FAR easier via the CLI: assuming the person you're talking to can read and write, all they need to do is type what you tell them and read back the response to you.

This doesn't mean that there isn't a GUI-based alternative to perform the same operation; it's just that the (usually technically competent) people providing assistance to others realise that the CLI is the best method of getting the job done.

The registry isn't bad because it's stored in binary form, or because it's hierarchical, or because it supports transactions, or because it has ACLs. These are good (or at least acceptable) things.

The registry is bad because it's global and forces a lot of configuration to be global as well. For example, COM components are registered globally, so only one DLL can be associated with a class ID at a time. That's why you can only have one version of Internet Explorer installed on the same machine. Yes, users have their own registry subtrees, but not every key can be configured under the user-specific hierarchy. Even a user-specific key can only have one value at a time for a given user. Unix systems, on the other hand, use environment variables to hold (or point to) configuration information, which results in a lot more flexibility.

Because registry values are global, application developers only consider the case of running one program at a time. If you want, say, two copies of Outlook, each with different settings, you'll need two separate users. A lot of programs don't even support multiple concurrent instances, which is maddening.

Another maddening side effect of the registry being global is that it's not possible to have the equivalent of NFS-mounted home directories under Windows. Say you have a domain user foo\bar on machines A and B. It's natural to want them to have the same %USERPROFILE% (read $HOME) on a fileserver somewhere, and on Unix, that works just fine. But under Windows, when the user logs into machine A, the system will lock ntuser.dat (the file containing the registry), which prevents the user logging in under machine B. Application-specific configuration files that are locked only during actual changes don't have this problem.

The global nature of the registry also makes it difficult to maintain application configuration: if you want to isolate the configuration information used by a program, you're essentially reduced to looking at procmon output and seeing what registry keys it touches. While in principle programs should limit themselves to storing information under HKCU\Software\Blah\..., in practice, they scatter stuff all over the registry, especially when they register COM stuff. You can't keep just, say, Word's configuration under version control.

When people say they hate the registry, what they mean is that they hate that Windows is not very well-modularized. Isolating one application's registry configuration is like removing one egg from an omelet.

A better model would have been to have application-specific registries, searched according to a PATH-like environment variable. In this scheme, when the system needed to, say, look up a COM class ID, it would just search each registry in sequence until it found the right one. Applications would simply store their configuration and registration information in their own registry, making management easy.
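As a rough sketch of the scheme being proposed here, with plain key=value files standing in for per-application registries (all file and key names are invented for illustration):

```shell
#!/bin/sh
# Hedged sketch of the PATH-like, per-application registry search
# proposed above. Plain key=value files stand in for registries;
# the app-specific one is searched first, and the first match wins.
printf 'FontSmoothing=2\n' > app.reg
printf 'FontSmoothing=0\nClearType=1\n' > system.reg

lookup() {
    # $1 = key; search the app registry, then the system registry.
    for reg in app.reg system.reg; do
        val=$(grep "^$1=" "$reg" | head -n 1 | cut -d= -f2)
        [ -n "$val" ] && { echo "$val"; return 0; }
    done
    return 1
}

lookup FontSmoothing   # app.reg shadows system.reg: prints 2
lookup ClearType       # only in system.reg: prints 1
```

The point of the scheme is visible even in this toy: deleting app.reg removes every trace of that application's configuration, which is exactly what the global registry makes nigh impossible.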

But like most Windows brain damage, this scheme wouldn't have worked on a 386SX with 4MB of RAM [msdn.com] in 1995, which means it can't possibly be changed in 2010. As we all know, design decisions are irrevocable and eternal (and I'm only half-joking).

Oh, lest I forget: making the registry typed was a bad decision. Plain text is a lot easier to manipulate and a lot more consistent for developers and administrators. Is storing "1" really much worse than storing (DWORD)1? (The former is actually smaller if it's NULL-terminated!)

I really don't think storing simple strings in the registry would have hurt performance much either: the registry is explicitly intended for small, infrequently changing pieces of information. The serialization and unserialization aren

> The registry is bad because it's global and forces a lot of configuration to be global as well. For example, COM components are registered globally, so only one DLL can be associated with a class ID at a time. That's why you can only have one version of Internet Explorer installed on the same machine.

Is the registry really the reason you can only have one version of IE installed?

Firefox uses the registry and I have more than one version of Firefox installed on a machine.

> It's natural to want them to have the same %USERPROFILE% (read $HOME) on a fileserver somewhere, and on Unix, that works just fine. But under Windows, when the user logs into machine A, the system will lock ntuser.dat (the file containing the registry), which prevents the user logging in under machine B. Application-specific configuration files that are locked only during actual changes don't have this problem.

Not to derail your insightful post, but this is one of the main reasons I switched to Linux. You can actually place system folders on different partitions so that 1. fragmentation of cat pictures doesn't slow down the OS, and 2. the OS can be wiped while retaining user data. It used to take me a whole day to force Windows to install like that, where Documents were on one partition, Program Files on another, the pagefile on another, etc. That was several years ago, and now I tried doing some of the s

Actually, yes, that's part of what's wrong with it. For instance, let's say I have a registry problem that's preventing a proper boot of the machine, and a Linux CD. I can boot the machine using my Linux CD, mount the logical disk containing the registry, but then what? I'm limited in my ability to fix the registry because in order to do that I need the tool that the broken registry is preventing me from accessing. By contrast, if I have an alterna

> Do the Linux guys WANT to step up and compete with OSX and Windows or not?

I have been seeing this word used all evening. I do not think it means what you think it means. I think the word you are looking for is copy.

We DO compete. At this point the Linux desktop, warts, GConf, and all, works at least as well as Windows, and, if you don't happen to agree 100% with Steve's Vision of the Way, it works better than Apple's offerings.

> The world has spoken, and editing configs and CLI is a giant DO NOT WANT.

If the price of marketshare is designing a system for idiots, then I don't want those users. I'm NOT an idiot, and a system designed for idiots would slow me down. Seriously. Do me a favor: get a VM up and running and install something that, by virtue of what it IS, must be complicated. Say Squid, for example.

Now I want you to use your favorite text editor (hint: a CLI is not required if you are on the local machine) on /etc/squid.conf. See how it is almost complete in and of itself, practically making external documentation unnecessary: detailed documentation sits right there beside the configuration items which need to be adjusted. And since it is a plain text file, you can put it into a content management system to track changes, which is especially handy if multiple people will be making changes. As a text file it is also about as simple to edit from ten thousand miles away as from the system console.
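One side benefit of that self-documenting style: the commentary can be stripped mechanically to see only the directives actually in effect. A sketch using a throwaway file written in squid.conf's comment style, rather than a real /etc/squid.conf:

```shell
conf=$(mktemp)
cat > "$conf" <<'EOF'
#  TAG: http_port
#       The port on which Squid will listen for HTTP requests.
http_port 3128

#  TAG: cache_mem
#       Memory to be used for in-transit and hot objects.
cache_mem 256 MB
EOF

# Drop comment and blank lines, leaving the effective configuration:
grep -vE '^[[:space:]]*(#|$)' "$conf"
```

The same one-liner works on any config file that uses `#` comments, which is most of /etc.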

So tell me, how would you improve upon that method of managing Squid? Would this be the best way to manage Firefox? No. And Firefox on Linux is configured in almost exactly the same way as it is on Win/Mac, because for Firefox that is the easiest way.

> They want hand-holding; in short, thinking should NEVER be required.

And this is the great divide. What are computers? Interactive televisions for the mindless, or levers for the minds of humans? One paradigm probably can't be extended to perfectly cover both use cases.

More or less true, if a bit on the cranky side. Me, I don't want to compete. I'd much prefer Average Joe stay the hell away from Linux. Mostly because I don't want Windows-think infecting overall Linux design decisions. If they want to fork and do things their windowy way, fine by me but don't screw around with stuff the proper geek distros depend on without forking.

Also because I can now say that I haven't used Windows for two versions and have no idea how to fix your computer. Yes, I know I probably still could, but it makes a fine excuse.

> Do you want your doctor spending his time figuring out which config files he needs to edit, or researching better ways to keep you alive?

I'd rather the Dr. have a skilled admin maintain a stable and secure Linux-based network for his office, and not be hitting patients with the "the computers are down today" crap or "the computers got infected, your information went to Russian gangs, sorry bout that dude." You talk outta yer ass like there is an option of a foolproof computing platform that doesn't need any administration at all. There isn't.

I keep seeing people saying that they're 'seasoned users' who need a 500 page manual to figure out how Linux works, but I installed Ubuntu on my netbook a couple of weeks back and... it... just... worked. Even on my laptop, which is a far more complex system than the netbook, the only things that didn't work out of the box are a few of the special keys (e.g. play/pause).

Has anyone who's complaining about how hard Linux is to use actually tried a distro released after 1993?

> A Mac and PC user could switch computers and within a few minutes either person could get done what they were intending to get done. Not so with Linux.

Yes so with Linux. I admin a lab in a public library. We give 'em Linux with NFS-mounted home directories and none of the locked-down bullcrap Windows setup every other library in the State offers. They figure it out pretty quick. Hint: people who depend on the lab PCs in a public library aren't UNIX geeks. Hell, it wasn't too many years ago that a fair chunk of them couldn't even hold the mouse right. But not long after they get comfortable logging in and out and using the rat, they manage to figure out Mozilla/Firefox, OO.o and the usual application suite.

Yeah, we have had our share of USB pen drive issues from time to time... of course, the other libraries in the State running the Gates Foundation's library model keep the USB ports disabled entirely. Same with CD burning: it works stably these days, which didn't used to be the case, especially if we bought too far down the CD burner food chain. Again, the other sites disconnect the optical drives unless they need to load new software. After all, gotta 'prevent' the spread of malware. Windows IS malware.

That may have been the theme of the article, but I think you're vastly underselling Linux. Now obviously most people aren't as comfortable with Linux as they are with good ol' Windows, but I am sure it's just a matter of perspective.

I am fairly technically competent, so I'm perhaps biased, but I frankly don't see how the rest of the planet stands using Windows any longer. I dual boot Ubuntu and Windows XP on my laptop, and my usage consists of running Linux all day for all tasks, and switching to Windows solely for the rare application that requires it.

Yeah, of course. Windows hardly works at all, and is a pain in the ass to use. Please allow me to share my own anecdote:

By fate, my wife and I were both reinstalling operating systems at the same time. She was installing Windows (Vista or 7; I can't remember) on her laptop, and I was installing Ubuntu on my netbook. Both installations took a while, and both succeeded. Then, that night we wanted to watch a movie. Again by fate, we each had the movie on separate USB thumb drives, so we both popped them in. Yet only one of them worked out of the box.