Desktop Linux

Difference engine: Free is too expensive

LINUX, the free operating system that brought professional-grade computing to the lowly PC, has come a long way since doing something as simple as switching off meant performing secret handshakes or offering arcane prayers to the computer gods (eg, “computername ~ # shutdown -h now”). Today, practically all Linux distributions (some 450 are in circulation) hide their stark command lines behind prettified user-interfaces such as Gnome, KDE, Enlightenment or Xfce which mimic the desktop metaphor familiar to a billion Windows users. Should it ever be necessary, shutting down a Linux machine gracefully nowadays involves no more than a few clicks of a mouse.

Your correspondent has been a Linux fan since discovering the charms of Turbolinux, an early Japanese distribution, back in the 1990s. After the tribulations of Windows NT, he was pleasantly surprised by how easily Turbolinux resurrected a geriatric Pentium machine to give it new life as a print server in this newspaper's Tokyo bureau.

Once set up, the Linux box just ran and ran without ever missing a beat. There were none of the viruses and other malware that plagued Windows and even Mac machines to worry about. And, if needed, there was a handy package-management tool for downloading and installing additional software. Later, as developers started to tailor Linux for use on the desktop, your correspondent migrated to Caldera's OpenLinux and then Xandros, Knoppix, openSuSE and Kubuntu, before finally settling on the KDE version of Linux Mint.

The best thing going for Linux Mint has been the way that, while subscribing broadly to the principles of FOSS (free and open-source software), it cheerfully incorporated proprietary drivers, codecs, utilities and plug-ins like Adobe's Flash. The aim has always been to make life easier for users, rather than appease the open-source priesthood.

One criticism your correspondent has, though, is that when upgrading to a later version, Linux Mint requires users to do a complete re-installation, rather than a rolling incremental update. He understands the reason why, but considers it an unwarranted chore. All the more so as new releases come every six months, fast on the heels of the latest version of Canonical's popular Ubuntu (upon which Linux Mint is based; Ubuntu, itself, is based on Debian).

Not that users need to install every new version that comes along. But most of us have grown up believing, perhaps naively, that software tends to improve with development. And so the compulsion is to download a copy of the latest, greatest version from the distributor's website, burn the “ISO” image onto a CD, reboot the computer, answer a few questions, and let the installation whirl away. Meanwhile, numerous additional packages, left off the CD for space reasons, are downloaded in the background. Setting up a modern Linux distribution as a fully fledged working system, with all the applications, drivers and tools you are ever likely to need, can be as easy as that.
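For readers curious what that ritual looks like at a prompt, the sketch below walks through the verify-and-burn steps. The file name, checksum step and device names are illustrative placeholders (a stand-in file substitutes for the real download), not actual Mint locations:

```shell
# 1. Download the image from the distributor's site (a stand-in file is
#    created here so the later steps can be demonstrated).
printf 'stand-in ISO contents' > /tmp/linuxmint.iso

# 2. Check the image against the checksum published alongside the download.
md5sum /tmp/linuxmint.iso

# 3. Burn to CD with wodim, or write to a USB stick with dd. Both are shown
#    commented out because /dev/cdrw and /dev/sdX are placeholder devices.
# wodim dev=/dev/cdrw /tmp/linuxmint.iso
# dd if=/tmp/linuxmint.iso of=/dev/sdX bs=4M
```

After rebooting from the freshly written medium, the installer takes over and asks its few questions, as described above.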

Or, rather, it used to be as easy as that. Linux Mint 6 (“Felicia”) was a dream to install and use. But with successive versions, niggling problems have crept in. Some versions would not recognise a printer, an audio card or a wireless network, requiring tedious workarounds. Lately, video drivers have been another source of complaint. Power-saving features, which work fine on one laptop, refuse to do so on another. Several releases have proved so flaky that it has been easier to delete them and go back to an older, more stable version.

The most recent release, Linux Mint 12 (“Lisa”), has been the most frustrating yet. Your correspondent wasted much of last weekend trying to get it to work on one particular machine that has always been a model of good behaviour. But Linux Mint is not the only offender. Ubuntu has been plagued by even greater woes. It is beginning to look as though this deterioration in software standards is hobbling all Linux distributions destined for the desktop.

Ubuntu's problems seem typical. They stem, at least in part, from the way developers have tried to make desktop versions ever more attractive to non-technical users. Like Apple with its OS X operating system for Macintosh computers, Ubuntu has embraced the “we know best” approach to desktop design, offering users less and less freedom to change the interface's look and feel. In the latest release, Ubuntu users are given either the minimalist Gnome 3 version of the user-interface, or a proprietary iPad-like interface called Unity. Both have driven many a long-time Ubuntu user nuts. Even Linus Torvalds, the father of Linux, has called Gnome 3, in particular, “an unholy mess” and has unceremoniously dumped it.

Meanwhile, the default for Linux Mint 12 is either Gnome 3 or a customised version of Gnome 2 called MATE. Neither has won rave reviews. For the initiated, both Ubuntu and Linux Mint can also be had with the venerable KDE interface. Many view KDE as being less intuitive, but a good deal easier to tweak for individual needs.

That said, even the latest KDE distributions are proving just as annoying to set up as Gnome versions. Your correspondent blames the rapid upgrade cycle for leaving too many features with rough edges, too many wonky drivers and utilities, and too many unchecked regressions (bugs caused by changes) in the kernel. All that Linux developers seem to want to do these days is add cool new features, rather than squish existing bugs and make the software more useable.

The problem is compounded by the way Linux has grown over the years into an ungainly edifice, built upon thousands of individual packages of computer code that have been stapled together. Contrast that with the strict quality assurance imposed by Apple and Google over their Unix- and Linux-like operating systems for tablets and phones. Both rely on just 100 or so tightly integrated core packages that have been carefully scrutinised for regressions and inconsistencies. Compared with Linux, the iOS and Android operating systems are remarkably clean and robust. With the quality of the underlying software a given, it is no surprise that developers have been able to write hundreds of thousands of effective apps for the two platforms.

Meanwhile, reports of Linux's death have been greatly exaggerated. Linux enthusiasts, naturally, continue to see a bright future for the free operating system—pointing to its 1% share of installations (compared with Mac's 7% and Windows' 92%). It should be noted, however, that Linux accounted for around 2.5% of installations a decade ago. And while server editions of Linux continue to gain ground, desktop versions seem to be going nowhere.

To succeed on the desktop, Linux needs to penetrate the office. Unfortunately, there is no such thing as a single Linux to go up against Windows 7. What there is instead is a fragmented field of hundreds of different Linuxes, each with its own learning curve, skill set and maintenance needs. Even the top five distributions (Linux Mint, Ubuntu, Fedora, openSuSE and Debian) cannot offer a big enough user base to attract adequate support.

That is what is wrong with desktop Linux. Hobbyists and enthusiasts may be willing to invest their own time and effort to keep a desktop Linux running. But the corporate world cannot afford such luxuries. In business, the biggest single computing cost is not software licenses, but the salaries of the support staff. And as far as licensing fees are concerned, the biggest single cost by far is not for operating systems but for enterprise applications.

In the circumstances, systems administrators do the rational thing: they install Windows machines on every desk, pay the Microsoft tax, and sleep easy at night, knowing there are plenty of maintenance people to keep their Windows networks running smoothly. Your correspondent, having wasted too much time maintaining Linux on the desktop, is about to do the same. Now let the angry ad hominems from the Linux faithful commence...

As a long-time user of Linux (Ubuntu), I can attest to the writer's point. The keyword indeed is "too many".

Apple and Windows have already proved that the absolute majority of users wish to use their systems to get from point "a" to point "b", and care little about fancy styles.

More choices lead to more "moving parts" which can go wrong.

As pointed out, although there is only one version of the kernel (released by Linus Torvalds), it would be nice if all the top distributions would get together and come up with one version of Linux and one GUI that works.

My company provides IT solutions for professional clients, based 100% on Linux and free software. Microlinux (http://www.microlinux.fr) installs Linux networks in small town halls, public libraries, schools and small companies. The author of the article above may have a decade or more of Linux experience, but I'd say - politely - that he's still clueless.

I don't agree with this article at all either. I'm an IT manager at the "corporate level". All of our more than 100 developers develop on Ubuntu.
1. Yes, it is too bad that there isn't one distro to stand against the world.
2. Ubuntu upgrades are roll-forwards; there is no reason whatsoever to do a complete reinstall.
3. Too expensive, as the title says? Well, if that means hiring individuals who are a bit more talented than the point-and-click low-end Windows admins, yes. I don't mean all Windows admins are pathetic; there are some really talented individuals out there who use PowerShell and a few other modules.
4. My corporation is mainly Linux; we manage a local repository and Puppet servers to manage updates, we customize Ubuntu for our needs, and we are quite successful.
5. Is it easier to roll out Windows? The move to each Windows version is a full reinstall. Seems to me a bit problematic.

- FOSS is an exclusive club, and there are many benefits to being exclusive
- Linux is run at home and in offices by people who have collectively amassed enormous knowledge; just check the IRC channels. MS or Apple support are intellectual dwarfs compared with Linux forums and IRC
- I have contributed my code to a couple of FOSS projects; when was the last time you did that for MS or Apple? You don't give a ... and why would you
- A large number of FOSS initiatives and projects have made it into the mainstream without users even realising their origin
- Freedom to fail or to succeed is a great thing!!! And Linux is messy ... and as Mark Steyn said, "Freedom is always messy, but the alternative is much worse"
LLL - Long Live Linux

That's unfounded rubbish. BSD isn't that much older than Linux so they're both "legacy OS". I think some of the changes made to the kernels over time show significant changes to meet the needs of modern users and get the most out of modern hardware.

I think the point that fupjack was making is that whichever BSD you choose the userland is the same and when tinkering with different servers that is a blessing and `ports` are not an integral part of the OS. For the novice PC-BSD is also surprisingly pleasant.

To accuse the author of being paid by Microsoft is the typical overreaction that the author hinted at in the end of his article. He has explained his history and extolled the virtues of Linux over Windows throughout the article. He has also explained the problems he is now having. To accuse him, in effect, of being too stupid to set up his computer is to confirm the point he is making, and the one Neal Stephenson made in "In the Beginning was the Command Line". The thought that an army of geeks is out there ready to help you solve your problems isn't necessarily reassuring!

I haven't done a Windows install myself for a while but my experience of Windows 7 which I use occasionally is that the user experience has got much better. I'd certainly agree with the assertion that Apple and Google provide the best user experience - things generally just work though I do wish Apple would stop mucking around with the Posix stuff and just provide better ports/brew support.

10 years ago I'd be flaming you, but now I would agree. I recently set up a laptop with Ubuntu and it ran considerably slower than the windoze installation that it replaced. Most of the Linux applications I use now are command-line versions of Linux installed on microprocessors. I also run Linux in VMs on my windoze machines, and also run some Android machines, which are Linux-based.

10 years ago, I'd never have a Linux machine crash unless I did something stupid. Now, with Gnome, the crash rate is as frequent as with Windoze. That said, a command-line-based version of Linux is rock solid, and I've had microprocessors running for years, failing only when the power went down.

One of the things that I do with my computers is never upgrade a working system unless something is broken. I still use the 1988 version of M$ Word on a BasiliskII Mac emulator. This version of Word was the best that they produced and it's been a steady downhill progression ever since. I use Open Office when I don't feel like going through the byzantine steps of getting a document exported from BasiliskII.

There seems to be a widespread tendency to fix things which aren't broken AKA adding unnecessary features. On a modern word processor I don't use 99% of what is available and the only use I find for the built in VB compiler on M$ Word is to hack into supposedly locked down systems via VBA. What is desperately needed is a simplified version of Linux that runs well and is bulletproof. Windoze is the epitome of bloatware and Linux shouldn't be heading in the same direction.

In 1998 I managed a site with 3 proprietary Unix servers and a couple of dozen running Linux (the Redhat variety). The big differences: the proprietary boxes (a) cost money, (b) had a support contract, (c) had manuals explaining all the system messages and (d) worked most of the time. The Linux boxes were free but failed on all the other counts. Linux was a nightmare then and nothing seems to have changed in 14 years - "free" software was always expensive.

Many fine points here, and the issues described are indeed obstacles for the average computer user looking for a no-fuss replacement for Mac's technological tyranny or the constant maintenance nightmare that is MS Windows. Every Linux desktop user is familiar with the manifold ways it can filch a weekend or nullify a piece of hardware.

Of course, there are large, abiding problems with the Mac and Windows platforms, and for any of the above arguments and conclusions to carry any meaning, a thoughtful analysis would be required comparing the average time/money expense associated with set-up and maintenance for each of these operating systems. Having used all of these platforms extensively, I claim Linux is hands-down the easiest and lowest-maintenance platform for professional and personal use.

In any case, complaining about the state of the Linux desktop today is comparable to whinging about the autos Japan was turning out in the 70s -- or Korea in the 80s. Technology evolves toward its maximum usefulness, and since Linux desktop development only began in earnest in the mid-noughties, it has had less than a decade to shake out problems and achieve ease of use (consider the abysmal state of Microsoft technology in the late 90s -- or today, for that matter). It is well publicised that major Linux distros like RedHat/Fedora are rapidly moving toward rolling releases, for example.

In fact, the author is clearly knowledgeable enough about Linux to know that the problems he cites are being addressed and improved upon -- knowledge that should have excluded a conclusion of the Linux desktop's death. It's enough to make a reader suspect a cynical motive on the writer's part, especially as the article comes right on the heels of Red Hat announcing itself a billion-dollar company and The Register describing the remarkable cost savings the city of Munich enjoyed by switching from Windows to Linux. Ah, but calling the author cynical would be to engage in that ad hominem attack he warned of in his last line, wouldn't it? So I won't do it.

I have read The Economist for a long time and consider it a preeminent news provider. However, this poorly researched, poorly backed article, based on a single user's experience, calls into question this magazine's ability to comment on the present state of technology affairs, especially in terms of costs. The article had a ring of the computer magazines of the past, where the Gods of computing appeared and guided the masses to the best systems to use. We are where we are because of these Gods.

Having said that, open-source drivers for proprietary hardware have always been an issue, and hardware vendors delay releasing drivers to the open-source community. As long as one stays within supported hardware, I, a user of Linux on the desktop for over 20 years (started with Yggdrasil and kernel 0.99), have had little trouble.

1. It is free
2. It is easy to install
3. It works on older hardware
4. You are not constantly nagged by update requests from anti-virus and firewall providers, Adobe, the machine's manufacturer, Windows updates... (by the time you finally get to use the Windows machine, you have forgotten what you booted it up for).
5. Easy to use, but it also provides the power to get real jobs done, as opposed to just watching YouTube videos, making Facebook updates or, as the press seem to like to do, tweeting: "Ooh, I'm having porridge for breakfast today."
6. If I use an application (for example Banshee) and I realise there is a bug, or a new bit of functionality that I require, I can either download the source code, make the change myself and recompile (OK, that takes developer knowledge), or I can make a request to the team that writes Banshee to add the function. Try doing that with a commercial Windows application... you will pay for the whole product again.

I never understand how the people who write these articles think. I know Unity can be frustrating if you are used to something else, but is Unity any worse than Microsoft bringing in a ribbon bar for everything?

Choose a desktop and if you don't like it choose a different one. Simple. Choose an application to do the job at hand and if you don't like it choose another one. What is so difficult?

I can do anything I require in Linux; I don't need Windows anymore. I haven't needed Windows for a number of years (10+!!!).

No offense intended, but the article brings to mind the old C.M. Kornbluth story, "The Marching Morons", about a push-button society of the future where the machines have to have all the learning, because the people lack it. I've been using one flavor or another of Linux for 10-15 years now (and systems like DOS and Mac OS before that), and I can say without hesitation that a PC is not a toaster. It's not even a television (although I use it as one sometimes). The difference between a PC and a household or office appliance is flexibility: it can do a lot of different jobs that may not have been in the original spec. My own machine, for example, allows me to write programs, watch TV and DVDs, play games, read email, write text documents (including several novels), keep up with news and technical matters ... the list goes on. But things do go wrong from time to time. Nothing is perfect, and you need to be knowledgeable enough to deal with it, or know someone who is. One program on the Amiga personal computer had a series of "User Stupidity Error" messages, where the blame was placed squarely where it usually belongs: between the chair and the keyboard (the phrase "garbage in, garbage out" springs to mind). That type of error is still prevalent today, complicated by the complexity of modern systems built against impossible deadlines with long, tiring hours and inadequate budgets. Ignorance may indeed be bliss, but it won't help you fix what's broken. Not these days.

Many people criticize Linux systems for depending too much on the old-fashioned command line. Yes, it still exists on modern versions of the operating system, and yes, it's still important. Oh, I don't consider myself a guru with it, but I know enough to save myself a lot of time writing tiny shell scripts (like DOS batch files, only 1000% better). How does that save time? To save and maintain a backup of just those directories I'm interested in, I press a key and type "Backup". If I want to record a TV show for later viewing, I hit a key and type "Capture myshowname.avi". That's it! No fancy GUI, no messing around with menus and dialog boxes. It's all simple and elegant -- provided you know what the heck you're doing! That's the key.
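A tiny "Backup" script of the sort described might look like the sketch below. The directory names are purely illustrative, and `cp -a` stands in for the `rsync` most people would reach for, to keep the example self-contained:

```shell
#!/bin/sh
# backup: copy a short list of directories into a backup location.
# (Illustrative sketch; rsync is the usual tool for incremental mirroring.)

backup() {
    dest="$1"; shift
    mkdir -p "$dest"
    for dir in "$@"; do
        # -a preserves permissions, times and symlinks while copying.
        cp -a "$dir" "$dest/" || echo "backup of $dir failed" >&2
    done
}

# Demo with throwaway paths; substitute your own directories and drive.
mkdir -p /tmp/demo_docs
echo "draft" > /tmp/demo_docs/chapter1.txt
backup /tmp/demo_backup /tmp/demo_docs
```

Bound to a single key or alias, a script like this is exactly the "type one word, done" workflow the comment describes.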

I, too, have moved away from Ubuntu. First I tried Mint Debian, and now I'm on Mint 12 with KDE, and I did it without backing up or moving my documents or personal directories. There's a way to do it, you see, just like there's a way to install a modern version of Linux without hassle or grief. Computers aren't toys. They can be made the basis for toys (modern tablets come to mind), but PCs -- even running Windows -- aren't for the technologically challenged. You wouldn't give one of those fancy table saws like you see on This Old House to someone who didn't know which end of a screwdriver to use, right? A PC is a tool, too. Windows dumbs down the computing experience to make it easier for novices to use. But without some sort of tech support available, that only gets them into trouble. Fine, the saw's turning. Now what?

The author is exactly right. In fact, I kept thinking that I wrote this article myself. In fact, I'm sure I have in so many forum responses. I mourn the death of Linux. But it's just a few geeks that have an interest in Linux these days.

Linux is like herding cats. There's no way anybody can manage all those distros. And when it comes to creating applications, forget it. In fact commercial software vendors did forget it, years ago.

Red Hat dealt a major blow to desktop Linux when it got out of the market. Red Hat was widely recognized by the general public. But when Red Hat left the desktop, Linux exploded into a constellation of distros. That pretty much put desktop Linux to death for business use.

I was once a Linux advocate but the only use I have for it now is to keep my old Pentium III running at home for web browsing. The software I need at work is not available. And the chance it ever will be available continues to fade with every new distro.

There was a time when software vendors talked about creating Linux versions of their software. I haven't heard a word about that in years.

The main reason I migrated from Windows (though almost all of my dozen computers are dual-booted with Windows and Linux) was primarily security. Anti-virus software notwithstanding, I certainly don't regard Windows as a secure system. The free nature of Linux is a nice bonus, but hardly the point. Occasionally I need Windows to run a specific program, but I try to remember to unplug the machine from the internet when I do.

Hilarious that an article that pretends to be written by the "everyman", complaining how hard it is to use Linux -- goes into depth about the Debian heritage of Mint and Ubuntu, his preference for a rolling release distro, and complaints against GNOME 3 and Unity.

It seems more like a Saturday piss-and-moan by a disgruntled technophile. There are good bits, but it's hard to pick them out from the miasma of petty bitching.

No kidding. BSD is using the same desktop environments and it's an unholy mess, because it's a legacy OS. There's much better KDE and Gnome support in Linux.

This article is very biased, because KDE and Gnome are large software compilations, while Windows and OS X are much smaller ones. The title of the article sounds like MS FUD, by the way. I'm using Kubuntu, and compared with Windows 7 I'm far more comfortable with it, and it's faster. You've probably missed the BSOD problems in OS X (yep, OS X!) or the broken applications in Windows after installing newer service packs. I wonder who paid you?