Debunking The Oatmeal and the perception of Linux as difficult to use | Opensource.com

The Oatmeal made a webcomic that's been reformatted and recently passed around Facebook and other social media. It's titled “How To Fix Any Computer” and pokes fun at Windows, Apple, and Linux each in its own way. And although I love The Oatmeal, this comic’s screed on Linux promotes a myth that needs to be dispelled. The "How to fix Linux" instructions begin:

Learn to code in C++. Recompile the kernel. Build your own microprocessor out of spare silicon you had lying around. Recompile the kernel again. Switch distros. Recompile the kernel again but this time using a CPU powered by refracted light from Saturn.

Nothing could be further from the truth. In my experience as a professional Linux systems administrator, people who compile their own kernels are at least one (and usually two) of the following: kernel developer, avid hobbyist, or doing it wrong.

Part of the problem is that since Linux has always been a favorite operating system for hobbyists and programmers, people assume that you must be a hobbyist or a programmer to use Linux. Combine that with the bad habits and assumptions learned in other operating systems, and you get a comic like the one The Oatmeal wrote.

What really makes Linux challenging for new users are the bad habits they've brought with them from other operating systems. For example, let's say that you went to a retail store and bought a new wireless card. You brought it home, installed it in your system, and—if the system is Windows—you probably put the CD in the drive and tried to figure out from there what to do to install its driver. Did Windows bring the “new device detected” dialog up and ask if you wanted it to look for a driver? Maybe. If it did, should you click “OK” or “Cancel”? Well, that's hard to say—some devices will work one way, some devices work another. What's on that driver CD? Is it just the driver files that Windows' own installation wizard should be pointed to, or is it an executable that does the whole installation for you? Again, who knows? It's also entirely possible that the driver on the CD is too old and outdated to work properly on a modern machine. Maybe after discovering that you can't connect to your WPA2 network, you'll have to go to the manufacturer's site and download a new driver. And so it goes, on and on (and on!) from there. This is considered normal, and nobody likes it, but most people are accustomed to it.

So it's no surprise that the same person with a Linux computer might immediately start looking for a driver on CD, or a driver to download, or something else to make things difficult, where in fact, it should Just Start Working immediately, no installation required. Of course, I picked on wireless cards for a reason. Although most of them are supported out-of-the-box under Linux, they’re still the class of device most likely to not be supported, due to factors like a bewildering array of constantly changing chipsets and regular efforts to lower costs. But, being accustomed to having to do everything the hard way because of other operating systems, a user is likely to make the mistake of delving into 20-page long forum posts with talk of madwifi and encapsulated Windows drivers and blacklisted kernel modules, until ten hours or more deep into the rabbit hole, they cry, “Linux is HARD!”

Whereas the proper response was to realize that although most hardware Just Works with no hassles under Linux, this particular piece didn’t. The solution would be to simply exchange it at the store for something that does.

Users’ own expectation of frustration is generally their worst enemy. Because other operating systems have taught them that everything is frustrating and you just have to deal with it, they consider that experience normal. But it isn't, and it shouldn't be. And that makes Linux the better option—not the more difficult one.

I've been using Linux off and on since 1999, and exclusively since 2006, and I have never, ever recompiled a kernel. I can count on one hand the number of times I've even needed to build something from source.

I am a huge open source advocate and longtime Linux user, but I have to push back on this one a little bit. I will agree fully that Linux *should* be "as easy as Windows" these days. However, in practice that is simply not the case. Some examples:

1) Fedora's NVIDIA support (or lack of it). I had to learn about kmod and akmod, how to disable the system's attempt at installing its own driver so I could install the one from NVIDIA, how to install the kernel headers needed to compile NVIDIA's driver, etc. This drove me crazy enough to abandon Fedora and switch to Ubuntu (and I (claim to) know what I'm doing a little bit).

2) Too many obnoxious errors. Clearly Windows has plenty of obnoxious errors, but significantly fewer. I still can't figure out how to get the default IM client in Ubuntu 11.10 not to make me "click OK to access my password wallet". When I first installed 11.10, the package manager simply would not let me install packages because I was not root. Only after hours of Googling did I find out that I had to install some other polkit-like package to allow the package manager to prompt me for a root password like it should. The list goes on and on.

3) Hardware support. I bought a Canon laser printer, and it turns out Canon does not make drivers for Linux AT ALL (and neither does anyone else for this printer), so I cannot print to it from my Linux (main) machine. Even though I've been a Linux guy for many years, I didn't think to check that before buying it!

4) Software support. There is an open source alternative for almost everything now. Unfortunately, most of them are terrible. I will not touch LibreOffice with a 10-foot pole. There is no reasonable 3D CAD (SolidWorks-style) software. The development IDEs can't touch Visual Studio, etc.

----
The litmus test should be "would I recommend this to my grandpa?" Of course, if grandpa will never touch anything except a web browser, sure; in fact, I'd argue that most people would not even know which operating system they were using as long as you show them where the web browser icon is. However, if you expect grandpa to get frisky and want to upgrade anything besides his RAM, you'd be crazy to recommend Linux.

I'd say it is about 90% there, but the last 10% no one seems to care about fixing, and that is the 10% that people notice.

I'd change "Linux is no longer hard" to "Linux is no longer 5x as hard". In my opinion it is still much much harder than it should be.

I'm assuming you're talking to me since I used the phrase. I don't know what makes someone a "Linux guy" besides using it at home and at work? I think you're confusing a "Microsoft" mentality with a "likes things that work well and are easy to use" mentality. I don't care who makes it - I just want it to be usable and not have any of the glaring problems I mentioned in my original post. That is clearly a lot to ask, but I guess I overestimate the power of millions of programmers that could be working together on this.

Many of the other comments alluded to the stereotypes the Linux community perpetuates, and we are seeing an example of it right here. Apparently you couldn't possibly be a real Linux supporter because you found a few things that Windows does better.

This is just like politics. For the zealots, the 'other' point of view can't possibly have any good ideas. Nope, not possible. Some people cling to the most polarizing views for seemingly no reason, while others simply want something that works and they don't have an agenda.

I don't think you are overestimating the power of millions of programmers, but you are overestimating the ability of certain types of people to look beyond their own pride. Programming has just as much of a human element as anything else and until people learn to let go of the ridiculous points of view that spur comments such as "he's a Microsoft guy" then we will be stuck with what we have. Collaboration requires give and take.

Nvidia is *not* open source, nor is it part of Linux. [Aside: I realize that Nvidia probably makes the highest-performance consumer video hardware, but it wastes ungodly amounts of heat and power, and I avoid buying it for myself. It is reliable enough on a desktop, but all my friends with Nvidia laptops (under both Linux and Windows) have trouble with overheating while watching videos or playing games (maybe more the laptop maker's fault for not leaving enough margin for the heat from Nvidia products). Personally, if it's normally low power and can play SuperTuxKart and DVDs without overheating and without special drivers, I'm good.]

For older Nvidia cards, the built-in Xorg nouveau driver supports them out of the box, and even has sufficient 3D performance for my modest requirements, including gnome-shell. I would use them on a desktop.

But any problems you have with installing proprietary drivers are not a Linux problem. Although it *is* a business problem if said proprietary hardware is mission critical; in that case you are better off running whatever proprietary system they support.

However, Nvidia is more and more a special case. Most hardware makers wrap all their proprietary and patented stuff in a standard interface, e.g. USB, bluetooth, SATA, etc. At some point, there will be a standard OpenGL (or successor) hardware interface, so that the latest graphics cards will not require special drivers.

Sorry, no. I've only used multiple monitors with Intel graphics. Intel 9xx and earlier has a nasty "feature" where if the total pixels of the combined monitors exceeds a hard limit, it will automatically and silently exit 3D mode. Not fun when running gnome-shell and you plug in an external monitor that is too big... But otherwise, multiple monitors work great.

Having fallen foul of a typical Microsoft OEM trap some time in 2004 (I replaced my failed CPU with a unit Fujitsu/Siemens did not approve), I changed to Linux, originally Red Hat, from a book I had discovered.
I've used Ubuntu since a year after its introduction. Two major (to me) issues back in 2004 were a Windows-dependent internet card and a Canon printer. It took great patience (enhanced by my anger at the Microsoft way), but I finally found the reason for the problems and purchased a new card that did not rely on the Microsoft OS to operate. A new HP printer was purchased after a good look at the Red Hat forum of the time (nowhere near as good as the present-day forums).
Since those two simple purchases I have checked for compatibility whenever making more. Not once have I been disappointed, not once.
My last purchase was a lap-top a year ago; it was for my wife, who is in her 70s and had never used a computer in her life. My one stipulation, other than high-end components, was that it came free of "free" Windows, not easy in this silly OEM world of computer manufacturers. As soon as I received one from Novatech (a UK company) I installed Ubuntu 10.04, and everything was up and running in an hour, including its wireless connection to the internet via my local network. It was immediately connected to my printer, by now an HP Photosmart C 8180, and the lap-top had full access to the printer and all of its features. There was no difference between the lap-top and my own machine. Over the past year I've gradually increased my wife's ability, now quite considerable. She is not a computer hobbyist; she just uses it for access to the internet (for her own hobby, which makes great use of the printer), e-mail, and Skype. A friend tried to show her Windows, but she just could not accept the lack of logic in some of its uses. Trained on Linux, she finds Windows unacceptably difficult.
I used computers for much of my working life, many years before the PC, so I am an experienced USER of a computer; I cannot program and have never felt the urge to learn. There have been many changes over the 40 or more years since I first sat in front of a keyboard (punching tape); most of the changes have been accepted readily, some have been resisted mightily, but none resisted with the energy of a possible change from Microsoft.

I grant that back in 2004 I found Linux difficult, but now, definitely not.

1. Mesa is so much stronger now that you don't have to install NVIDIA drivers any more. If NVIDIA does not provide good Linux drivers, go to their forums and complain. My ATI card does not work on Linux; that's because AMD sucks.
2. I do not understand why no one else in the world has that problem in 11.10 and you do. Maybe Mark Shuttleworth sent you a different CD.
3. You have been a Linux guy for so long, yet you remind me of myself when I was using Linux for the first time.
4. The Visual Studio argument does not work, because the world has moved on, and the company that produced Visual Studio is trying to regain its grip on the software world. It shows that if you don't care for everyone, you are going to be wiped out.

I see the 10% this way: the 10% of issues that remain with Linux are never going to be solved, but when you move on to the next level of computing you will not be a slave to that 10%. E.g., if you use Google Docs (say you happen to like it in some way), you will not need LibreOffice or any other desktop client. As for your Canon printer, maybe you will need to print less and less once you have a decent computer that you can carry around and read on just as you would a book, and can prepare documents that people will accept instead of paper ones.

It's time to move on. Even the giants who wrote desktop clients for you want you to move on to the next level. They don't care (I mean no one does) about our present computing pattern, so it's time to move on.

"Learn to code in C++. Recompile the kernel. Build your own microprocessor out of spare silicon you had lying around. Recompile the kernel again. Switch distros. Recompile the kernel again but this time using a CPU powered by refracted light from Saturn."

This is obviously meant to be exaggerated, but stereotypes exist for a reason. It wouldn't be funny if there was no factual basis whatsoever.

I agree with David unequivocally. It is probably 90% there, but the majority of the work is always in the last few percent and that's where actual usability is also found. You can paint a room in 2 hours, but 15 minutes of it will be painting the vast open spaces. The majority of the time is spent doing the trim and joints. This concept applies to everything, including OS development. Linux has unpainted trim and joints.

I use Linux ~10 hours a day at work and it is absolutely essential to getting my job done. However, Linux is by no means the better option for the average user, nor was it ever designed to be. The article is trying to suggest that people apparently develop bad habits by using Windows, but that is just ridiculous. Linux has just as many, if not more, annoying quirks depending on the distro, which is another huge problem. Depending on the distro, you do X to fix problem Y, or maybe you do Z to fix problem Y if situation A arises. In Windows, if it's broken, you only have different versions of Windows to deal with instead of different distros coupled with different kernels and different open source licenses that prevent you from doing certain things. No, I'm not advocating that Windows is "better" because of this. I'm saying the article is so blatantly anti-Windows that it's hard to find an actual point that isn't colored with agenda.

Just yesterday I spent 20 minutes trying to change the format of the desker/pager in FVWM2 even after searching google and finding various suggestions that didn't work at all. I had to edit not one, two, or even three text files, but four different text files to make the desker/pager a different number of rows and columns with custom text instead of "1", "2", etc. Yes, Windows is missing the desker/pager, and it desperately needs to be added, but I assure you if and when it is added, you will right click on the desktop, go to properties, and then click the desker/pager tab and format everything in one place. It's a logical progression that a novice can figure out instead of looking in ~ for hidden files and tracing the path to other hidden files to edit an obscure line of text that has zero based row and column settings. Programmers get that, but 99% of people aren't programmers. Windows is a product for people who want to sit down and not think, just click, and get work done. Linux is nothing like that no matter how badly you want it to be. It's a trade-off between restrictions and advanced use. I am more restricted in Windows because it is trying to protect me from myself, which is absolutely necessary for the average user. I can do more advanced things in Linux because it is a multi-user OS by design and does not employ anywhere near the level of "keep Average Joe from executing rm -rf /*" by accident.
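For anyone curious what that configuration actually looks like, the relevant FVWM2 directives are roughly these. The desk names, dimensions, and font here are illustrative only, and in practice the lines may end up scattered across several included files, which is exactly the complaint above:

```
# In the FVWM2 config (e.g. ~/.fvwm/config or ~/.fvwm2rc)
DeskTopSize 3x2          # pages per desktop: 3 columns, 2 rows
DesktopName 0 Work       # custom labels instead of "0", "1", ...
DesktopName 1 Mail

*FvwmPager: Rows 1       # layout of the pager window itself
*FvwmPager: Columns 2
*FvwmPager: Font 7x13

Module FvwmPager 0 1     # start the pager, showing desks 0 through 1
```

Even collected in one place like this, there is no dialog or properties tab; the zero-based desk numbers and the split between desktop settings and `*FvwmPager:` module options are exactly the kind of thing programmers tolerate and novices don't.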

Unix, at conception, was never intended to be anything other than a simple way to share programming tools. That's why one early version was called PWB (Programmer's Workbench). Windows was developed specifically to sell to the masses. They are fundamentally different tools in almost every sense of the word. Would you argue that a hammer is better than a screwdriver? They are both tools capable of being used by most people, but they serve fundamentally different purposes.

Both OSes can do similar things, but that doesn't mean everyone should treat them as equivalent interfaces to whatever they are trying to do with the computer. I am more than a hobbyist - I am a professional programmer and electrical engineer who uses Linux to make a living. I wouldn't touch Linux with a 20-foot pole (a 2x extension of David's) at home, because I have no time for the nonsense it forces on you. Yes, my Windows machines have their own set of issues, but guess how hard it was to share media for my wife between her phone and Windows Media Center? She turned it on and that was it. I added the laptop to my home network and instantly had access to all of the documents on my Windows Home Server. Huge colorful buttons popped up with instructions my two-year-old could follow to set all of that up. It took less than a minute and has never had an issue since. The list of features I am losing is certainly long compared to a solution in Linux that would give me full control, but that isn't what I wanted. I wanted easy, quick, and stupid-proof. Windows delivers that.

"Part of the problem is that since Linux has always been a favorite operating system for hobbyists and programmers, people assume that you must be a hobbyist or a programmer to use Linux."

That's not true at all. People don't switch because there is no compelling reason to switch. The only people complaining about Windows are the people who have a more advanced understanding of the computer and why a certain behavior annoys them. By definition, these people are programmers and hobbyists. My mom doesn't care about using regular expressions to save 8 seconds formatting an email.

"Whereas the proper response was to realize that although most hardware Just Works with no hassles under Linux, this particular piece didn’t. The solution would be to simply exchange it at the store for something that does."

I'm not sure how you can keep a straight face after saying something like that. You basically shot your whole article in the foot. If it doesn't work on Linux, there is no way to make it work, so you tell people to drive back to the store and try again. The vast majority of people, including those experienced with Linux, would never use it at home if you had to drive to Best Buy or wait three days to get something from Newegg only to have it not work and then drive back or send it back.

EVERYTHING works in Windows or OSX because > 95% of households use it and will continue to do so for a long, long time. If either of those OSes has an issue, there is a solution that does NOT require you to drive back to the store or wait a week for an exchange. That doesn't mean it's always easy to install because sometimes it isn't. But there is no guessing game. Every box in every store has a list of which Windows and OSX versions it works with, and that's the point - you don't have to search for IF it works with them. You have to search for WHICH versions it works on and even that is usually meaningless because forward compatibility is typical.

Also, not everything "Just Works" in Linux. Not even 50% of the hardware in my house has support. Half of it barely works in Windows, and that's with actual support and a phone number to call when I run into an issue. Who do I call when my printer won't install in Linux? No one at Dell is trained to handle that, so if I can't figure it out or take the time to write my own driver, then I have to buy a new piece of hardware. I couldn't get my printer to work in Windows, so after enough emails and phone calls I got support from an actual programmer and they made a bug fix in their software. The only chance you have of getting a bug fix in Linux is if some guy in his mom's basement decides to respond to a help request on a mailing list, and that's assuming you can even find the right mailing list.

The average user would have absolutely no idea how to even initiate that conversation. You should try calling Apple or Microsoft if you have an issue. You play the hand-off game and talk to several people, but you can get your problems addressed. If it's a pressing issue, you can pay and get immediate support from someone who was directly involved in writing the software. I've done it. That option does not exist in Linux, so even if it's an emergency, you are SOL.

We have a group of 30 programmers in my building dedicated to Linux support. We have zero for Windows support because we use the Microsoft support line for those issues. My company is a 50 billion dollar outfit, so it is in our best interest to pay people to sit in a cubicle all day figuring out why things are going wrong in Linux. You can't do that at home, so it's as you said - take it back and try again - which most people aren't willing to do. When Linux doesn't require you to take it back and try again, then maybe it will become more mainstream. Linux is not better. Sorry.

I'd just like to point out to both of the gentlemen giving examples of linux being hard that they're using non-long-term-support releases.

Not being able to use proprietary software that you're accustomed to *is* a perfectly reasonable "this doesn't work for me" issue; but it's hardly a slam against Linux itself being "easy" or "difficult", either one.

Regarding Linux "requiring" you to take [hardware] back - it doesn't, of course, any more than Windows does. You're free to dive down the rabbit-hole making any given piece of equipment with poor driver support work, just as you are in Windows. The difference is the clear delineation between "supported and works" and "unsupported, maybe you can eventually make it work, maybe not."

I never said anything about the release I am using, so you basically fabricated that statement. Not like it matters, anyway, because my original point of every distro having subtle, and not so subtle, differences still stands. I also cited that as a problem with Windows and OSX, but at least those OSes have widespread support for a platform with far fewer variables, which is by design so the average person at least has a chance of figuring it out. If it doesn't "Just Work" in Linux, you are basically hosed if your computer knowledge stops before the level of IT professional.

This article was about Linux being hard at first, and then it turned into Linux being the better choice. I never slammed Linux for not natively having support for my printer. I cited it as a reason that it is not a solution ready for the average person, which was the entire point. You said yourself that the difference is between "supported and works" versus "unsupported and might work eventually." I don't even understand why that is a discussion point - we are talking about an OS ready for the masses. Supported and works is effectively the only choice.

There is no driver support for a huge amount of devices, so that is in fact the entire issue of being required to take hardware back. The article suggested that taking hardware back is a reasonable solution and I was debunking that as not being a reasonable solution for anyone at all, much less a novice.

I feel like you missed the point. No one said Linux sucks or anything close to that. David and I were giving examples of why the article is wrong to come to this conclusion: "And that makes Linux the better option—not the more difficult one."

You said yourself it is much harder than it should be, so obviously you understand. Linux was not made for the masses, nor is it ready to be treated as such.

I was referring to your comment regarding difficulties using Fedora - Fedora vs RHEL being the difference between "bleeding edge" and LTS.

EVERYTHING is much harder than it should be right now in IT, not just Linux, or Windows, or Apple - this is largely because computing, and particularly personal computing, is still a fledgling industry. (You might be initially inclined to scoff at this, but - obligatory car reference here - compare a Studebaker or Packard to a modern car. That's a BEST case age-of-industry comparison; it gets worse if you start from the first PERSONAL computers rather than the first industrial ones.)

"Linux is the better option" is a personal opinion, not a statement of fact, and it was certainly too broad; either Windows or Apple is a better option for a lot of people, and for a wide range of reasons.

I personally don't believe hardware support is really one of them, though; there's plenty of hardware fully supported by Linux and it isn't more expensive than unsupported hardware. When you get right down to it, very little hardware is *supported* by Microsoft, and even less (a bare pittance) by Apple - you're left at the mercies of whatever random vendor happened to put out the hardware, and the experience is frequently a long way from good, much less consistent.

As far as support goes... it's a false dichotomy to say there's little or no support for Linux; in fact you can purchase support contracts from Linux vendors just as you can purchase support from MS or Apple.

Linux will probably never be ready for the masses. In fact, Linus the-man-himself alluded to this.

He was giving a lecture to some computer club (outdoor sunshine in California, I think). Question from the audience about Linux in 50 years. Linus responded with his usual humility. He didn't know. He delivered one passing conjecture with sort of a shrug. Maybe the Linux kernel would simply become documentation. That made some sense to me.

How might one meaningfully separate issues of familiarity and true ease of use?
I've set up computers running GNU/Linux for older individuals with no real tech expertise. They didn't know if, how, or why the machines were different from MS machines. (And there's the added bonus of relative immunity to viruses... Yes, they exist. But really, have you encountered one in the wild, ever?)
This is rehashing some familiar arguments. When it fails in MS, it's acceptable; we're used to it. Whereas when it happens with GNU/Linux, it must mean GNU/Linux isn't ready for the desktop.

I quit using MS at home about 7 years ago. When I use MS now I find several things a bit confusing and annoying. Programs seem to be installed in arbitrary places. Finding software and drivers is a bit of an Easter egg hunt. Many people are used to it, so they don't complain. I find signed repositories of quality software to be much more comforting, reassuring, trustworthy, and preferable.
I've had sound cards that worked out of the box with Linux require a driver search in MS.
I've had MS machines on the LAN that could not / would not work with the network printer (and there was an in house IT MS guy who didn't get it working either).

As to whom one might recommend Linux...
For "grandma" - GNU/Linux - make sure webpages, Java, Adobe, and multimedia plugins work and forget about it.
For "Aunt Jane" -- well, maybe she's more familiar with Windows. You may hear the incessant "Where's Windows Media Player?", "I miss IE", or "This website that requires ActiveX just won't work, what's wrong with this program you put on my computer???"
There are a lot of conflating variables: to whom can one turn? What do one's friends and acquaintances use? Does one cope with commonplace problems that others accept out of hand better than with novel errors that lie outside one's experience?
To meaningfully discuss this, it seems that one would need some sort of metric to differentiate, tally, and rank ease of use factors in contrast to familiarity/acceptance factors.

While the comic is an exaggeration, the author is attacking the exaggeration but, in my opinion, is missing the barb completely, and it is a barb at the entire OSS community, where "./configure; make; sudo make install" is considered a valid and viable installer for everyone. It isn't, and that is the problem.

The assumption that every installation also has all of the tools to build software is not always valid, nor should it be. Sure, there are repositories around to deal with this issue, and they work, but culturally there is a stigma that goes with using Linux and not installing from source, and that validates The Oatmeal's point.

While it is no longer technically required to know C and auto tools to install Linux, there remains a social/community stigma against those that can't or don't.
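To make the contrast concrete, here is a rough sketch of the two install paths. The package name `somepkg` is hypothetical, and the one-line route assumes a Debian/Ubuntu-style repository system:

```shell
# Repository route: one command; dependency resolution, integration
# testing, and future security updates come from the distribution.
sudo apt-get install somepkg

# Source route the comic parodies: requires a compiler toolchain,
# development headers, and manual dependency-chasing.
tar xzf somepkg-1.0.tar.gz
cd somepkg-1.0
./configure
make
sudo make install
```

The stigma described above is about treating the second path as the "real" way even when the first one exists.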

I'm not sure why you think compiling from source is still considered a "viable installer for everyone". Have you even *installed* a Linux system in the last five+ years?

I haven't needed to compile non-proprietary software from source *one* *single* *time* since switching my own workstations from Windows to Ubuntu seven years ago. Similarly, I have not compiled any software whatsoever from source on any of my production Linux servers in the same timeframe.

Not only have I installed Linux in the last 5 years, I've done so repeatedly. There are many packages with which you cannot stay current unless you build from source. Others are hit or miss as to availability and usability when installed from the repo, and others simply don't work. You can be as disingenuous about this as you want, but the vast majority of OSS projects distribute source, which it then falls to repo maintainers to package for the repos. That delay can be months, if the update ever comes at all.

Take a case in point: I spend a lot of time monitoring a popular OSS RDBMS IRC channel. Not a week goes by without an Ubuntu user having problems with a non-source installation. Other distributions have fewer problems but are generally at least one release behind current.

You can argue that this is the exception, but my experience is that it is the norm. I doubt I am unique in this, and this is without bringing up the subject of needing to make good hardware choices based on known-good Linux hardware, versus bleeding-edge hardware that requires a modified kernel or extra build dependencies...

I'm also a little curious as to exactly what you consider "staying current" on packages means - by most definitions, that MEANS "current" with your distribution's repositories. If by "current" you mean "absolutely to the last decimal point with whatever's on the vendor's site" - well, by most standards that isn't "current", that's more like "beta".

The repos aren't just there to compile your code for you; they're there to provide QA (distribution-related and otherwise), integration, and security testing. They're also there to provide security backports specifically *to* older versions of software, in order to provide a more stable environment on which you can reliably build and depend.

I am not being disingenuous - which I'll note here is a synonym for "dishonest", and a pretty offensive accusation to make. I am, I repeat, sitting on a total of zero non-proprietary packages compiled on Debian or Ubuntu Linux, either on workstations or servers, for the last seven years.

Being "behind current", when "current" is defined as "the very newest version available on the vendor's website" is a FEATURE, not a bug.

You should look up the definition of the word disingenuous before you get all hot and bothered by his "accusation", which, by the way, fits you perfectly based on your responses to this article.

Case in point: I don't know why you think you are qualified to make the call about something being an exception or not. His experience mirrors mine very closely, so I guess I'm an exception, too. The real issue is that you have no data to back up your claim, so you expect us to take your word for it that our experiences are somehow unusual. That's pretty much the textbook definition of disingenuous.

dis·in·gen·u·ous/ˌdisinˈjenyo͞oəs/
Adjective:
Not candid or sincere, typically by pretending that one knows less about something than one really does.
Synonyms:
insincere - false - devious - hollow-hearted

As to why I believe I am "qualified to make the call" about something being an exception or not: I'm currently maintaining 7 personal computers and somewhere north of 50 Linux servers, none of which need custom-compiled software (unless you count in-house developed stuff). This includes sites in the Alexa top 5,000 - which I mention only to avoid the inevitable "there's more to the world than your personal blog site" response.

Very, very few people *need* the absolute most bleeding-edge version of a given piece of software... even developers who are building their OWN software using other packages as dependencies. The fact that so many rush for it anyway in no way changes this.

I think Dru hit the nail on the head with why this "rush to new changes" seems to happen. Except in extremely well-oiled teams that actually maintain release branches and apply bug fixes to them (which tends to happen only with very large pieces of software), it is much, much easier for developers to say to users, "just use the new version; all the bugs you are complaining about are fixed there." Another reason is that developers use the latest versions of big packages because that is, again, much, much easier than supporting various versions of a package. So my point here is that the "grandpa" user might only use big software (Firefox, etc.), but as you move only slightly up the spectrum of "power users", the tools and packages they like to use become smaller and more focused, and therefore (not necessarily, but usually) less well written and maintained (in the sense that they don't worry so much about versioning and dependency versioning). These are very likely the same group of people that might visit this site, and who are therefore complaining about your original statements about ease of use :)

"There are many packages that you cannot stay current unless you build from source."

So the fact that you *cannot* "stay current" with the latest source in a proprietary vendor's internal source repositories is a *feature*, but the fact that it requires some work and technical expertise to build the latest source on an open vendor's public source repository is a *flaw*? You have some weird ideas!

I have occasionally built from source, after fixing some bugs that were annoying me - and was very thankful that I had the option! If you aren't technical, you pay someone who is to do that for you if it is worth the money to you (personally or businesswise).

In the Bad Old Days, I had to patch binaries to fix bugs. Ugh! Even with premium support contracts, vendors never seem to get around to your problem - unless you are a giant corporation.

> but the fact that it requires some work and
> technical expertise to build the latest source on
> an open vendor's public source repository is a
> *flaw*?

Well, it's a "flaw" if it's any harder than it needs to be to actually compile source - which, IME, isn't generally the case.

It's a "flaw" for the vendor, not for the distro, if the vendor's *goal* is to directly support end-users, because if the vendor's doing that, then the vendor should really either be maintaining their packages in major distros themselves, or should at least be offering friendly binary packaging (RPMs or .debs) on their own site.

> I have occasionally built from source, after
> fixing some bugs that were annoying me -
> and was very thankful that I had the option!

Options are always good, and I sincerely hope nobody's arguing with that. I've fixed things that bothered me in source also, and then submitted back upstream. Once or twice, my hack-handed fixes were even accepted. =)

I don't mean to imply dishonesty - I mean that you are simply assuming that your normal is the general normal, which I do not believe it is.

You readily admit that you are narrowing your usage parameters in both the original article and in your subsequent replies to commenters. You are using 'long-term support' versions of Linux, as well as being fine with using non-current versions of software.

Look at your 'official' repository for PostgreSQL: what is the current version? Is it current with the latest version from PostgreSQL.com, where it is packaged by PostgreSQL enthusiasts? (It wasn't when I last had Ubuntu installed, which has been about a year; I tend to prefer RHEL.)
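Anyone can sketch this lag check for themselves. A minimal sketch, using GNU sort's version ordering; the two version numbers below are illustrative placeholders, not the actual repo or upstream values at any particular time:

```shell
# Compare an illustrative repo version against an illustrative upstream
# version using version-order sorting. Both numbers are placeholders.
repo_ver="8.4.2"
upstream_ver="9.1.3"

version_lt() {
    # succeeds if $1 sorts strictly before $2 in version order
    [ "$1" != "$2" ] && \
        [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n 1)" = "$1" ]
}

if version_lt "$repo_ver" "$upstream_ver"; then
    echo "repo lags upstream: $repo_ver < $upstream_ver"
fi
```

With the placeholder values above, the check reports that the repo lags upstream.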

As for your contention about QA, I can only assume you view QA as 'older software where the bugs are documented' rather than 'current with different and less documented issues' because I don't see any repo making noticeable changes to OSS core packages in the name of QA. They are simply packaging up code, as it existed at a point in time, or sticking with older code because of a comfort level.

I don't deny that if your usage is narrow in scope, you have no need to stay current with software, and you have good research skills with regard to hardware purchases, then you can live without building anything from source in the Linux world.

On the other hand, if you pop into an IRC channel or mailing list for support and ask the residents for help with something, the first thing you are going to be directed to do, in most cases, is get current. Then they will help, and if required, often make code changes for you, but you will have to be ready to use SVN or Git, ./configure, and make to get it installed.
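For readers who haven't been through that dance, it typically looks something like the sketch below. The repository URL and configure options are placeholders, and the DRY_RUN guard (my addition, not part of any real build) prints each command instead of executing it, so nothing is actually cloned or installed:

```shell
# Hypothetical build-from-source sequence; the URL and configure flags
# are placeholders. DRY_RUN=1 (the default here) prints commands
# instead of running them.
DRY_RUN=${DRY_RUN:-1}

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

run git clone https://example.org/project.git  # grab the current source
run cd project                                 # enter the source tree
run ./configure --prefix=/usr/local            # generate the build config
run make                                       # compile
run sudo make install                          # install outside the package manager
```

That last step is exactly the problem the thread is circling: `make install` drops files your package manager knows nothing about.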

The fact that you haven't been there is amazing, and a testament to the improvements in the situation. Yes, the repos have come a long, long way in the last 5 years, but IMO they still have a ways to go, and yes, I believe that until binary distribution is the accepted norm among developers, not just the distributions, the problem of 'consumer' Linux doesn't go away.

Thanks, I appreciate the clarification... but urge you to read the copied-n-pasted definition above; "disingenuous" is a pretty specific word.

> On the other hand, if you pop into an IRC
> channel or mailing list for support and ask the
> residents for help with something, the first
> thing you are going to be directed to do is get
> current in most cases.

No arguments here! But which channel are you popping into? If it's a channel maintained by the developers of the package in question, of *course* they're going to want you to install their most current version. If it's #debian or #ubuntu, more frequently they're going to want you to apt-get update && apt-get upgrade.

> I don't see any repo making noticeable
> changes to OSS core packages in the name
> of QA.

Sorta depends on what you call "QA", I guess - repos very frequently backport security fixes to older versions of software, and similarly they very frequently make integration-related changes. The distro maintainer also has to properly set dependencies and conflicts, and make patches to account for locations of required files, et cetera.

One question you might want to ask yourself is *why* do the distributions backport security fixes to older versions, rather than just compiling the newest version and tossing it into the repos willy-nilly? It's clearly a lot less work just to compile whatever's on the vendor's website than it is to actually go through and patch older versions... so why don't they do exactly that?

The answer is *to provide a more stable environment* for the people who are using the distros. When you keep introducing new version after new version straight from the vendor (author if you prefer), you're also more likely to introduce new bugs, and to break functionality that software using the package as a dependency might rely on. All of which increases the cost of maintenance and the frustration factor for the mythical "typical user".

This gets pretty frustrating when you are in the mindset "well I want to do the newest possible thing with the newest possible things and I JUST WANT EVERYTHING NEW!", but on the other hand, when you're in the position of maintaining a stable environment yourself - whether it be your family's small network or your business' revenue-producing server farm - you quickly see the benefit of a more organized and predictable release schedule that you can plan around.

Multiple definitions...I marked the important one since you are having trouble.

You are speaking from a false position of knowledge. "North of 50" is not even statistically relevant, nor impressive. Trying to convince people that you have actual statistics or lengthy experience to leverage is disingenuous at best. I think he was being nice because I would have used another word entirely.

This is a question of the definition of a "distro", not statistics. A "distro" is a packaged and integrated product, so that no manual compilation is required. By definition. Even distros that compile during install (e.g. gentoo) do so automatically, so that the end user doesn't do anything extra (it just takes longer). So without any sample data at all, I can state authoritatively, that if you are using a "distro" and are manually compiling stuff from source on a regular basis (or at all) other than while wearing your developer hat, you either need to find a better distro, or are Unclear on the Concept.

You pretty much missed the entire point of my post and seemingly went off on a tangent about something I never even implied. This has nothing to do with the definition of the word distro. It's about the average experience of installing, maintaining, and generally dealing with Linux. That is all-encompassing and is certainly not specific to any one distro or one type of installation. I also never said anything about compiling from source in any of my posts. I love arguing on the internet with people who put words in my mouth and then use them as leverage in their next point. However, even if I had said anything remotely like what you are suggesting, you can't authoritatively say anything, because you have absolutely no idea what I do on the computer, nor do you have any data suggesting you know what the majority of people do on their computers in any OS.

As an aside, why would you capitalize the word Unclear and Concept? You didn't give them any extra special meaning by randomly Capitalizing a Letter.

I will stick with the environment I can relate to best, but let's take for example your RDBMS of choice. You install your distro and start working. You run into a bug and hop in the IRC support channel for said RDBMS. You explain your problem and the response is, "what version are you running?" When you respond, it's the version that is current in the distro repo - say Ubuntu, and PostgreSQL 8.4.2. Pretty much EVERY single time this happens, the first bit of advice will be to update to whatever the current stable version is.

At this point, you are no longer in repo land, you are now in developer land.

Now, don't get your knickers in a twist claiming this never happens. It does, with great regularity, and it helps fuel the "Linux is hard" stigma. The harsh reality is that in other worlds the response is either wait until the next update (e.g., wait for the repo update) or TSOL.

The point is this: the very things that make Linux and open source attractive to many are the very things that fuel the perception of it being hard. Either Linux is different, and hard, or you are using a distro and staying within its walled garden, giving up much of the core value that Linux brings. Arguing that Linux is not hard is fine, so long as you qualify it with 'so long as you stay within the limits of your distro'. But to say that Linux on the whole is not hard, and does not have warts, is tantamount to putting on the proverbial rose-colored glasses and ignoring anything that distracts from what you want to see.

In that world, Ubuntu, RHEL, et al. are no different than OS X, or even Windows. You are bound to what the vendors support, on their timetable, and you may well have to wait until that 'next release' to get your fix.

"You install your distro, and start working. You run into a bug and hop in the IRC support channel for said RDBMS."

Normally, you go to the distro support forums for help with the RDBMS packaged with the distro. If the distro doesn't actually support the RDBMS, and you need support, maybe you should be using a different distro. If you really want support directly from the RDBMS vendor, use a distro they support. They probably also offer paid support for other distros, where they will supply you with a distro repository for their products. Even well packaged proprietary products, like adobe flash and reader, have their own repo.

Thank you for confirming the second half of my post without actually saying anything about it. You are saying, in effect, that the 'distro' is the root of all support for all packages available through its repo.

Welcome to the walled garden that most F/OSS proponents spend so much time and effort trying to break. You have now traded one corporate overlord for another, only now you have not only said that the distro provider is responsible for their work (the core distro) but also for the work of others (the F/OSS packages that they are hosting in their binary repos).

Fine, I'm doing it wrong, I can live with that.

FWIW, I've been maintaining binary installations of OSS packages for a few platforms for about 10 years. It's a pain in the arse to deal with both sides of this equation. It is unreasonable to expect the distro maintainers to be able to support the esoteric issues of every package that they redistribute via their channels.

> You are saying in effect, that the 'Distro' is
> the root of all support for all packages
> available through its repo.

Nobody's said that the distro is the *root* - but they are certainly the first step, and nowhere have you mentioned going through your distro's support at all - you're just skipping straight to the vendor as though there were noplace else to go.

Also, I've been patiently trying to point out that you generally *can* get very current indeed, by the developer's standards, without having to leave your distro's repos entirely - or (in the case of Ubuntu and its PPAs, and Postgres specifically) at the very least without having to leave ALL semblance of a shared, standard codebase.

It's actually MORE difficult to support a user (and in this case, developers count as "users") who's gone off-reservation and built from source, because then you get to wonder WTF they did in terms of compile-time options, possible weird versions of dependencies, etc etc etc - whereas if you know your user has installed a binary package from somewhere standard, you already *know* the answer to all those questions and can proceed to more interesting things immediately.

So, what I'm saying is "doing it right" is:

* install from repos
* if you have a problem, go through your distro's support first
* if your distro's support can't help you, THEN go to the vendor
* check backports and well-known PPAs for packages new enough to satisfy the vendor, if the vendor requires an upgrade
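On a Debian or Ubuntu system, those steps might look something like this sketch. The package name ("postgresql" here) and the PPA are placeholders, and the DRY_RUN guard prints the commands rather than running them, since the real ones need root and network access:

```shell
# Sketch of the "doing it right" steps above for a Debian/Ubuntu box.
# "postgresql" and the PPA name are placeholders; DRY_RUN=1 prints
# commands instead of executing them.
DRY_RUN=${DRY_RUN:-1}
pkg="postgresql"

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

run sudo apt-get update
run sudo apt-get install "$pkg"                 # 1. install from the repos
# 2-3. if it breaks, work through distro support first, then the vendor
run sudo add-apt-repository "ppa:example/$pkg"  # 4. placeholder PPA; vet it first
run sudo apt-get update
run sudo apt-get install "$pkg"                 # now pulls the newer PPA build
```

The point of step 4 is that even the "get current" escape hatch keeps you on binary packages your package manager can track, rather than a hand-rolled `make install`.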

Oh, I understand what you are saying, and while I used PG - which is about as major a package as there is, so it stays relatively current - the example holds true for lots of packages that aren't of that size. Further, you have to know enough to look outside the standard repo to get current with that. Be that as it may, the point you are making is this.

Linux is easy if you stay within the sanctioned and approved packages supplied by the distro vendor, and should you need support, the distro vendor is your first line of support.

So, I ask, how is this better or even different from iOS and its maligned walled garden?

The app store as the repo, apple as the distro vendor, apple as the primary line of support. Oh, wait, apple isn't the first line of support, the developers are.

Honestly, at this point, your own arguments make the walled gardens sound as good or better than the Open platforms. You have established that to be easy, even on Linux, you have to stay within the walled garden, thus validating everything that is usually used to invalidate the closed platforms, as well as the core points in what you are trying to debunk.

Linux is easy, if you don't leverage its freedoms. If you get out of that garden, well, things get dramatically less easy, but let's not acknowledge that, because Linux CAN be easy.


I'm a mercenary systems administrator located in Columbia, SC. My first real hands-on experience with open source software was running Apache on FreeBSD webservers in the late 90s and early 2000s. Since then, I moved on to Samba, BIND, qmail, postfix, and anything and everything else that grabbed my attention.

I currently support Windows, FreeBSD, Debian, and Ubuntu workstations and servers doing just about everything that you can possibly do with any or all of them. RAH said it

