I use xmonad as my window manager, which AFAIK is not installed out of the box on any distro, but on Ubuntu it takes me around two minutes to set up (using my existing dotfiles). I use it mainly for programming but also some gaming, though I never have bleeding-edge hardware.

I figure I could just as well use Debian or Mint; I don't think I would notice any difference as long as the package repository is reasonably stocked. I use Ubuntu because it was my first Linux experience that I used for more than a week, and beca

FFM is not actually fundamentally incompatible with global menus. GNOME 3 works around it by providing an option called 'focus-change-on-pointer-rest'. It works extremely well on a trackpad because you generally lift your finger once the pointer is over a window. With a mouse, it introduces a slight lag because your hand isn't as steady.
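For anyone hunting for those knobs, they live in GSettings; a sketch assuming a GNOME 3 setup with mutter (the schema and key names have moved around between versions, so check your installed schemas):

```shell
# Turn on focus-follows-mouse ('sloppy' keeps the last focus when the
# pointer crosses the desktop background; 'mouse' drops it)
gsettings set org.gnome.desktop.wm.preferences focus-mode 'sloppy'

# Only change focus once the pointer stops moving, which is the
# behaviour described above; the key's location varies by GNOME version
gsettings set org.gnome.mutter focus-change-on-pointer-rest true
```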

Why does this work well with global menus? Because when you use global menus, you throw the pointer to the top of the screen, exploiting Fitts' law.

The reason I use FFM in the first place is that I hate having to aim at a tiny widget, or worry about accidentally clicking a link on a webpage, just to give focus to a window. Having menu bars in windows is an extension of that problem. I would probably care less if mouse motion were actually one-to-one, but it isn't.

Fitts' law is bullshit, however. It's stupid to throw your mouse against the top of the screen to access a menu. It's a lot smarter to reserve a button on the mouse (maybe the fourth or fifth, e.g. to use with your thumb) and have the menu appear right where you are. It's faster and there's less focus switching overall. Basically you get a "menu" button on your mouse, and your eyes get to stay on the part of the screen they were already looking at.

When using a laptop trackpad it makes no sense at all because of how the motion tracking works.

It makes even less sense when using a touch interface, where there is no "throw" action at all. With a touch interface, the controls should be as close to the object they are manipulating as possible so your eyes don't need to move.

I don't like FFM and I don't like global menus. I want a menu to be close to the window it belongs to. Currently, when there's a little shell window on the lower right of my big screen, the associated menu is in the upper left, and I need to make huge mouse swipes to get to the menu and back to the window. Why did they change the way Ubuntu works when everybody was more than happy with the way it worked?

Apart from being unusable for us Focus-Follows-Mouse guys, it's kind of an "I told you so", because I never could fathom why I would need to mouse all the way to the top of the screen to find the menu of a window. It always was user-unfriendly.

I've been mostly fine with the UX of Unity, but it really is a damn laggy and slow desktop, and also buggy as heck. I thought Canonical had the resources to set things straight but the quality assurance is just horrible. The Fedora KDE spin is my current happy place in Linux world.

I've never understood why we can't get the window manager and the application to play nice and share one bar. Usually there's plenty of space horizontally and too little vertically. So, why not have the combination of:

[icon] File Edit View History Bookmarks Tools Help ....... "The window title goes here" ....... _ [] X

If you click the title bar to move the window around, the area you have to hit would be smaller (must avoid menus) and would vary from one application to another due to differing number of menus. I don't know if that's the "official" reason; it's just a hypothesis.

You could still drag on the window title, and if collapsed, drag on the icon. If that's too small for some people, they could always include an option where dragging on the menu bar text would drag the window.

I've been okay with the Dash and the sidebar look of new Ubuntu. It's mostly been the same for me. I switch between different desktops all the time, so I'm not particularly attached to any one or another as long as it doesn't really impede my workflow.
What I hate and still can't get used to is the global menu. I accidentally close so many applications because I don't realize I'm actually focused on another window. It annoys the piss out of me, and it takes away the concept of the window. The window is its own little self-contained world. Menus for that window should be with that window.
I still can't get used to clicking for focus on a window and then dragging my mouse all the way back up to the top of the screen to get its menu. It really only works well for fully maximized applications.

There actually are some good reasons for going with a global menu bar. When developing the original interface for the Mac, Apple studied the various options for menus in depth. What they found is that when the menus are at the top of the screen, they are significantly faster to access, as they have infinite depth, so you do not have to be anywhere near as accurate in your pointing to access them. In effect, you only need to worry about the left-right position of the cursor, as you can just throw it against the top edge of the screen.
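The "infinite depth" point can be made concrete with the usual Shannon formulation of Fitts' law; the constants and pixel figures below are made-up illustrations for the sake of the comparison, not numbers from Apple's study:

```python
import math

def fitts_time(a, b, distance, width):
    """Shannon formulation of Fitts' law: T = a + b * log2(D/W + 1).

    `a` and `b` are device-dependent constants; `distance` is how far
    the pointer travels, `width` the target size along the motion axis.
    """
    return a + b * math.log2(distance / width + 1)

# Made-up constants (a = 0.1 s, b = 0.1 s/bit) purely for illustration.
# A 20 px tall menu inside a window, 800 px away:
in_window = fitts_time(0.1, 0.1, 800, 20)

# A screen-edge menu: the cursor pins against the edge, so the target's
# effective depth is far larger; modelled here as 200 px:
at_edge = fitts_time(0.1, 0.1, 800, 200)

assert at_edge < in_window  # the edge target is quicker to acquire
```

The edge target wins not because it is closer, but because overshooting is impossible, which inflates the effective target width in the log term.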

And yet it is a terrible violation of Fitts' law, especially on large high-res monitors and multi-monitor setups. Not to mention that accessing the menu of a non-focused app requires dragging the mouse over to that window or dock icon to click for focus, then dragging the mouse all the way up to the menu bar, and then back down to the window to resume work. I should install a mouse-odometer app on my Mac and my Linux box just to see how much extra movement Mac OS requires. After years of working with all three major OSes, Mac OS has become one of my least favorites.

I work on dual screen 1920x1200 at work and I use a 27" higher res monitor at home. Mousing to the top left of the screen in MATE doesn't require more than an inch of mouse movement with my settings on any of these machines. (medium sensitivity and a tiny amount of acceleration).

What is your mouse sensitivity set to? I'm genuinely interested rather than trying to troll you. Perhaps the aversion people feel to a global menu is something related to personal mouse sensitivity preferences and those of us who ne

I'm quite certain that Apple made a good decision when the Mac had a teeny tiny black and white screen and every pixel of vertical space was precious and when people were essentially single tasking most of the time. And even when Finder let people multitask, apps jumped to the front in an all or nothing way so it was very clear which one had focus. Delegating the menu to the OS made sense in that context.

It doesn't mean it applies as screen sizes increase, or where users may use two apps side by side, or

Disclaimer: I use Linux every day for work. I use Ubuntu 12.04 LTS. I don't use Unity.

The usability problem with Unity menus is not that they are either local or global; it's the fact that they disappear every time you move your mouse away from them. Please don't make me mouse over the window title to get the menu to appear. While this sounds simple enough, it makes you haltingly mouse over the general area of the menu bar, then wait for the thing to render, then visually locate what you want, then mouse over it and click. In the good old days, one could just mouse over to the precise menu location and click it in a single move.

Unity now provides the user with a choice as to whether they would like their menus broken in a local way or a global way; sadly, the problem still exists. Please stop breaking user interfaces with stupid design!

For the record, I use MATE as my desktop because all this newfangled sausage-finger-friendly crap is simply not a productive place to work.

Global menus are fine on low-resolution, space-constrained screens. For example, a netbook might have such a low resolution that a user appreciates combining the system bar, the app frame and the app menu into one strip instead of three. And chances are, on a screen that size, I probably have one app open and maximized the whole time more often than not.

But the bigger the screen, the stupider a global menu becomes. Users are more likely to have more than one app open at a time, and those apps are more likely to be unmaximized.

That's true. Unity is basically building on what was started with Ubuntu Netbook Remix, which had to work on very constrained screen sizes. They'd be better off making it more dynamic, so it picked an appropriate default based on the screen size but could be fixed to a particular mode by flipping a few toggles in the options.

It still saves vertical space to crunch those bars together: instead of three strips (system, app frame and app menu), it's all combined into one. I actually think it works extremely well on a netbook or similar. The problem is it sucks balls at larger sizes.

GNOME suffers the opposite issue in that it's very wasteful of space even when apps are maximized. It doesn't even do anything with the frame when maximized, even though it probably could. I think the expectation from GNOME is that apps will start using client-side decorations.

Linux users like what they want. That might not conform to whatever your personal preferences are or what's trendy. That doesn't make Linux users "luddites". It makes them something other than mindless drones.

Beyond that, going out of your way to try and copy that other marginal player in the industry is just misguided. You will pretty much ensure that less savvy users are alienated by something that seeks to be annoyingly different for its own sake.

You think Linux users are luddites? We're not even close to that compared to the bulk of the potential users out there.

Trust me, I played for hours and at the time it was fun. Not so much any more.

(emphasis mine; not as OT as it looks.)

I started on Linux in 1994 - Slackware, wasn't it? I had to port nearly everything from SunOS or HP-UX or AIX or Irix, or whatever. It was fun working with it, and fun watching it grow.

But eventually I got tired of constantly having to futz with everything to make it work. It was not unlike having to tune up your car every time you went for a drive around the block, and having to replace the engine and tires every time you wanted to drive across town to see grandma. A

In 2010 Ubuntu was the best shot the Linux community had at getting serious market penetration into desktop and laptop computers. With GNOME 2 it wasn't as pretty as Windows Aero but the user interface was similar enough that the switch was easy for regular home computer users.

Then Canonical switched gears to Unity. The first few releases were very buggy, and even after it became quite stable, the user interface changes annoyed people. So Ubuntu ceased to be the default suggestion for a Linux version to

I'm still using a physical keyboard because it's better than a touch keyboard. The Windows 8 interface was an unnecessary and inconvenient change and yes I know you can do X, Y and Z to make it less annoying but then what was the point of the change? It hasn't improved anyone's experience and just puts extra, undocumented steps in that confuse everyone, even the techies. That goes double for Server 2012 where Metro is a completely unnecessary nuisance.

Yes, because anyone who questions your viewpoint and politics is obviously an ignorant luddite...and liberals wonder why others perceive them as arrogant, totalitarian, histrionic, narcissists. Tolerance and diversity only applies to their own viewpoints and protected castes, I guess.

Global menus work ok for small desktops (1024x768 tops), but with huge desktops that have multiple windows side by side, having to select the window and move the mouse to the top of the screen to use the menu for it is a pain.

Intelligent users like configurations that work for their workflows. When they are obviously changed out just for change's sake, they become irritated. This applies to any platform. Change for change's sake has become a fad in the last 5-6 years, and it's driving people nuts.

How the fuck can people take a discussion about a UI element to fucking politics? It's not even a good political discussion; it's the same stereotyped shit we read every day, where there are only two fucking views on each subject and they are both ludicrously inflexible. I thought for certain no one would fall for the obvious, weak flamebait of the first post, but lo and behold, the discussion has degenerated into things like

Of course, that's why you elect politicians who'd love to stamp out free speech, right? To make it my problem, and make up for the fact your arguments are without merit?

Come on, tell me the truth: you guys disliked the new design and are poisoning the content, too, so as to encourage our transition to a better place, right? Because no one can be that unproductively disruptive unless on purpose.

haha, MY attitude was rude? I suggest you reread your post. I simply responded in kind. Yes, of course, you should always assume people who disagree with you are trolls. That way, in your own mind at least, you won't have to face whatever challenges were brought against your views. While it denies you the chance to learn something new, at least you can stay in that nice self-absorbed realm of petty narcissistic solipsism. You do realize this attitude lives up to that negative liberal stereotype I ment

Ah, and then you one upped yourself with a spineless fairy comment. Dude, take it down a notch. You're being a bit of an ass, and your own biases are showing a bit. You're also displaying the exact behavior that you're accusing them of. After all, only spineless fairies rush to silence criticism of yourself or others, right? I realize this might offend people like you who can't handle criticism, but that's not their problem.

I don't even know why you brought politics into it.... Anyway, what they said wasn

I'm someone who wants change. For example, I think Linux is dated and I would ditch it for Hurd if that thing could ever work reasonably well one day.

I tried Unity, I tried to adapt to it, but it is, for me, a step in the wrong direction. Maybe it's because I'm using three monitors (one 27" and two 22"), maybe it's because I'm using too many programs, working on too many files and doing too many tasks, but Unity doesn't work well for me.

But the thing is I think Linux fragmentation is bad. I think Linux need

You can't have it both ways. If you want one strong leader, you have to go with their decisions whether you like them or not. If Canonical refuses to offer more options (because "they'd be too confusing to regular users"), then either you suck it up and deal with the poor workflow, or you abandon them and their "strong leadership" for a different choice.

"Strong leadership" is what led to Soviet-style socialism, where everyone had maybe five choices of shoes and that was it. Freedom and diversity go hand-in-hand.

Why, Apple has made the perfect UI; how could you not love the best design for DJs and Photoshoppers? There are about 10 things wrong with OS X, and they are all random design crap Jobs picked:
- Global menus
- Single mouse click
- Left window controls (yay for all the left-handed and left-eye-dominant people, boo for the other 95% of the world)
- Launchpad (how is the missing start menu causing a revolt while Launchpad even exists? Launchpad is the initial SIN!)
- Finder layout straight out of System Commander circa 1988

Mac OS has supported multiple mouse buttons for at least 16 years. Even when using the now-extinct one-button mouse, control-click presented a contextual menu.

Left window controls (yay for all the left handed and left eye dominant people, boo for the other 95% of the world)

Because it's easier to move a mouse up and to the left with your right hand, and the Mac was developed in a country that reads left to right.

Launchpad (how is the start menu missing causing a revolt and launchpad even exist? Launchpad is the initial SIN!)

The missing start menu is causing a revolt because Microsoft removed something and replaced it with an abomination. Launchpad, and other questionable features like Dashboard, can be completely ignored.

Finder layout straight out of system commander circa 1988.

Column view in Finder is optional, with icon and list view still available. Also, Finder has had its sorting options greatly improved throughout OS X's history.

Crap loads of docked icons you never use, by default.

If you go and buy a Mac today, this is in the Dock:
- Finder: file management
- Launchpad: access to all apps not in the Dock (and easily ignored, as previously discussed)
- Safari: a web browser
- Mail: email client
- Contacts: an address book
- Calendar: a calendar
- Notes: short notes
- Maps: a map of the entire planet
- Messages: text messaging and IM
- FaceTime: video chat
- Photo Booth: something fun to play with on your new computer
- iPhoto: something to talk to your camera
- Pages: word processing
- Numbers: spreadsheets
- Keynote: presentations
- iTunes: play and purchase music and TV/movies
- iBooks: read and purchase books
- App Store: install and purchase software
- System Preferences: change settings on your computer

The default Dock icons cover managing your computer, using the big two features of the Internet, syncing 'organisational' information with your phone, finding locations, messaging and video chatting with other people, photography, writing, processing numbers, creating presentations, watching media, reading, and installing an app to do anything else you want your computer to do. The default Dock is a slam-dunk for covering what the majority of people use computers for, points users in the right direction to add new capabilities to the computer, and is easily customised to remove the things you don't want. (Launchpad, again...)

The Dock is set up perfectly for you to get started with your computer. Anything else you need to get to can be accessed through either Spotlight (power users) or Launchpad (for people with more experience with iOS).

A separate contact and calendar app....

Just like iOS... but also NeXTSTEP; they have always been separate apps, which makes finding what are ultimately different tasks easier *and* they also seamlessly share the same databases behind the scenes.

General iOS crap

Integration with touchpads is great. Removing always-visible scrollbars removes needless clutter. Things like Launchpad - and pretty much anything else you don't like that reminds you of iOS - are easily disabled or ignored.

Hardwired application dependency locations (the whole point of applicat

Note: it's been a while since I've used OS X for more than some trivial playing with the newer touchpad on a MacBook Air, so I've refrained from commenting on more recent things.

This said, the post a couple above yours was specifically about *older* versions of Mac OS and I think that's still relevant.

Fitts' law indicates that the most quickly accessed targets on any computer display are the four corners of the screen

The problem with the Fitts' law argument is that it only makes sense if your computing experience ends with clicking that menu item.

For instance, if you now have to move the mouse to the window, it's maximally far away from your cursor and not near a screen edge, and Fitts' law says you just made things a kazillion times worse.

And if you want to interact with two windows (eg. copy from one, paste in another, using menus), you've added another step to switch which menu is available. Admittedly, virtually the whole world has figured out the keyboard shortcuts for cut, copy, and paste, since those are some of the most universally useful commands.

This all means that hot corners and hot edges for the mouse should be reserved for the sort of interactions that are fairly universal between apps, and which logically terminate a sequence of actions. For instance, closing an app (debatable because of accidental clicking, but common), switching to another app that's behind the current app, that sort of thing.

Mac OS has supported multiple mouse buttons for at least 16 years.

It was supported but not really seriously encouraged until more recently than 16 years. But yes, it's an out of date argument now. Just...not 16 years out of date.

Left Window Controls

I don't believe either your argument or the GP's. I'm very skeptical that it's "easier" to move up and to the left with your right hand rather than up and right, which is directly away from you rather than going across your body. But frankly, a mouse is not hard enough to use to justify left vs. right in any way. Window control positions are basically arbitrary (so long as they are in a consistent place within the OS, eg. corner of the window as we've all settled on).

Touchpads are not iOS. I can see how they might seem related, but it is a fundamentally different interaction model when you're operating on a device distinct from the screen. Minimizing input delay is not as important, pinching takes on a different aspect, different opportunities exist simply because your hands aren't covering the viewport, etc.. Don't get me wrong -- I think improved touchpad support is great. I just don't think it has all that much to do with "General iOS crap". I guess maybe the fact that people were trained on iOS to perform certain gestures?

Scroll bars are NOT needless clutter. They are a visual cue on the amount of content on the screen vs the amount of content that you can't see. Right now with a quick glance I can see I'm only half way through reading the comments. I can't do that if the bar is hidden, and I'd need to do something like move the page.

I hate this on touchscreens as well, but it's more forgivable there, since any finger touching the screen makes the bars reappear. I can't do that while I'm typing on a keyboard.

I've read question 5 on AskTog [asktog.com] and its answers about global-menu superiority.

I would like to emphasize this:
- I've been using Macintosh, Unix workstations, MS PC (DOS,Win3.1 up to Win8), Linux PC with various WM/Desktop, etc.
- Global menu was fine for me on Macintosh Classic 9-inch display, for any task.
- Global menu is painful and irritating on 24-inch display, for most of the creative tasks.

I suspect that this is not only a matter of how far the cursor travels across the screen, but also of how much you have to adjust your gaze to the area requiring your attention.

Fitts' law fails to address that point: even if you can do things quicker, it might not be as productive if it's uncomfortable and tiring.

Regarding the GUI, Apple has failed on several points with today's huge displays; for instance, it took them years to allow window resizing from any border (instead of a tiny triangle at the bottom right). The feature only came with Lion in 2011... That's a shame.

Application folders are as poorly thought out as Program Files was. Here's why:

- Path inconsistency: crack open a terminal and try to run your app from there. I'm sure environment variables have a length restriction, even if it's really long ;-)
- Lack of security: try to patch all those apps using the same non-core shared libraries; you'll have one hell of a time, as you'll need to either manually copy files or wait on the vendor!
- Disk space wastage: see above! :D
- Power-user hell: let's say you can d

Mac OS has been like this since System 1. And it makes sense; whatever you're doing, its menu is going to be in the same place. Fitts' law indicates that the most quickly accessed targets on any computer display are the four corners of the screen.

First, the corners of the screen are the fastest to access merely because they require less dexterity. And your menu button must actually occupy the most extreme pixel for this to work, and the menu buttons do not. Perhaps they meet the edge of the screen, but I'm not even sure about that (I don't think they do). Second, the paradigm of a universal menu location is a violation of basic psychology. Our brains prefer that related things be grouped together. E.g. if I have two windows open side

If you're on a MacBook, one finger is a "left click" and two fingers is a "right click". Using a keyboard modifier would be odd and uncomfortable IMO. Go to System Preferences -> Trackpad -> Point & Click to enable this if someone has disabled it.

For scroll bars, you can go to System Preferences -> General -> Show scroll bars -> Always if you wish. I have mine set that way too.
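If you'd rather script it, the same preferences can be set with `defaults`; the keys below are from memory and may differ between OS X versions, so treat them as a sketch:

```shell
# Always show scroll bars (System Preferences -> General -> Show scroll bars):
defaults write -g AppleShowScrollBars -string "Always"

# Two-finger secondary click on the built-in trackpad
# (System Preferences -> Trackpad -> Point & Click):
defaults write com.apple.AppleMultitouchTrackpad TrackpadRightClick -bool true
```

The scroll-bar setting takes effect in newly launched apps, not ones already running.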

Uh, ok, like another reply says, how often do you need to look at every application on your system?

Here are a few scenarios that come to mind.
1) Need to run a specific application:
 a) Click the Dash, type a few letters, click the application; or
 b) Click the Dash, click the categories view, filter by category, click the application; or
 c) Find your application and drag it to the Launcher for quick access.

The problem is choice. Users don't mind change, but they want to be able to revert or change things they don't like. Ubuntu's Unity and GNOME 3 are two of the main examples where choice is removed from you, because they "know better" and "it's too hard for normal users". This of course created rage among "advanced" (or simply older) users, even more so when most of the time the only solution is a radical change of distro or desktop environment.

What is good for one guy might not be for the next one; without a proper fallback, t

I always thought Linux users were not afraid of change and welcomed the new. Sometimes I think some Linux users are a bunch of luddites with strong right-wing conservative leanings. Who would have thought.

Goddamn right! I hate to say it, but I refuse to use Ubuntu anymore because every new release seems like a total clusterfuck of half-finished new ideas. I never liked GNOME all that much, but even going back to version 2 would seem like an improvement at this point. Linux devs need to work together and produce a consistent UI, but no, let's instead have different flavors of X and a million different desktop environments. Because that's so much better. Could you imagine how polished the UI would be if yo

Is there any compelling reason for them to "stick" with something? Having the choice is a positive good. Unity's lack of options is what drove me away from it.

Muscle memory. There is nothing more significant to a good user interface than being friendly to developing muscle memory. Everything else is secondary. Once you develop muscle memory, you don't care much what it looks like because you don't look at it. If you can't develop muscle memory, you won't ever enjoy using the device.

That's why the many devices that are pure touch screen driven suck. They demand your constant attention like a mewling infant. The push to add hot spots and gestures and voice to all these touch screen devices is driven by this truth.


THIS! As a person who uses and supports OSX, Windows in various flavors, and Linux, I feel that I can at least make an informed analysis.

I have to do a lot of switching back and forth between various OSes, and trying to develop muscle memory for Windows 8 has proven to be like trying to swim in Jell-O®. Even on a touch-screen laptop, which allows it to "work" better, it is probably worse for power users.

Sounds like the reason why I give up Linux so often. I'm extremely quick with Windows 7 (when I have to be), to the point where I don't have to think - my brain wants an action to be performed, the fingers take over automatically. But when I try to use the latest linux distro or whatever, the lack of familiarity with the interface (even basic shit like Ubuntu moving all the window buttons to the left) throws me out completely.

I had this problem a long time ago. I wanted to learn Linux, but it couldn't replace Windows back then, and I didn't have any real world uses for it. Then came Mac OS X, I bought a Mac, started to use the commandline more and more. Then I changed jobs, which meant changing from Windows servers to Linux, and after one year I ditched my Windows desktop for Ubuntu. Now I'm more familiar with Linux than with Windows.

Changing a user interface is absolutely no different from changing the interface to a class, and the same design principles apply for similar reasons. The Open Closed Principle (OCP) states that a class should be open for extension but closed for modification. User interfaces are no different. Developers should be able to extend them to add new features, but they should never change the existing interface, so that backward compatibility is preserved. The reasons are identical as well: if you don't change it, nobody else has to change in order to keep using it.
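As a rough sketch of what OCP looks like in code (the class and function names here are hypothetical, not from any real toolkit): a new feature arrives as a new implementation of a stable interface, and nothing existing gets edited:

```python
from abc import ABC, abstractmethod

class MenuAction(ABC):
    """The 'closed' part: existing callers depend only on this interface."""

    @abstractmethod
    def run(self) -> str:
        ...

class OpenFile(MenuAction):
    def run(self) -> str:
        return "file opened"

# The 'open' part: a new feature is a new class; neither MenuAction,
# OpenFile, nor execute() below is modified to accommodate it.
class OpenRecent(MenuAction):
    def run(self) -> str:
        return "recent file opened"

def execute(action: MenuAction) -> str:
    # Unchanged caller keeps working as new actions are added.
    return action.run()
```

The analogy to UIs: users are the "callers" of the interface, so adding a new way to launch programs alongside the old one extends it, while replacing the old way modifies it and breaks everyone's existing habits.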

The only valid reason to change an interface is to remove an old one that is no longer needed because the tasks it supported are no longer performed. Clearly, that's not the case here - people still need to search, organize, locate and execute programs. Changing the UI was a completely counterproductive action and never had any way to actually add benefit. Offering an additional UI for people who wanted a new one would have been a perfectly appropriate approach, yet they failed to implement it that way.

Instead, they poorly copied Microsoft's actions with Windows 8 and Metro, which was itself a poorly done copy of iOS's interface, with the added insult of requiring gestures even on a mouse-based machine! Apple themselves then made a shit-poor decision to change the UI for iOS 7. Unity fell somewhere in the middle of this mess, believing that "change is good because Apple and Microsoft were doing it." So they violated the OCP and pissed off as many users as they could. That's an even bigger mistake for them, because Unity users are far less locked into Canonical's choices than a Microsoft or Apple user is.

All in all, Ubuntu has made bad decision after bad decision once they started down the path with Unity. And they don't seem to understand this is a failure at every level; instead, they blame the users for being whiny luddites incapable of dealing with change. They're wrong about that, because I can indeed change, and it looks like Mint or Kali will be my next distro instead of the next version of 'stammering shuttleworth' or whatever childish name they're assigning to it.

Instead, they poorly copied Microsoft's actions with Windows 8 and Metro, which was itself a poorly done copy of iOS's interface, with the added insult of requiring gestures even on a mouse-based machine! Apple themselves then made a shit-poor decision to change the UI for iOS 7. Unity fell somewhere in the middle of this mess, believing that "change is good because Apple and Microsoft were doing it." So they violated the OCP and pissed off as many users as they could. That's an even bigger mistake for them, because Unity users are far less locked into Canonical's choices than a Microsoft or Apple user is.

Actually, Apple is the latecomer to this - iOS 7 came out in 2013. Metro and Unity showed up in 2012.

Apple changed because a growing number of people were complaining that the iOS UI looked "dated" and "static", because it hadn't changed as wildly as Android or didn't look as "fresh" as Metro on Windows Phone. Ditto OS X - people were complaining it looked very similar to the way it had over a decade earlier.

Of course, I hate the new "flatness" that seems to be the trendy thing; I like my faux 3D with shading and depth and texture. I admit iOS perhaps went a bit overboard with the stitched leather and green felt, but I liked the icons and all that.

But I guess that's the breaks. Be like Apple and try to keep things practically the same and after a little while you get accused of ossifying the UI and it looks old, dated, not trendy and ugly. Be like Microsoft and offer fresh and shiny every couple of years and you look cool. Except well, it seems to have come at the cost of functionality.

And then there's Linux where everyone wants to do everything and you end up with hideousness that is Unity.

Don't change the UI and you get accused of ossifying. Change the UI and everyone hates it.

Wish I had mod points today because you're spot on. If someone wants to radically experiment with a new interface paradigm they should fork the existing app and damn well leave the existing one alone.

Especially when it comes to the desktop GUI and the file manager. Give the new shit as an option, but damn well leave the old shit in place as an option too and allow the user to choose which one they want to use.

Coders need to learn to respect users. People use computers as tools and tools should not be arb

The link is to Eric Raymond's "The Luxury of Ignorance: A CUPS Horror Story". It's what I point people to when they deal with Windows 8, the new Ubuntu interface, GNOME 3, and the new Fedora installer. Its follow-up article, "The Luxury of Ignorance: Part Deux", is at:

The main problem with CUPS is not so much its interface, but the fact that it breaks about daily on many Linux boxes, requiring random job deletions, service cups restart and/or printer off/on. It's the only thing in Linux that still causes daily headaches, unlike for instance sleep and wifi which have worked fine for years on my many systems.

It's become much more stable in the last 10 years, in my experience. My colleagues and I have seen issues activating downloadable drivers for Windows clients using various unusual printers, and the client-side selection of single-sided versus double-sided printing has been awkward. But we've normally just set up one queue for single-sided and another for double-sided, and that's worked well. And if a particular print driver isn't published in the available CUPS or other GUI configurations, we've had good