It's been one of my major pet peeves on both Android and iOS: the total and utter lack of consistency. Applications - whether first party or third party - all seem to live on islands, doing their own thing, making their own design choices regarding basic UI interactions, developing their own non-standard buttons and controls. Consistency died five years ago, and nobody seems to care but me.

Although older consoles/computers sometimes blurred the line between computer and console, I think the PlayStation was the first true console that could boot into a non-game GUI for organising files on memory cards and such. Please correct me if I'm wrong - my knowledge of the history of gaming consoles isn't particularly extensive.

Technically it was beaten to market in that respect by both the Sega Saturn (one month earlier in '94) and the 3DO ('93).

Also, completely agreed with respect to mourning the death of consistency in UI.

Thanks, I'd completely forgotten about the CD32 (and come to think of it, the CD-I). Either way though, the point stands that the PSX is far from being the first to offer that functionality, though it may well have been the first to achieve any real market penetration.

But Amiga CDTV and CD32 don't really boot into anything if there's no media, IIRC (only one buddy with CDTV, one with CD32, everybody else had 600, and it's been some time). They just display an intro of sorts, encouraging you to put the CD in (kinda like all the other Amigas & their floppy animation)

Yeah, they can play audio CDs, but I imagine that NEC TurboGrafx-CD (1988) also does that - and anyway, PS1 added nice audio visualisations (then there's still that memory card manager)

But, see, CD32 didn't include a mouse in the retail package (a boot menu to what?)

BTW, all this talk reminded me about one game (most likely also with floppy release at least for 1200, like virtually all CD32 games), a platformer in which a player-controlled dog is tasked with saving his sleepwalking boy master - does it ring a bell WRT its title? (you are into retro after all...)

Well, just a boot menu. It doesn't make much sense on a stock CD32, but a CD32 is really an Amiga 1200. During the loading of some games or demos you could see a flash of the Workbench screen. Hell, some programs even allowed you to exit and would dump you on a WB screen. These were "normal" Amiga programs someone decided to put on a CD.

The CD32 also had one kB(!) of memory for save games, but I'm not sure you could manage it. I wanted to check this on my CD32, but for some reason the video cable is missing.

I totally agree. In fact I often prefer to forgo some functionality rather than install another application that would look, feel and behave alien. The unholy mess of custom UI elements has several times forced me to uninstall an app right after the first run.

In fact I even considered building custom GNOME-based firmware for my phone and tablet, though I'm quite confident that I don't have enough time and knowledge to make it usable.

In the smartphone and tablet age, the application has become the star. The days of yore, where the goal of an application was to disappear and blend in with the rest of the system as much as possible so that users could focus on getting the task done instead of on the application itself, are gone.

Nokia designers managed to nail this problem well in Harmattan. They produced integrated experiences, where the user focuses on the task, rather than on the application. Unfortunately the whole effort was wasted as Nokia management sabotaged their own project.

Metro UI also has such a focus - actually, it can quite safely be described as pushing it much further than the N9 UI.
(Thom doesn't complain about it - he doesn't mention the Windows Phone UI at all in the above article)

But I don't suppose you can give it that, considering you think Harmattan was so dandy that only sabotage explains its demise...

Parameters are non-standardised (most annoying of all, the -h / -? / --help switches). And the way in which switches are chained can vary too (e.g. GNU tools tend to allow single-character switches which can be combined into one parameter, whereas Sun (RIP) preferred full descriptive terms that had to be individually specified).
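The chained-vs-separate difference is easy to see with Python's getopt module, which follows the classic Unix parsing rules (the switch names here are made up for illustration):

```python
import getopt

# "-la" chained GNU/BSD-style expands to the same two switches as
# the spelled-out "-l -a" (switch names are illustrative).
chained, _ = getopt.getopt(["-la"], "la")
separate, _ = getopt.getopt(["-l", "-a"], "la")
print(chained)    # [('-l', ''), ('-a', '')]
print(separate)   # [('-l', ''), ('-a', '')]
```

Tools that follow the full-word Sun style simply reject the clustered form, which is exactly the inconsistency that trips people up when moving between systems.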

You also have some apps abusing STDERR (most notably outputting command usage to STDERR instead of STDOUT).
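A tiny self-contained demo of that annoyance (the toy command here is invented for illustration): usage text written to STDERR simply vanishes from anything capturing STDOUT, the way a shell pipe would.

```python
import subprocess
import sys

# A toy command that prints its usage to stderr instead of stdout.
usage_script = 'import sys; sys.stderr.write("usage: toy [-h]\\n")'

result = subprocess.run([sys.executable, "-c", usage_script],
                        capture_output=True, text=True)
print(repr(result.stdout))  # '' - a pipe on stdout sees nothing
print(repr(result.stderr))  # 'usage: toy [-h]\n'
```

So `toy --help | less` would show an empty page, and the user has to know to merge the streams with `2>&1` first.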

There's also the mismatch between true CLIs, which are optimised for shell scripting, and text UIs (such as ncurses applications) which can't be piped at all.

And don't get me started on incompatibilities that can arise from using the wrong character set or even $TERM.

Going back to the wider problem though - the one Thom raised - I do miss consistency in design elements. But sadly I think it's an inescapable situation. The real problem is that as long as you allow 3rd parties to develop software, those developers will have their own ideas about usability. So you either have to accept that inconsistencies will arise, or write the entire stack in-house. Neither solution is optimal, but such is life.

I agree with all your points about handling and have experienced many of those annoyances myself. Even with the most basic usage, having to alternate between "up,left,right,down" vs "h,j,k,l", or emacs vs vi bindings, can be a mindf*ck (though thankfully I could change that with most applications I've run), though your point about different config 'styles' comes into play then.

I was talking more about appearance though, in that they're all limited to 16/88/256 colors, character map and typeface. With the exception of some apps preferring *color0 over *backgroundColour, I haven't really seen much variation within the same userland/shell, though obviously my experiences (I've never run Sun or NT before) are limited compared to yours.

A large part of those Android or iOS applications are basically "web 3.0" - they are often little more than a custom UI to a single website or feed.
And the majority of what Thom wrote in the Conceptual part applies to web pages. Plus, WRT one other bit in the article...

users can carry over their experience from one application to the next, making the whole experience of using a UI more fluid. Applications should be the means to an end - not the end itself.

Maybe that's not necessarily perceived as a good thing for app makers (or website owners), users not being very used to particular app UI, and able to effortlessly move to others. As far as those who make them are concerned, apps are the end itself.

I think there's often more consistency on the web than there is between different applications in an OS.

Every website is running within the same browser, using the same window controls, keyboard shortcuts, toolbars, menus, mouse gestures and so on. If I right click on a link, or highlight and drag a block of text, or open a file dialog, or perform loads of other interactions, they'll all work pretty consistently across different websites.

Even when it comes to the websites themselves, there are certain design elements that are relatively consistent between similar sites.

Most of the forums and blogs I read have a similar layout and comment system. If I go to one I haven't visited before it's highly unlikely that there's anything new I'll have to learn.

I've been shopping around for some PC components recently. Before even visiting a new online store I can guess where navigation elements will be placed, and how product listings will be laid out. The vast majority follow roughly the same template.

In my experience, websites that go their own way and break conventions with a radically different design have to be very well thought out, otherwise the inconsistency will really annoy users.

Consistency became a casualty almost nobody ever talked about. A dead body we silently buried in the forest.

I mourn the death of consistency. I may be alone in doing so, but every death deserves a tear.

Been watching sparkly vampires again, eh?

-------------------------------------------
PS. No, I actually just recently lambasted Microsoft for completely ignoring UI consistency and design on my Google+, noting that they themselves are one of the worst offenders on the Windows platform. On Linux, if you stick to only GNOME applications or only KDE applications you actually get a whole lot more consistent look and feel, but that obviously does not help people using other OSes.

The thing is that UI consistency cannot really be fixed by random people; the push for consistency must come from the OS/desktop environment/hardware manufacturer/developer. The people in charge just think "let's make our app stand out like a stick in the eye so that people will remember it!" unless there is a real disincentive for doing so - something like e.g. levying an extra tax on Windows Store/App Store/etc. applications that break UI consistency could possibly work. Another approach would be to push such applications off the front pages and to favor applications that stick to the UI consistency guidelines.

I never really paid attention to UI design until I started reading OSNews. Thom, your passion for UI design has inspired me to think more about user experience as a developer, and I thank you for that. I think there are a lot of real-world issues that indirectly affect UI design in addition to what you are describing. I am sure a lot of these issues are inferred from your article, but I would like to put it out there to see what other developers might say.

There are a lot of designers out there... but very few UI designers. I think when companies hire a designer they are looking for someone to make a pretty app, not to provide a good UI experience. So instead of looking for someone who is qualified in UI design, they look for someone who has an art degree, a pretty portfolio, and MAYBE knows something about UI design in general. Nearly all the designers/UI designers I have worked with were never trained specifically in UI design, but in advertising design. Often, too, the designer is simply the person who came up with the idea for the app, or the person with the most say in the design.

I have not done much research, so this is all anecdotal, but it also seems to me that UI design education programs (or students thereof) are very rare relative to the number of apps being marketed. It's very rare to find designers whose sole focus is UI design, 'human interaction design', UX design, etc. Perhaps it's not so much that they're rare as that the ratio of real UI designers to app developers is very low.

Another big fault is that in a lot of development shops, the 'designer' is/was a very experienced developer who is now doing design and/or architecting. Again, in my experience, this has been very black and white: either the 'designer' is very good or not so good. Regardless, it just seems to me that once you have a deep understanding of what goes on behind the curtain, you have an even more challenging time designing the whole show. Your focus is more on engineering the parts of a program than on interacting with a person. Some developers get over this hurdle well, whereas others don't.

Although I have only developed a handful of mobile apps, reading your article made me realize something that I feel every mobile developer probably struggles with. When we are coding our app, we are going back and forth between our code and the app on the device (virtual or physical). That's it. We're not spending time in any other area of the OS worrying about integration. When QA gets it, they spend all of their time inside the app only. When the designer is designing, he/she is looking at a screen of solely that app and nothing else. It's like their canvas. They don't think about how it might look in the gallery, so to speak.

I guess what I am trying to say is when developing an app, the app development process is never thought on the terms of "Ok we are developing an iOS app" and really absorbing that idea. It's more like "Ok we've got this app and we have to get it out on the most popular platform that's out there right now. Today that platform is: [insert OS/web here] and this app needs to be out the door yesterday."

Anyhow, this is all my anecdotal experience/opinion. I appreciate all the designers I've worked with and please know this is just from the point of view of a lowly developer. I have never been in the position of designer and don't know what are all the challenges they must face. I am curious to hear from any other developer. Overall though I really appreciate your passion for UI Design and hope more people pick it up.

I have never been in the position of designer and don't know what are all the challenges they must face. I am curious to hear from any other developer.

Developers are generally the people who write the actual functionality of the software, whereas designers are the people whose job it is to figure out which functionality should be exposed where and how. Some developers make for good designers too, but more often than not coders just don't have the required eye for visual design. That's why there are actual UI guidelines written for e.g. both KDE and GNOME that go to quite extreme lengths to explain how an application should present itself and its functionality in order not to feel out of place on the desktop. Such UI guidelines are what I personally wish every platform would mandate developers to adhere to.

As a developer you might not always have a designer at hand, so there are a few things to keep in mind: separate all -- or at least most -- of your functions into ones that obtain/generate content/data and ones that display said content/data, and then try to imagine flowcharts of how one would accomplish this or that task: if a user is in the 'default' view, what are the steps needed? Can these steps be simplified? Are these steps needed often enough that they should be prominently present in the UI? What if the user is not in the 'default' view? And so on. It may feel boring and tiresome at first if you haven't done it before, but you'll get used to it, and eventually you'll notice you're already doing these flowcharts in your head at the same time as you're writing the code itself.
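As a rough sketch of that separation (function and data names here are invented, not from any real app), the data-producing side and the displaying side stay independent:

```python
# Data side: obtains/generates content and knows nothing about the UI.
def fetch_tasks():
    # Stubbed with a static list; a real app would hit storage or a network.
    return [{"title": "Write report", "done": False},
            {"title": "Review patch", "done": True}]

# Display side: turns the same data into one particular view; swapping
# this function out changes the UI flow without touching the logic above.
def render_task_list(tasks):
    return "\n".join(
        "[{}] {}".format("x" if t["done"] else " ", t["title"])
        for t in tasks)

print(render_task_list(fetch_tasks()))
```

With that split in place, the flowchart exercise becomes a matter of rearranging and adding render functions rather than rewriting the data logic each time.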

Even though it was ugly as Hell, at least NeXTSTEP/OpenStep had a consistent UI for the OS and its apps. GNUstep followed the same model, but few embraced the concept on Linux, instead opting for the "other" environments like KDE and GNOME.

Developing apps for GNUstep is relatively simple since you don't have to reinvent the UI in each new app.

Is consistency really necessary to fluidity and ease of use? I do not think so.

-Because the learning phase is short anyway, the cost is not very high.

-We are still inventing new, original ways to make applications work. Consistency kills a lot of potential inventiveness in UI interaction, especially moving to the rather recent and unexploited touch-based input (remember that trackpads did not have multi-touch before! Two-finger scrolling should have been there earlier, instead of the shitty invisible "zones" on the sides of the trackpad).

-I have a really hard time accepting what is implied: that there is out there a "one-size-fits-all" set of UI guidelines. That seems to me a belief without grounds. Sure, it does look sensible; but is it? Another illustration that the difference between theory and practice is practice.

-There is another fundamental difference between desktop and mobile computing you overlooked: mobile apps are "glanced at" in a hurry, in a time-out, in between real-life activities or in the middle of one. The large differences between apps help me instantly tell which one it is. I am sure I launched the right one, and if I come back to my device, I know which one I am looking at or switching to.

Nevertheless, I see the interest in having some core functions of a device being consistent. That is already the case, as Apple and Google usually develop those applications themselves: email, text, contacts, maps, app management, etc.

But would you recriminate against the fact that an FPS does not have the same controls as a flight sim or an RTS? Those are all games, aren't they? Why should I relearn how to do things? (Do you see the flaw?)

But would you recriminate against the fact that an FPS does not have the same controls as a flight sim or an RTS? Those are all games, aren't they? Why should I relearn how to do things? (Do you see the flaw?)

You just proved my point. Equating applications to games is the very problem! A game is entertainment, an experience - it doesn't help you accomplish a task. The game itself is the goal.

A game in and of itself is an end - not the means to anything.

An application, on the other hand, should be the means to an end - not the end itself.

No, I don't. The fact that a game defines its own goal(s) is irrelevant. And are you sure a game UI "doesn't help you accomplish a task"? You must have played only really bad games :-P

More to the point, the "entertainment" or "experience" relates more to the content of the game than to its UI. You could argue the opposite since the advent of Kinect/Move, but for the vast majority of games it is just keyboard+mouse and really diverse, genre-specific UI and behaviours (right-click gives a different response in each game genre, and even between RTS titles, for example).

Once a task is defined - whatever it is, whoever defines it -, the best way to get it done may not be one "standard" UI, and games are an extreme proof of that.

If you think there is a standard UI which is able to encompass all possible data input and manipulation - or a set rich enough to feel "complete" -, I expect to see a proof of that. I am genuinely interested.

You'd be surprised just how similar the controls of games in the same genre are. Play one FPS, and you can play them all.

As far as actual game GUIs go (menus and such)... I hate how every game has entirely different in-game menus. Some apply when you change a setting, some require you to select an on-screen button to save settings, some require you to press a specific button on the controller. Some have HUD settings in the gameplay menu, some have it in the graphics menu. And so on.

But even then, you can't just equate games with applications - especially not mobile applications. You usually play the same game for longer periods - days, maybe even weeks. Try playing Fallout for a while, then switch to Left 4 Dead. Curse the failed reloads and jumps - because the controls are different.

Now, imagine making this shift not once every week or once every few weeks (as with games), but several times per minute. Mobile applications are in-out-in-out, very rapidly. You don't spend a lot of time on each.

But would you recriminate against the fact that an FPS does not have the same controls as a flight sim or an RTS? Those are all games, aren't they? Why should I relearn how to do things? (Do you see the flaw?)

But hold on, "games" is way too broad a category. Different game styles do different things, and so different controls are a necessity. FPSs and flight sims are as different as word processors and calculators. Take FPSs though - they tend to all have similar controls and even displays. You can go from one game to another by a different publisher and just start playing, no need to re-learn everything. WASD and the mouse and you're sorted...

One of the first things I do with a friend's shiny new HDTV is turn off all the glaring edge enhancement, saturation and contrast that obliterate image quality. It's just there to make the TV stand out in a line, not to give you a good picture. Similar to inconsistent UIs.

A key part of UI consistency for traditional operating systems is the key-binding schema. Windows, Linux, and cross-platform GUI applications tend to use CUA keybindings (IBM's Common User Access, familiar from the likes of MS-DOS EDIT.COM), and there's a long *nix history of using either vi or emacs keybindings for everything, down to the very shell (set -o vi and set -o emacs).
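One way an application can stay usable for both camps (a sketch with invented names; the key tables are abbreviated) is to map either binding scheme onto the same internal actions, the way readline does:

```python
# Both schemes resolve to the same actions, so the binding choice is a
# user preference, not a redesign. \x02/\x0e/\x10/\x06 are Ctrl-B/N/P/F,
# the classic emacs motions.
KEYMAPS = {
    "vi":    {"h": "left", "j": "down", "k": "up", "l": "right"},
    "emacs": {"\x02": "left", "\x0e": "down", "\x10": "up", "\x06": "right"},
}

def resolve(key, mode="vi"):
    # Translate a raw keypress into an editor action, or None if unbound.
    return KEYMAPS[mode].get(key)

print(resolve("j"))              # down
print(resolve("\x10", "emacs"))  # up
```

Keeping the action layer separate from the key layer is what lets `set -o vi` flip an entire shell's behaviour with one command.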

I've never been one to think consistency was all that matters to usability.

Was WinAmp consistent with the windows UI? Nope, but it was very usable and a very popular music player.

Usability is very important. But usability is not dependent on consistency.

Usability is driven from many areas, one of which is consistency.
Others are:
Domain. Winamp looks like a traditional media player. Ebook readers can emulate paper to turn pages...

Tradition. You might want to comply with a new paradigm, but people are still used to the old one. It might be more usable to switch over gradually.

Application Model. Sometimes the usability of an application goes beyond the standard model. Perhaps to fit complexity. Perhaps to experiment with new UI paradigms. Kind of like the Ribbon in MS applications, or complex applications like AutoCAD...

Was WinAmp consistent with the windows UI? Nope, but it was very usable and a very popular music player.

It didn't really cease to be popular. On http://store.steampowered.com/hwsurvey/ (sure, certainly not the most accurate, but it gives some idea) iTunes has 30%, Winamp 20% - quite comparable.
Plus it appears to be delineated geographically (which most likely also influences the Steam ranking) - iTunes dominant in some places, Winamp in others (from what I see, it is still widely used in the former Warsaw Pact for example)

And yeah, one can argue Winamp was consistent - with the UI of typical CD player; and afterwards with itself.

I know that probably 90% of all Android apps that do not follow the standard guidelines for UI design are either lazy ports (like Instagram), poorly thought-out UIs, or simply didn't consider the user experience in the first place. But on the other side of the spectrum we have something like Windows Phone, which is very consistent but barely customizable. A Windows Phone is identical to every other Windows Phone on the market.
I'd rather pay the price of inconsistency for the benefit of being able to think of new ways to display information and interact with the user.

Besides the fact that mobile UIs with touch (compared to stylus) opened the doorway for people to experiment, I have always assumed that the UI inconsistency is due to the way the app was developed.

There are rapid application SDKs out there that allow a seasoned HTML or PHP developer, with little knowledge of UI discipline, to whip up an Android or an iOS app.

Great article, it articulated something I intuitively realized for a while but couldn't quite put my finger on. Consistency in UI is important for ease of use and minimizing learning. Too bad so few understand that anymore. I think it's partially because of the nature of the App Store, and partially because so many young developers want to write an app to get rich before they've matured in computer science (they have no sense you actually have to learn something before becoming the next famous hacker lionized by the media).

Amen to that article. I really don't want office suites and image editors to feature gesture-sensitive fluffy bunnies and animated paperclips, no matter how pretty that looks, I want something that works well, is easy to understand, and gets the job done. The problem, I guess, is that this is incompatible with the way software is developed nowadays.

In the mobile world, developers are treated like crap. They have to sell their work for a ridiculously low price, lose a third of that meager income to the OS vendor, and drown their hard work in the invisible depths of a unique vendor-controlled repository, from which it can be banned at any moment. In these conditions, no one wants to, nor is able to, develop quality software, and there is a shortage of good developers who have more interesting stuff to work on elsewhere.

Or it is possible, but the cost outweighs the benefit. Let's say you're Adobe, and you want your software to be consistent with your own other software, so that your users have fewer issues. You will invest a bit more to be consistent with the OS, but you cannot guarantee that all software will look the same. Adobe will want to keep building on its current frameworks, so when it adds a new product to its line-up it simply has to add the new functionality - it cannot extend it to all OS/platform combinations.
In the end I think consistency is killed by companies' wish to differentiate themselves, and this is both bad and good. Imagine cars all looking and behaving the same - why not improve usability, even at the expense of consistency?
In the end, consistency can only be achieved if the framework doesn't let you change your application too much. Firefox or LibreOffice don't look consistent in GNOME, but who would spend a year of their life reviewing all the dialogs to make sure they behave the same? And some things that were once inconsistent became the standard altogether, like "tabs on top", which all browsers use today.

In the end I think consistency is killed by companies' wish to differentiate themselves, and this is both bad and good. Imagine cars all looking and behaving the same - why not improve usability, even at the expense of consistency?

Cars don't help your argument... they do, in fact, behave virtually the same - the control scheme is very standardised (pretty much optimal; and attempts at "improvement" - the side-swinging ~joystick at the front of the central column, for example - didn't really work out; maybe that one will come with autonomous cars).

They also look pretty much the same - differences aren't as large as marketers want us to believe: I think that, when we look at the past cars, our minds see them primarily as "20s-30s cars", "50s-60s cars", "70s-80s cars" or such; collectively.

I used to think that having a consistent UI was important, but now I am not so sure. Everything we use in the physical world has a UI specifically designed for its purpose. Do you think that all door handles should look the same no matter what house or building they are on? Do you think we should use door handles on cupboards, refrigerators, and cars too?

Our physical world is all about "apps" (i.e. discrete packages of functionality). There is very little consistency between many things we interact with on a daily basis. Why should we expect it to be any different on a "general purpose" computing device? Computers are used for so much more now than they ever were that I don't think we could ever come up with a consistent experience that wouldn't suck.

OS developers have no idea what apps people will be writing. There is no way we can expect them to come up with a UI that will work for everything. We need people to be able to challenge UI assumptions so that we can continue to develop and evolve these ideas. If we all just leave it up to the OS developer to decide how we should interact with the device, I think we have a good chance of hindering some really cool advances in human/computer interaction.

Everything we use in the physical world has a UI specifically designed for its purpose. Do you think that all door handles should look the same no matter what house or building they are on? Do you think we should use door handles on cupboards, refrigerators, and cars too?

Poor analogy. Door handles actually tend to be very consistent within the set they belong to, i.e. it would look extremely silly if every cupboard door et al. in your kitchen had differently coloured, differently placed and differently shaped handles, wouldn't it? Similarly, in a house the doors that belong to a certain set, e.g. full-size doors meant to allow or disallow passage for humans, almost invariably share a similar set of handles unless necessitated by outside requirements/restrictions.

In other words, there is actually a lot of consistency there, and you just basically shot your own argument down. Consistency does *not* mean that everything must look and feel the same even if it makes it harder for the user to accomplish a goal or a step towards a goal.

I think you missed my point. The article happened to spend a lot of time talking about how things look. To quote it here:

"Almost every application does things just a little bit differently, has elements in just a slightly different place, and looks just a bit different from everything else."

My point was that real-world UI can be very different for similar actions. Cupboard doors and car doors are very different from a UI standpoint, yet they essentially do the same thing - open a door. In some cases we twist a handle, in other cases we pull up on a lever, and yet in others we just pull on a knob. There are doors of all sizes on all kinds of things, and many of them can behave very differently. We don't expect everything in the real world to behave the same way - that would be ridiculous - so why do we expect that from our applications?

My point was that real-world UI can be very different for similar actions. Cupboard doors and car doors are very different from a UI standpoint, yet they essentially do the same thing - open a door. [...] There are doors of all sizes on all kinds of things, and many of them can behave very differently. We don't expect everything in the real world to behave the same way, that would be ridiculous

We do expect everything in the real world to behave the same way. All kinds of doors are very similar from UI standpoint, behave very alike. Real world UI is very similar for similar actions. There is very high consistency between many things we interact with on a daily basis.

BTW, we describe the rules and similarities in which physical objects work in real world under the physics umbrella term... (and while "common sense" physics is flawed, as evidenced for example by many silly ideas before Newtonian mechanics came along, it is close enough)

OTOH, computer displays don't have such limitations, and it often shows. Come on, look at the currently-trademark interactions on capacitive touchscreens, that of swiping things while barely exerting any pressure by your finger, enlarging them with two-finger-gesture, or grabbing and moving objects causing them to become "transparent" without much concern for any ~barriers in their path - virtually nothing works like that in the real world (yet of course we embraced those, we like them; but many other - not really)

Do you think that all door handles, should look the same no matter what house or building they are on?
Consistency is more about function than about looks. All door handles work the same: you push them down and they open the door.

I once was at an airport with round door handles on the bathroom stalls. The locking mechanism was also combined into the door handle. I couldn't figure out how to lock or even make the door stay closed so I had to take a shit while keeping the door shut with my hand.

I would have highly appreciated that door handle and lock being consistent with other door handles and locks.

This philosophy of putting consistency on a giant pedestal reminds me of the strict, non-expressive philosophy of 60s modernist graphic design. They used Helvetica for everything, because Helvetica is neutral, and believed that type should never be expressive because the meaning is in the content. Today many like the style, but very few share the philosophy that design should not be expressive.

Today graphic design is expressive. You can look at a poster and guess the content based on the typeface, colors, texture, etc. I think we are seeing a similar development in the world of UI design. Having the notes app actually sort of look like handwritten notes gives people visual cues to what the application is and makes it easy to understand. I would not be surprised if such visual cues make the app disappear more than a purely consistent app would. These apps are only different visually; in behavior they are often very consistent. You also have apps that take things much further, such as Convertbot, Clear or Paper. I think breaking UI conventions is completely acceptable if it makes the experience better. Personally I find Paper to be far more invisible than the other, more consistent sketching apps for iPad.

I think there is more than enough room for both philosophies (and everything between). Vote with your wallet and buy the apps that work well for you.

Both Android and iOS have certain requirements that nearly ALL applications abide by - the button on iOS that moves back to the main screen, and the 4 buttons (back, home, menu, and search) on Android. These all provide a very basic consistency on the platform.

The rest of the application is as applications have always been - fluid, integrated in ways that make sense for the application, and more.

I haven't gone over iOS programming models yet, but Android's basic development model puts consistency in place in how things interact. Sure, an application's Activity interface may differ significantly - but look at applications on OS X, Windows, and Linux; it's no different there no matter how far back in time you look.

Consistency is not necessarily about how something looks, but how it interacts with the user. And in nearly all cases for iOS and Android that I have seen that consistency is there - from both first-party (Google, Apple) and third-party applications; from newbie developers to entrenched development houses.

We've reached a point where it's entirely acceptable to reduce functionality just to make an application look good.

Such a depressing statement... made even more so because it's true. Just look at Firefox as they try to rip off all of Chrome's worst designs, which seem to have taken Apple's design ideas to the extreme.

I have to admit that there was one area where I didn't care so much about consistency: audio players. Winamp, Sonique, QCD, QMP... they all looked nice, resembled a traditional hardware audio player in a way, and were fully functional. Still, those like Foobar2000 are nice too in their own way, with their traditional UIs that fit in with the rest of the OS. Software audio players these days, though... where they get rid of the stop and pause buttons, leaving you with only one big button that toggles between the two based on the playback state, and a back and forward button beside it... I'd better shut up; I'm starting to feel like I have to puke just thinking about it.

I totally agree. Browsers are over the top (and that's not even speaking of the content they display).
This is why I value SeaMonkey so much: traditional, proven interface design with a powerful engine. And I remember when Firefox was supposed to be more lightweight than the Mozilla suite. Now it's almost worse...

I agree good UIs are fewer and further between because of the lack of consistency. I submit, however, that the cause may be generational in nature.

First, you should note that "different" can cause confusion which can lead to frustration, and perhaps, eventually anger. Consistency is a mechanism that minimizes unnecessary differences, and hence, confusion and frustration, and perhaps even anger. That said....

As we age we become more and more creatures of habit. When confronted with new devices [and interfaces] we tend to expect to trigger "common" functionality in ways consistent with our prior experiences. Unnecessary differences confuse and frustrate us.

What is interesting to me is that the younger generation seems to adapt to newness better than most old folk, perhaps because they are not encumbered with a lot of prior experiences. And because today's younger designers do not feel the confusion and frustration, consistency is not a major consideration in their designs.

But don't give up. Have hope. The consistency pendulum will swing back as this younger generation gains experience, gets confused, and feels the frustration. Our challenge is to live long enough to see the resurgence. One can only hope.

I can only agree. User experience nowadays is terrible.
I don't care about nor follow the tablet/smartphone market very much, but I see its influence in how desktop applications get ruined. The Macintosh gets iPhone-ized with every new release. Some apps get absolutely horrible pieces of interface just to "mimic" other touch devices. I'm thinking of Skype, for example! Or the way iTunes and the Mac Finder developed.
But there is something even worse: web applications. The "mighty cloud" makes for terrible applications with inconsistent user interfaces and poor element design. And they even need to change from release to release so that the application feels "fresh".
Gone is the time of beautiful NeXT applications and the clean Macintosh Human Interface Guidelines...

That's why today I just work on GNUstep. Of course other users will try it and toss it: it doesn't have those brushed-metal windows Apple has, or other useless flashy effects.

Bauhaus design taught that Form follows Function. The user interface is part of the "form". But today the looks must come first, and the function must somehow be "fit in".

Don't over-dramatise... user experience nowadays enables for tons of people a quite comfortable use of their ~computers - much more so than was the case in the past. Many (most?) even like the new ~smartphone models of interaction, new paths and experiments, which will likely help many more in efficient usage of ~computers.

OTOH, you hold on to a GUI basically from the beginning of GUIs. It's a virtual certainty that it's far from optimal (and it didn't see very much uptake or strict cloning).
Movies or music are always better in the old times...

okay, so android has a back button that works IN EVERYTHING but does different things in different places! WHAT! and it has a menu at the top and sometimes at the bottom AND THE ICON IS DOTS, WHAT DO DOTS MEAN?????? and long pressing for options is an ingenious use of a touch screen BUT IT USUALLY SUCKS and did I mention you arent allowed to close or minimize or pause apps, it all happens automatically BECAUSE WE ARE RETARDED? NO, stop it, stop it, we arent going there right now, we arent going to the bad place

Mr. Holwerda,
Why didn't you consider the fact that the screen MEDIUM FORMAT for handhelds is NEW, as is the way we interact with handhelds? IMHO, the reason developers go for custom UI elements is basically that this is a new format for displaying information and interactivity, so the way it's used hasn't yet settled into conformity the way the 'big screen' UIs of Windows and MacOS have! But the main experience of constructing a usable UI comes from those exact 'Big Screen' devices.
The rules of interpreting and understanding what you see and do on a small handheld device operated with your fingers or a stylus are rather different from 'usual' computing devices, meaning these rules haven't even found a best practice yet. What UI framework aesthetic works for email apps doesn't necessarily work for a space simulation game, etc.! So developers are busy exploring the boundaries of what is possible, practical and beautiful on these types of devices!
Give it time, and there will be a standardization of UI! I have faith...
You could call it Format-specific UI evolution!

In your ideal world an OS maker would come up with guidelines and all developers need to follow these? What happens if the OS maker has bad guidelines or stops innovating?

UI is organic. Consistency to a certain degree is very useful (especially behaving like the user expects - feed-forward, if you will) but too much of it will stall innovation. Good things come from experimentation.
If people don't like it, the app in question will not sell and therefore die a horrible death.
Also I think a UI should be consistent with itself, not too much across the board of an OS. I like different scrollbars when I'm editing a video, I like big buttons when I'm 75 years old, etc. Consistency stands in the way of giving the app personality.

I bet people even find an app easier if it's inconsistent with another one. You quickly recognize how you should behave and act because the app doesn't blend in with other apps.
People get used to 'inconsistent GUIs' very quickly - a good thing.

I kind of agree, in that on a desktop platform I always valued consistency, and Windows was a pain to use because of the lack of it. That's why I enjoy the mac and formerly linux much more.

However, as usual, you tend towards hyperbole.

the biggest change in UI design caused by the iPhone, and later Android, is that 'consistency' lost its status as one of the main pillars of proper UI design.

This isn't backed up. Both platforms have extensive UI guidelines. Many things about them are more consistent than on a PC. For example, both provide a much more extensive and standard set of classes and widgets. While Windows is a hodgepodge of dozens of toolkits, there is only one major standard on iOS and Android. So 95% of the table views you'll see are the standard platform ones. Same with 95% of the toolbars, buttons, toggle switches, etc, etc, etc.

If some website reports on a new application for iOS or Android, the first few comments will almost inevitably be about how the application looks - not about if the application works.

If an application has gone out of its way to look good, then people will comment on it. But that doesn't give the app a free pass if it doesn't work. The thing is that the apps where people have made the effort to make it look good are also the ones that tend to have the attention paid to working well.

We've reached a point where it's entirely acceptable to reduce functionality just to make an application look good.

Name one example where an application removed functionality in exchange for looks and demonstrate how users thought that was acceptable. I don't believe you.

We give more accolades to a developer who designs a pretty but functionally crippled application than to a developer who creates a highly functional and useful, but less pretty application.

Again, you just made that up. Who gives more accolades? Certainly not on this site. Show us the reviews where that is happening. I mean real reviews, not some random person posting on twitter.

'I give up, I know my application needs this functionality but because I don't know how to integrate it, I'll just claim I didn't include it because users don't need it'.

One example of that happening, just one.

In order to not drown in this highly competitive tidal wave of applications, you need to stand out from the crowd.
A highly distinctive interface is the best way to do this.

No. A good app is the best way to do this. Good apps look and work nicely. I have never seen a single app that looks good but doesn't work, and yet is highly rated amongst its peers.

Sadly - I, as a user, suffer from it. I don't like using iOS. I don't like using Android.

Sadly, you are in the minority. I love using my phone, and hundreds of millions of people do too.

And now a note about consistency from someone that actually designs user interfaces. Functional consistency is important, but visual consistency isn't (to a point of course).
It doesn't matter that certain apps have toolbars that are blue, and others are black, and others are green. This does not hurt usage of the app one iota and doesn't require extra thought. If you disagree, please point to a usability study that shows that interface colours and graphics decrease interface performance.

Consistency in behaviour is important, but also not critical to a good application. As others have mentioned, creative UI can be more productive than standard UI. Winamp is one example (very compact UI that could live in a titlebar when always on top). Another example that I use every day is the Remember the Milk app. The new version uses a very cool UI with multiple cards that can be swiped into view, and it is significantly faster to work with than the old "consistent" version.

Consistency is a crutch. It's a useful crutch, and in most cases it will make a perfectly good UI, but it is still a crutch. Most people that throw away standard components to make their own will make a worse replacement. However the 2% that are actually good designers and pour hundreds of hours into designing their own table view that perfectly matches their app will make brilliant things by smashing through the requirement for consistency.

The release of the iPhone, and more specifically of the App Store, changed user interface design practically overnight. A lot of people focus on the shift to finger input as the biggest change the iPhone caused in UI design, but in reality, the move from thin stylus to fat stylus (the finger) isn't that big a shift at all. No, for me, the biggest change in UI design caused by the iPhone, and later Android, is that 'consistency' lost its status as one of the main pillars of proper UI design.

Thom, the shift was not from stylus to finger. It was from pointer to multitouch. Instead of dragging on a scroll bar with a stylus, you just flick the content up or down. This directness in interaction makes a huge difference. It seems obvious now, but no one was doing it at the time.

Also, consistency is not just about how widgets look. Apps are more immersive than ever before. While people used to talk about interface, they now talk about interaction. What used to be merely UI is now UX. Old school UI, like you said, is just that - old school. UI by itself is not enough.

You are focusing on the half empty glass. I see an overflowing glass. Developers have never before put so much emphasis on UX as they do now. People are experimenting with the new media. And there are a lot of them. Think about it. We used to have a keyboard and a pointer. We now have gestures on a 2D plane (multitouch), gestures in 3D (Kinect, accelerometers, gyroscopes), voice assistants (Siri), locality (GPS, NFC). Sure, access to all this is new, and people will make mistakes. But believe me, developers have never been more acutely aware of interaction design than now. That's what I'm seeing in the trenches.

I remember back in the day when people used to give Apple a hard time for encouraging everyone to adhere to a set of UI guidelines.

On the DOS (then Windows) side of the road, you could code your UI pretty much the way you wanted.

Freedom was the cry. Why be limited to a company's perceived ideals for an interface?

Now I hear we have too much freedom in this arena.

Actually, Apple still encourages us to adhere to guidelines for both OS X and iOS. When you use Xcode, it helps encourage you to create the UI the "Apple" way, giving you (in most cases) guides accurate right down to the pixel.

The problem is, "people" expect something a little more (sorry for the word) "sexy". If you have two apps, one a boring stock-standard app and one with a sexy (sorry again) UI that just smokes and sparkles, then the nicer-looking one will get the $$, not the boring one, even if the boring one is easier to use, has more features and so forth. Well, geeks and some users work out which is better, but a lot of $$ is tied up before most people do.

I don't have an answer. Personally, I prefer the boring, guidelines way of doing things, but I really appreciate sparkle at times. GarageBand is a good example: dials, foot pedals and so on make sense.

I guess in the end, keep as much consistency as you can, but have fun with your apps too, and where appropriate, even create something new.

The same happened with movies. Old movies are slow and simple. Modern movies use many symbols (like using blur to represent dreams) that we understand because we grew up with them.

The same goes for applications. The repertoire is bigger now, as users have more GUI experience. However, what is productive depends on context. A DAW can be less intuitive but still provide a better workflow. You won't be able to learn Ableton or Cubase in 2 evenings. That is OK.

For a single-visit website, immediate productivity is imperative, and you have to play to the sites or GUIs the target audience is used to.

It does annoy me, however, when iOS designers don't use swipes for page turning. Or when iOS Safari doesn't seem to have undo. Or when websites send me to an inferior mobile version of their site. Or when iOS and other touch GUIs require precision touch, like getting rid of autocorrection. So some consistency with external use is important, but not on the widget level.

...comes not from designers... it comes from people higher up in the company hierarchy who think that because they are project leaders and CEOs, they automatically have design knowledge and know more about UI design than their designers. They insist that their ideas (make it pink - I like pink; make that bigger, no one will see it... hmm, that makes everything else look less important; make those other things bigger too, and add icons so that people know which bits they should be looking at...) make it into the design until what you have is a horrible mess. Sadly, their ideas are more often than not based on their personal preferences.

I remember the Amiga Style Guide, which came with Kickstart/Workbench 2.0 (1990), and most serious app developers adhering to it. And I remember switching to Windows 95 when C= bit the dust and being very annoyed with the lack of consistency (amongst a lot of other things).

Some of the first apps to annoy me with inconsistency were CD and MP3 player apps. For some reason it was unquestionable that such apps should try to look like a Hi-Fi system rather than a computer program. They weren't even allowed to have normal window frames.

Sorry, but this entire article could have been summed up as "I have OCD and I fear change". Just because you have problems with things being different or out of place doesn't make them worse - these inconsistencies have finally made computing open to the masses. Consistency never helped "normal" people use a computer - iOS, along with its apps, has.
These things irk me as well, but I can see that I am wrong, as I am sure can you. I suspect that your cutlery is all the same way up in the drawer, and a slightly open drawer must be closed.