Posted by Soulskill on Wednesday September 19, 2012 @08:13AM
from the how-many-devices-have-they-sold dept.

Hugh Pickens writes "Austin Carr notes that a number of user interface designers have become increasingly critical of Apple's approach to software user interface design. Much of their censure is directed against a trend called skeuomorphism, a term for when objects retain ornamental elements of the past that are no longer necessary to the current objects' functions, such as calendars with faux leather-stitching, bookshelves with wood veneers, fake glass and paper and brushed chrome. A former senior UI designer at Apple who worked closely with Steve Jobs said, 'It's like the designers are flexing their muscles to show you how good of a visual rendering they can do of a physical object. Who cares?' The issue is two-fold: first, that traditional visual metaphors no longer translate to modern users; and second, that excessive digital imitation of real-world objects creates confusion among users. 'I'm old enough, sure, but some of the guys in my office have never seen a Rolodex in real life,' says Designer Gadi Amit. 'Our culture has changed. We don't need translation of the digital medium in mechanical real-life terms. It's an old-fashioned paradigm.' One beneficiary could be Microsoft, where the design of Windows 8 distances itself from skeuomorphism by emphasizing a flat user interface that's minimalist to the core: no bevel, no 3-D flourishes, no glossiness and no drop shadow."

Apple's one-menu-to-serve-them-all approach is decidedly unfriendly when you have more than one monitor, and more and more of their machines come out of the box ready to drive several (a Mac Pro can trivially be configured to run quite a few monitors, and even a Mac Mini or a laptop will run two). What happens is that you're off on one monitor, you need a menu operation for the app you're working with, and the menu is one, two, or perhaps six monitors of mouse travel away. Menus on application windows make a great deal more sense.

Typists -- by which I mean people who really type a fair bit, like writers or serious programmers -- are not served well by Apple's low-profile "chiclet" keyboards. Apple is obsessed with making its devices thin, but thinness means keystroke throw is short, and what we end up with is a mushy keystroke.

In the middle:

Apple's one-app-at-a-time system UI messaging approach means that you can only send keystroke events to the active application. So, for instance, were you to write program B to automate program A while the user happens to be using program C, any attempt to control program A from program B would require you to shift the user's focus from program C to program A, which is decidedly unfriendly. AppleScript's mechanism for automation requires activating this app, then that app, which means that the user can't be trying to use the machine while the AppleScript is running. That is kind of a serious faux pas for what is nominally marketed as a multitasking machine.

There's no inter-program messaging paradigm other than the network. No named ports, etc. This also has severe implications for automation.
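If you're reduced to the network for local IPC, the usual improvisation is a loopback datagram socket standing in for a "named port". A minimal Python sketch of that pattern (the variable names and the OS-assigned port are just for illustration; any agreed-upon port scheme is up to you):

```python
import socket

# A loopback UDP socket standing in for a "named port": the receiver binds
# a well-known (here, OS-assigned) port, and peers send datagrams to it.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))          # port 0: let the OS pick one
addr = receiver.getsockname()            # the "name" peers must know

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"ping", addr)

data, _ = receiver.recvfrom(1024)
print(data)                              # b'ping'

sender.close()
receiver.close()
```

It works, but notice how much convention it takes (who owns which port, what the message format is) compared to a real named-port or message-bus facility.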

At the bottom:

UDP messaging is used to send network events in a broadcast manner. Apple's implementation of UDP only allows one program on a machine to bind to a UDP port, meaning that only one program on that machine can catch a broadcast -- which in turn means that if your implementation really needed a broadcast mechanism, you can't use UDP for it.
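For what it's worth, on BSD-derived stacks (which includes OS X) the stock workaround is SO_REUSEPORT: if every cooperating socket sets the option before bind(), multiple sockets (and processes) can share a UDP port, and each can receive broadcasts. A hedged Python sketch (the helper name is made up; availability and exact semantics of SO_REUSEPORT vary by platform):

```python
import socket

def make_shared_listener(port: int) -> socket.socket:
    """Bind a UDP socket to `port` so that other sockets can bind it too.

    On BSD-derived stacks (including OS X), every cooperating socket
    must set SO_REUSEPORT *before* bind() for the shared bind to work.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    if hasattr(socket, "SO_REUSEPORT"):   # absent on some platforms
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
    s.bind(("", port))
    return s

if __name__ == "__main__":
    first = make_shared_listener(0)        # OS-assigned port
    port = first.getsockname()[1]
    second = make_shared_listener(port)    # same port; no "address in use" error
    print("both listeners bound to port", port)
    first.close()
    second.close()
```

Without SO_REUSEPORT on both sockets, the second bind() would indeed fail with "address already in use", which matches the complaint above.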

---

That's just a sampling of UI issues with the OS. Against these rather immediate problems, I find the whole issue of make-it-look-like-[object] to be silly.

Don't know what the [object] is? It's a one-time learning trip down memory (or history) lane, and you're up and running. Operation is easy, even if, lawd forbid, you had to learn something.

On the other hand, when you need to get at a menu across a bunch of monitors, you're kind of hammered. It's time to go hunt for a third-party fix. If you need to really type, it's time to go buy a keyboard from a third party. If you need broadcast, I hope someone warned you the UDP stuff is broken so you don't waste your time trying to use it. If you're trying to implement IPC, well... [hollow laughter] I bet you'll wish you were working under Amiga OS before you even get seriously started. And no, Applescript won't get you even close because of the above-mentioned application focus issues.

Indeed. Besides, it is just textures. Architects went nuts in the '20s ripping out all texture and decoration to create a clean, pure look. But in the end people found it cold and inhuman, and cold concrete and metal get boring. So it helps to have a bit of variety and decoration. Not everyone wants to live in MUJI world or a Rietveld Schröder House. Besides, the textures offset the clean, simple hardware, and when you're looking at a screen all day, a bit of variety helps. Of course one might not like the particular textures, but that's a different matter.

This anti-skeuo fad is basically an artistic/aesthetic movement. Like it or hate it, I don't think they were looking at Win 8 from a functionality standpoint. It is the visual design equivalent of a group of people going around saying "You know, full-service gas stations barely exist any more. We should go on a crusade against people using the term 'fill 'er up'."

Or, more geek/nerd related: "Why the heck is a 3.5" floppy still the symbol for saving? We should invent some entirely new symbol or just use the word 'save'. Why do folders look like folders on computers? That's lazy thinking. In fact, we should consider calling them something other than folders too...".

As one designer I follow thinks, we need to get rid of the idea of "Save" altogether, and just have some sort of "Undo everything I did in this whole entire session" button. Saving is not a concept that people without computers are familiar with, so it's an idea that was invented FOR computers, and it's becoming increasingly unnecessary. It's not as scary as you think after you get used to it. Objects IRL like a whiteboard/todo list/grocery list on your fridge don't need to be saved. Construction workers do

Depending on the usage, having an "unending undo" could be prohibitively expensive. I can't imagine the requirements of such a system in a large Photoshop document, for instance; just having 20 history slots in such a document can get unwieldy. I think if we erased saving altogether and went with unlimited undo/instant commit, we'd start seeing some sort of "snapshot" system in the undo stack which essentially just reinvents saving in the traditional sense. Sometimes we want to start from a specific
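That convergence is easy to sketch: once an undo stack is unbounded, users want named restore points, and a restore point is a save by another name. A toy Python illustration (the class and method names here are invented for the example):

```python
class Document:
    """Toy editor state with an unbounded undo stack plus named snapshots.

    Illustrates the point above: a "snapshot" in the undo stack is
    functionally just Save with a different label.
    """

    def __init__(self, text: str = ""):
        self.text = text
        self._undo: list[str] = []          # full-state undo history
        self._snapshots: dict[str, str] = {}

    def edit(self, new_text: str) -> None:
        self._undo.append(self.text)        # every edit is "committed" instantly
        self.text = new_text

    def undo(self) -> None:
        if self._undo:
            self.text = self._undo.pop()

    def snapshot(self, name: str) -> None:  # "save" reinvented
        self._snapshots[name] = self.text

    def restore(self, name: str) -> None:   # "open an old version" reinvented
        self._undo.append(self.text)        # restoring is itself undoable
        self.text = self._snapshots[name]
```

A real implementation would store deltas rather than full copies, but the interface, and the storage-growth problem, is the same.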

Maybe we should go further, and get rid of the concept of computers - after all, computers are not a concept people without computers are familiar with.

There are many things people do in real life that are purposely throw-away, including builders. It is a terrible and retarded idea to think that everything must be saved instantly. All people have a mental concept like "scratch space" or "throwaway work". Even builders will try certain new things, e.g. techniques in a test 'throwaway' or practice environment before doing them on a real project. If it were true that everything should be saved instantly, why do we even bother having a pre-submit stage while entering forum comments? Hey, let's just show our forum comments live, continually even as we are typing them. This 'continually save instantly' meme is terrible - what is worse is that now we will have an environment where some software users get used to not saving their work, and then lose work in other software. But then maybe that's the idea - that users will subtly get angry at other non-Apple software for 'losing their work' - a sly psychological manipulation.

As one designer I follow thinks, we need to get rid of the idea of "Save" altogether, and just have some sort of "Undo everything I did in this whole entire session" button.

Apple IS trying that. Starting with Lion, they've been experimenting with such things - questioning why do we have "save" buttons, or even "quit". Granted it puts people off kilter, which is why they had to call it "Save As" in Mountain Lion, but the basic concept was the OS manages saving for you in the background. If you hit Save, it

I love their idea that we should update the application icons to be represented with more modern tools. Here are some icon suggestions for different apps that people might use based on the tool that they now use for that functionality:
Phone - cell phone
Calculator - cell phone
Note taker - cell phone
Music player - cell phone
Camera - cell phone
Web browser - cell phone
Photo viewer - cell phone
Facebook - cell phone
Calendar - cell phone
Alarm clock - cell phone
Contacts - cell phone

Skeuomorphic is if you've got (a representation of) vinyl records on a shelf, and you're supposed to pick them out with a mouse, and place them on an RCA record player, and then pick up and place the needle on the record. Which is basically nuts.

On the other hand, there's no problem at all having old-style icons. I mean, as you so wonderfully parodied, what are they supposed to be? An iPhone?

Agreed. The problem is most people don't understand the reason WHY Metro is a horrible UI -- hint: it has to do with context.

When you have a "flat" UI you have no *secondary* _visual_ cues to tell you what you can interact with and what you can't. You see this effect in many iPhone apps that have absolutely beautiful graphics (and backgrounds), and you have no clue what the hell is an actual UI "widget" that you can push, slide, etc.

OSX Mountain Lion is starting to fall into this trap by hiding scroll bars. When I need to scroll, such as by dragging the slider up or down, I can't even tell where it is unless I first do a dummy scroll. This is retarded.

With a more "traditional" approach with *some* 3D elements such as drop shadows and beveled corners, these widgets "stand out", so we have a more natural, intuitive sense for making the *critical* distinction between 2 UI elements:

* what is purely static and conveys information
* what I can interact with.

UI is *supposed* to be about making it EASIER for users by *helping* them think less and act more by streamlining their judgement process. 3D Buttons are a perfect example of this: Users internally are thinking "Ah, here is a button I can push -- OK, what does it do? Does it do what I think it does? Does it do what I need it to do?"

ALSO note that TOO many 3D elements is a hindrance. 3D Studio, Blender, etc. have HORRIBLE UIs simply because they *overload* the user with too *many* widgets. It is a design art form to maximize minimalism without sacrificing functionality. Sadly, too many UI/UX designers don't have a freaking clue about the fundamentals.

Without recognizing this deep contradistinction, UI designers are completely screwing users over, making them play the what-can-I-interact-with game. This is 5 steps backwards. *sigh* Someday everyone will realize we need to change the computer to fit US, instead of trying to change humans to fit the computer.

When you have a "flat" UI you have no *secondary* _visual_ cues to tell you what you can interact with and what you can't. You see this effect in many iPhone apps that have absolutely beautiful graphics (and backgrounds), and you have no clue what the hell is an actual UI "widget" that you can push, slide, etc.

Agreed. Or they don't follow their own gestural patterns they've instilled in people, such as the Calendar app in which you can't swipe through the months, but instead have to click on tiny arrows at the bottom of the screen.

Another motivation Apple designers have had recently is, "Let's get the clutter out of the way so users can focus on the content". In theory this sounds good, but in practice on Mountain Lion they have drastically reduced the size of UI widgets -- the ML window buttons (the "traffic lights"), the thin scroll bars. They're also hiding UI elements, which we see in the system scroll bars and in the chrome for QuickTime Player. This creates extra cognitive work for users, and IMO creates more of a net distraction than just having a tried-and-true fullscreen button. And I would imagine people with shaky hands or bad eyesight would have real trouble clicking on Mountain Lion's tiny interface elements. I certainly have to concentrate more to make sure I click on the buttons.

Either the changes to these basic UI elements are the embrace of a disastrous design philosophy by Apple UI designers, or Apple is slowly trying to phase out the mouse completely.

It's been reported that internally there has been in-fighting over skeuomorphism. I think the software UI design team at Apple is off the rails. They don't have a leader who understands the fundamentals, and Apple no longer has a leader like Jobs to tell them what works and what doesn't. [To be fair, Jobs is the one who first pushed skeuomorphism when he changed iTunes to the brushed-metal look.] I think Jony Ive should take over leadership of both the hardware and software design teams. When OS X was first released, its Aqua UI tastefully matched the hardware cues of the Macs available then. Now, it's a complete divorce. You have these sleek, intuitive forms that Sir Ive designs, and tacky, unintuitive software running on them.

When you have a "flat" UI you have no *secondary* _visual_ cues to tell you what you can interact with or not.

But isn't that self-evident with a tile-based interface?

It depends, but usually the answer is no. What are the visual indicators that some tiles can be pushed, flipped, moved, or rotated? Using different lighting on widgets you can interact with helps the user know, at a subliminal level, that they can interact with them, and they don't need to spend too much time thinking about how to i

Even under the most retarded configurations, the Control Panel in Windows is at most 2 clicks away. If it takes more than two clicks to get to something, the average user shouldn't be messing with it anyway (and if you've seen the average user, Mac or Windows, you'll agree with that point). Whenever I'm on a Mac I can't find shit; I spend 20 minutes trying to find it, and usually end up having to open Terminal to make the change because I know Linux/Unix systems significantly better. That is not a criticism of the UI; any inability to find something in the most efficient way is always 100% the user not knowing the system.

As you said, not knowing the OS is the problem. However, some things are clearly intended to distance OS X from Windows, while providing no usability improvement.

Just the other day, I spent some 45 seconds trying to delete a file the Windows way. First I realized there was no Delete key, which is annoying even if it's not strictly the OS' fault. So I tried backspace, which seems like a logical alternative when you want to delete stuff. Nope, so I tried right-clicking and all sorts of weird click+button combinations I could remember.

Only then did I remember that you must absolutely drag files to the trash.

Why must the user press two keyboard keys for a function that every other OS (yes, every other one) handles with a single key press, when a single press is the logical, expected command for it? How exactly is that the "superior" UI that keeps getting touted? That's intentionally making things more difficult for the user, and it is a perfectly valid UI criticism; there are obviously more that could be given.

Terminal rm is significantly easier to remember for those of us who are used to systems that work the

The problem with the one key DEL as opposed to the two-key Command+DEL, is that with the one key, it would be very easy to accidentally delete files. No big deal, right? You could just recover them from the trash... but that's only if you know you did that. Two-keys prevents that.

I'm also not sure why a single key would be expected anyway, when every other "command" is preceded by the Command key.

Accidental spacebar presses do not invoke a "destructive" command in the file browser.

On a modern Mac OS file browser (Finder) the spacebar invokes QuickLook, on Windows (Explorer) it selects a bordered file (or does nothing). Assuming you're not in filename edit context, of course.

The other issue with just using delete on files on the Mac is, you can't count on there being a "forward delete" key since even desktop Macs ship with the smaller, non-extended keyboard.

For the same reason, there is no filesystem "Cut" command on a Mac, via menu or shortcut keys. On Windows this is of course just the first part of a file-move sequence. It's inherently non-destructive so here, at least, Apple doesn't have good reason for excluding it.

Consider that a large portion of the world is using Windows, which lacks rm (although del serves the same purpose). Take it even further: a majority of day-to-day Windows users have never come in contact with the CLI; they wouldn't know a del from a grep from an ifconfig from an ipconfig. I'm betting that Command+Del on Macs is more of a legacy thing: it's how it always worked in Finder, so don't change it. Rather have the new users coming to OS X learn it the Apple Way than suddenly change it up on your ex

Open the System Preferences pane and enter a keyword into the search field; the potentially useful applets then get highlighted. Or do as noobs do: Google. :) Even I have to do that sometimes when I configure something... after all, I have several different Mac OS X versions running, and they all differ slightly.

Even under the most retarded configurations the control panel in windows is at most 2 clicks away.

The Control Panel, yes... but not all the things you would expect to find there. For instance, I bought a notebook that had a very annoying "feature" where you can tap the touch pad to click. One would expect it to be under control panel-->mouse. It wasn't. I finally found it in an icon hidden to the right of the status bar at the bottom of the screen, and it took at least twenty clicks to disable the awful,

If you complain about stuff like that on a Mac, I cannot imagine how you would be trashing Linux for all its quirkiness.

Quirkiness? How? You realize, I hope, that there is no one Linux; there are many different distros of Linux. Gnome? I hate it. KDE? In what way is it quirky? It follows every convention I can think of, and if you're used to Mac or Windows (any flavor of either) you'll find it very easy to use; much easier than any version of Windows. Windows is a usability nightmare. (I understand iOS is pretty usable, maybe even better than KDE; I haven't used it.)

Yet Command-Shift-4 then Space is an intuitive way to take a screenshot of a window?

OSX has its own pain points, especially if you are coming from an X11 DE type background. I installed VLC, for instance, and yet typing vlc on the command line gets me nothing. Just an example that is easy to fix, but I should not have to.

And you can put anything on your quick start/task bar for 1-click access in any version of Windows that came out in the last decade and a half.

Untrue. Win7 does not allow anything other than applications (.exe) to be pinned to the taskbar. There are workarounds, but they are painful and tedious. There are lots of other stupid un-features of Win7's taskbar and Start menu, but they've been discussed often enough in the past.

Actually, I found some of the webpages in the pics very well done. Take, for example, the burger-menu webpage: it's simple, elegant, and probably the best-done "choose what items you want on a burger" page, just because it's so simple.

I agree with the point that using faux object representations is cheap, wastes space, and can be lost on people for sure. But to go for Metro as an example of good design? Sorry, I'd take cheap wood and leather graphics with gradient overlays and shadow underlays any day of the week over that.

Agreed. I've not really played with Windows 8 much yet but I installed Visual Studio the other day and my first question was, why the flying fuck are the menus shouting at me in capital letters? Who ever thought that was a good idea and looked good or somehow improved the user experience?

The icons etc. look awful, and the Solution Explorer, which previously had nice familiar icons you could often pick out from the hints of colour on them, now shows bold, black-lined pieces of fairly nonsensical shit.

Yeah, like those awesome sorted grids of icons which make finding that one thing you want dead simple.

Or those application docks which make it obvious to users how to open a second instance of an open application or switch between multiple open instances.

Perhaps you were referring to media library organizers which use a completely different set of metaphors and visual cues from the file system and are essentially incompatible making it less difficult when users want to interact with their file browser... somehow.

Yea, as a Linux user I never thought I'd be supporting a Microsoft app, but...well, I just started a new job recently that uses Outlook. And I never had much of an opinion on it until reading this...because it's never made me think about it. It's never gotten in my way. Gives me my email and my calendar available in a single glance, everything just works...I've got no complaints.

Well -- actually I do have one complaint: There's no 'minimize to tray' type function (at least not that I've found) so it's always taking up space on my taskbar...but otherwise, no complaints.

So many things to criticize about Apple's UI direction (the tabletization of OS X, for example), and they criticize the thing Apple is doing right.

People like old-fashioned aesthetics. Nobody has a need for a sundial these days, but many people still decorate their yards with them. A wood bookshelf with real books stacked on it looks pretty, and people see it as part of Apple's software polish.

I was kind of opposed to the "tabletization" of OS X in the beginning, but now that I have used Mountain Lion for several weeks I have to say that it is a great idea. I enjoy Launchpad and the Notification Center a lot. Notes and Mail behaving exactly like my iPhone's is a big plus, especially since Notes is like Evernote but much, much faster. I really should mention Mail, since I had thought e-mail clients had kind of hit a ceiling, and that program proved me wrong. Reminders are OK but nothing spectacular. The deep App Store integration is also a good thing, considering that OS X Lion and Mountain Lion break a lot of old software, Photoshop included, so when you get something from there you never have to wonder "will it run on my Mac?".

I think what Apple is doing wrong is breaking application support. I was very annoyed at not being able to use most of my games and a lot of software with the latest releases. When Apple was using Rosetta to run PowerPC programs, they were doing fine. Once they took the attitude of "update your apps or else", it really made me appreciate all the hard work Microsoft has done in that area. I can still run a lot of old stuff on the latest Windows, and even DOS applications can be run with free emulators like DOSBox. There is no way that I know of to run an emulated OS X 10.2 or similar on a Mac.

Really? There's so much to criticize about Apple's design, like OS X's big and cluttered dock versus a traditional taskbar, and they go straight for the superfluous fluff? Who cares about the icons? They are just fucking icons; replace them if you want to! What the hell happened to functionality in this world? It's like no one cares anymore, and "design" only means "making shit look fancy".

Not the point I was trying to argue, but ok: first of all, the dock isn't just a Mac issue. Since Windows 7 the taskbar has been "dockified", with pinned, textless icons (though it's easy to revert it to a more classic mode) and Linux has had all sorts of docks and taskbars for quite some time, now. My opinion isn't about OSs, it's mainly about docks. It just happens that OSX has not a lot of options other than its main dock, which leads me to explain my point of view: the taskbar is more useful because it

Seriously, as a designer myself I can only shake my head when I read stuff like this.

It may be true that "traditional visual metaphors no longer translate to modern users", but what about older users? Should we just dismiss their needs? Are interfaces really encumbered because they feature a wood-textured background?

Also, I challenge you to come up with a symbol for saving files without using a diskette or something like that. These symbols have evolved from metaphors for real objects into metaphors for actions, and people who have never even seen a diskette learn their purpose by context. Granted, this creates a certain standard by convention, and you could argue that any symbol could be used for that. But again, that would dismiss the users who grew up with that symbol. Currently everybody is happy; why challenge this?

Imho, articles like this and blogs like skeu.it are just cleverly disguised marketing by Microsoft. Ask any designer, and they'll tell you that well-used skeuomorphisms are not only unproblematic, but necessary to reach most of your target audience.

'Need' is quite an interesting term to use when discussing faux leather stitching on a calendar app.

Of course, I disagree with this UI designer as I think it is important to provide visual clues in an icon that denotes its purpose/function. If it helps people realize that the icon with a Month and the number 31 is a calendar, well, then it does serve a purpose.

I'm not knocking you, I just thought the idea of considering an older person's needs when referring to this topic

Fair point, the use of the word "need" seems misplaced. English is not my native tongue. ;) What I wanted to express is the following.

I handed my 83-year-old, technically illiterate grandma an iPad and she was able to use most of the apps because they resembled physical devices she knew.

Of course she doesn't "need" to use a digital calendar, or even an iPad. But that device and ample use of skeuomorphisms are enabling her to participate in a lot of places which were inaccessible for her before. It makes a lot of people feel familiar with usually (for them) almost frightening devices.

This is empowerment, and as long as nobody else is hindered I think the debate is quite pointless.

Wood veneer IS wood. It's a more efficient use of the wood. FAKE veneer is printed paper. That I don't care for, mostly because it peels. Modern people aren't unused to seeing wood.

And please, brushed chrome? It's timeless - and it's metal. One hundred percent of the people I know are used to seeing chrome.

"One beneficiary could be Microsoft, where the design of Windows 8 distances itself from skeuomorphism by emphasizing a flat user interface that's minimalist to the core: no bevel, no 3-D flourishes, no glossiness and no drop shadow."

The icon that fucks me off the most is the one for the iOS Maps application. The US interstate route sign in the icon (i.e. Route 280) makes absolutely no sense to anyone, young or old, outside of the United States. A globe or something similar would make more sense...

If you're on a road, then every road sign looks like a road sign. Off a road, however, an unfamiliar shape that looks like a shield is going to look like a shield. The only question is why "280" would be the heraldry.

Of course once you've gone completely flat and removed all the ornamentation, it makes one wonder where the next generation will go. Perhaps someone will suddenly realize, wow, we can make those tiles look just like a 3D image of a smartphone (and, of course, be promptly sued for rendering them with curved corners).

If OS X and iOS are bad then iTunes is a crime against humanity. And I think that's because the original program came from outside Apple [wikipedia.org].

I feel like Apple's UI can be compared to Disney's take over of animation stylings. Before Disney, you could find a whole variety of animation styles. But the vision of Disney was to make everything round and smooth and beautiful. Every animation cel was to look like a masterpiece portrait -- because that was the general populace's desired art at the time. And that's what Disney was trying to make, animated art. You might have found a sharp edge on a villain like Jafar in Aladdin but the main character would be round and warm. Others tried to mimic the stylings and it became a de facto standard mostly because it sold.

Similarly, Apple has done its UIs to be as beautiful as possible. And they've done it really well, and it's expensive (I'd imagine both computationally and in price). Both Steve Jobs and Walt Disney appeared to be these monolithic men pushing this new way (in reality it was probably a bunch of artists in a cohesive team), but they've both come and gone. And Apple clings to that vision, but the vision never changes.

What happened to Disney was that another production house, Nickelodeon, slowly discovered that square and rigid corners were not only acceptable; SpongeBob SquarePants became an icon. Gross humor could be applied to shows like Ren & Stimpy, and some people enjoyed this more than the safe beauty of Disney. Disney has no grit because Walt Disney wouldn't allow it. Disney got into disagreements with Pixar about Toy Story 2, and I think it is best if they leave Pixar separate from Disney despite the acquisition. Similarly, in the future Apple will be usurped by someone who is willing to experiment and deviate. Jobs is dead, so Apple is committed to his vision... probably until they go under. They'll acquire new ideas along the way with their massive piles of cash, but what happens when those visions are at odds with The Great Master who has transcended to Nirvana? That's still a long way off, but these rumblings of criticism just show you can make another interface that is completely the opposite of Apple's and actually do well.

Disney was established in 1923. Animation was in its infancy. Filmmaking was in its infancy. Such a statement needs clarification.

But the vision of Disney was to make everything round and smooth and beautiful. Every animation cel was to look like a masterpiece portrait -- because that was the general populace's desired art at the time.

Citation needed. Disney has almost ninety years of animation history with a range of divergent styles. I can't say what 1920s America looked for in its art, but I can certainly say that animation was a novelty at the time.

And that's what Disney was trying to make, animated art.

Again, citation needed. And also clarification... Disney the company? Disney the man? Disney the man started by making shorts such as Steamboat Willie (1928) [slashdot.org]. The point of that short wasn't to make art, but to entertain. Disney the company has been making a range of animated films of many different styles for years. All can be described as "art". Even Steamboat Willie.

You might have found a sharp edge on a villain like Jafar in Aladdin but the main character would be round and warm.

Now we are in the Eisner era. This needed to be noted at the start of the argument.

Others tried to mimic the stylings and it became a de facto standard mostly because it sold.

What others? And seriously... do you think Disney was the first to use lines, curves and edges to depict character stylistically? That's a ludicrous statement which needs a citation.

That's just the first paragraph. It may make great banter for cocktail parties, but it means nothing.

Even though I don't think that skeuomorphism is the way to go about it, people just want something that looks interesting. They are also willing to pay for the cosmetic changes from version to version, so it makes business sense too. Pretty much everything goes through these stylistic trends. Clothes, cars, and other home electronics come to mind.

A second even though: even though I'm not big into fashion or appearances, I also want the computer screen to look interesting. The standard OS X and Windows 8 interface is a bit boring in my mind, simply because I am staring at it for hours a day and for days on end.

I think there's such a thing as 'over-skeuomorphing', but I do find it serves a purpose. Those shelves might not be real shelves, but they emphasise that those icons are books, not apps or games or anything else. And by using the same stitched leather across the iPhone, iPad and Mac versions of the calendar app, it emphasises that the data you put in is shared between these apps. Same for the Reminders app. And the Notes app.

I also think that having a strong visual identity for an app can make it more fun to look at and use, if that's your thing.

I admire the slickness of Windows Phone, but it just feels a bit too depressing, bland and clinical for my liking. I don't feel like I'm supposed to have fun when I'm using a Windows Phone.

I find this whole skeuomorphism thing to be tenuous at best. I'm 26 and have never used a rolodex nor a leather calendar book--and my phone hasn't looked like a corded handset since I was seven. But so what? I love the way all that stuff looks. There is a reason people go in for retro styles in the first place. We like that connection to the past. And to say that we are confused, simply because we're young is preposterous. We grew up on television. We've seen it all. Sure, we may laugh every time Jack McCoy picks up his tethered phone and flips through his rolodex to find another lawyer, but we aren't idiots. We know how this stuff works, and frankly I prefer the organic look of real objects to the sterile hospital environment of Google's design team. Just because the thing is digital does not mean it should look like it was designed for a Star Trek shoot.

"Skeuomorphism" irritates me as much as "cloud" and "mash-up" before it. The simple term is "metaphor", the pre-2012 standard term for this approach to UI. Are the anti-skeuomorphists proposing that every GUI OS now give up the folder metaphor? You know, underneath they're "directories".

I'm guessing the objection is to the photo-realism of the metaphor rather than to the metaphor approach itself. Showing a 24-bit image is like having the joke explained to you. It also adds frustration and cognitive dissonance when the metaphor, which is an analogy after all, breaks down -- when it doesn't operate exactly like what is being portrayed in 24 bits. Then it's not just having the joke explained; it's also a bad pun.

When desktop UI metaphors are rendered in 1-bit, they take on a suggestive and less specific meaning; users understand them as hints, and they do not rise up in rebellion with endless trade articles about "skeuomorphism".

First off, there are many types of metaphors, so from a purely taxonomical point of view, there's nothing wrong with introducing a word that describes a particular type of metaphor. Secondly, it's not necessarily about metaphor. Skeuomorphism refers to the practice of carrying over unnecessary elements from one version of a product to another (in most cases, going from an analog or physical object to a digital replacement). For example, the click that you hear when your phone's camera takes a picture.

Nothing makes you think you made a great $500+ tablet purchase quite like looking at a minimalistic interface. And is it such a bad thing for your child to ask why an icon looks the way it does? There's nothing wrong with a silly 'Back in the day, books were actual tangible objects bound in leather. In fact, your crappy plastic car interior is emulating it!'

The thing that I find very strange about Apple's UI people's obsession with ultra-tacky stitched leather borders, disgustingly twee fake paper calendars, little 'wooden' shelves for ebooks, and similar rot, is how sharply it differs from the work of their hardware guys...

On the hardware side, Apple's aesthetic is one of practically brutalist honesty to their materials, and a fairly relentless drive to unify surface and structural elements (i.e. aluminum unibodies rather than ABS-clad magnesium- or steel-skeleton designs, that sort of thing). It is really quite jarring. Their hardware guys appear to be iterating toward the monolith from 2001, and then you turn the device on and *BAM* -- punched in the face by '90s shareware UI...

Touching objects on a screen that visually resemble the physical things that performed the same function before we had PDAs and smartphones is ludicrous!

I'd much prefer a CLI so I can type "cd /usr/bin" then "./phoneapp dial -domestic +13125551212" whenever I want to make a phone call, and "./phoneapp hangup -log /var/log/calls.today" when I hang up and want to add the details to a log file. That's much easier for me to understand, and should be self-explanatory to anyone if they just read the command. :)

If that's just too hard for some people, I guess we can have a GUI with red and green icons with antiquated pictures of analog handsets on them, for now. But those should eventually be deprecated in favor of some newer, more modern representation of what a phone looks like.
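To really commit to the bit, the tongue-in-cheek workflow above can even be stubbed out in a few lines of shell. Note that `phoneapp` is entirely made up; this sketch just echoes the dial request and appends hangups to a log file, nothing more:

```shell
#!/bin/sh
# Stub of the hypothetical "phoneapp" CLI from the joke above.
phoneapp() {
  case "$1" in
    dial)
      # $2 is a flag like -domestic, $3 is the number to dial
      echo "dialing $3 ($2)"
      ;;
    hangup)
      # "-log <file>": append a timestamped record of the call
      [ "$2" = "-log" ] && echo "call ended $(date '+%Y-%m-%d %H:%M')" >> "$3"
      ;;
  esac
}

phoneapp dial -domestic +13125551212   # prints: dialing +13125551212 (-domestic)
phoneapp hangup -log ./calls.today     # appends one line to ./calls.today
```

Self-explanatory to anyone, as promised.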

Skeuomorphs become a problem when they:

rely on an ancient methodology people won't be familiar with (the Rolodex example is a good one)

break the metaphor (infinitely long three-ring-binder pages)

force the metaphor by withholding obvious shortcuts (requiring a separate pencil-eraser tool to be selected when Undo or backspace would suffice)

Non-skeuomorph apps have the same kinds of problems in many cases: fat margins, "iced" or unstretchable dialog box layouts, the inability to copy pretty much any visible text to the clipboard, flat coloring that lets different entities merge together.

I haven't found my ideal window manager yet. It seems like 99% of the mouthbreathing userbase likes fully sovereign/maximized applications, which breaks down on massive displays. A lot of people seem to like magnetic window edges that "help" align things neatly and nicely at all times. I'm the opposite: I like windows to be scattered and of different sizes, and if there are a few too many, to overlap such that no borders line up. This is almost a skeuomorph of a real desktop, where different papers overlap generally but never exactly.

People seem to go on about making flatter colors and simpler framing, but I like the visual cues of shading and shadow, of increasing or decreasing contrast to draw attention. The Metro stuff looks like a wall of sample paint chips you see in Home Depot, or the funny hospital triage menu interface in Idiocracy. No, I don't want to run "Afternoon Eggshell Delight" nor do I want to have to hunt the wall for it.

Sturgeon's Law: 90% of everything is crap. That includes examples of skeuomorphism. However, that's not the reason to throw it out.

This is just a pet peeve of an editor and not of general interest. Skeuomorphic design isn't inherently evil for users; it's just that a lot of UI designers get annoyed when people ask for it and they can't try their less constrained designs. I sympathize with the backlash against the plebeian scum of the business world, but those plebeians are also the designers' customers. This is an attempt to convince people that these designs are objectively bad, in order to have more firepower to resist them when they are requested.

I'd have to say I don't think it's such a bad thing, nor does it set a new precedent by any means. This type of thing happens in (human) languages all the time. In English, we still use words and phrases such as "he is in the limelight." How many people actually know that that refers to the way stages were actually lit back in the 19th century? Should we replace this phrase because it refers to something most of us have never physically observed? Of course not. Yet, some things in language do evolve.

Why? Because MOST of the users (and potential users) out there understand these things. You can make an interface bare and functional, eschewing all references to physical or mechanical analogs, and you will make the computer-literate crowd swoon. Compare that 2-4% market share with the "rest" of the population, and you can see why people are buying this "poorly designed" Apple hardware. It's familiar and comfortable -- and it's (mostly) well done and visually pleasing from a graphic design standpoint. Never, ever underestimate the value of familiarity.

I don't think either implementation makes the applications easier to use. They seem to have been done for no other reason than "we can".

Mountain Lion's implementations aren't as awful, adding back most of the 10.6 functionality to iCal and making Address Book usable without constantly clicking between screens. However, having gone this far, it would be trivial to remove the stitching and faux leather.

I think it is clear that Apple is a hardware company FIRST, then a software company. If Apple applied some of their hardware design principles to their UI design, we would be seeing some highly evolved and, hopefully, massively well-received UI design. I think people want to use Apple's hardware and simply have to put up with the software, while of course assuming the software must be on par with the quality of the hardware.

Considering how much evolution has been seen in Apple's hardware over the last 10 years, you would expect the software to have kept pace.

I've come across no UI design that is perfect, but these guys pick on something that Apple actually does "correctly", while trying to cite Windows 8 Metro as the better way to do it, which is an extraordinarily dubious stance to take.

I am not a professional UI designer, but from what I've learned about skeuomorphism, skeuomorphisms are powerful when used correctly. For instance: present a group of people with a large cornflower-blue square and ask them "What is this used for?" and you will get a lot of different answers (an output area, a blank picture, an empty container, no idea). Present the same group a square with a wood-grain texture and ask the same question, and many will immediately gravitate to "this looks like a flat wooden surface", which often calls up "an area I can put other things on". Even though functionally both the blue square and the wood-textured square can be coded to do the same thing, the texture adds a skeuomorphism that gives a big hint about the function.

Now look at what was just pointed out here with Metro and the various gadgets found on Mac OS X. I think it is dubious when people cite a colored square with text as "better" than something that looks like a notepad with a checklist on the Mac. There are drawbacks to doing it that way on the Mac, but it sure as hell isn't "confusion"!

Keep in mind that Apple's early slogan (for the Mac, anyway) was "The computer for the rest of us". That "rest of us" bit referred to the folks who weren't computer geeks who loved the command prompt. In order to make the Mac welcoming, they tried to use plenty of metaphors which were already ingrained into the minds of potential users. Heck, even the very idea of a desktop is like that, where you pick stuff up, set it down somewhere else, windows overlaying like sheets of paper. The point is, Apple seemed to try (more than their competitors, at least) to create as many "Oh, this is just like what I do in the physical world, already!" moments as possible, so that, from first use, the user found the Mac to be familiar and welcoming.

Now... it sounds like the argument being made is "Yeah, yeah... but those days of the never-bought-a-computer consumers are over. Now that we've got them on board, let's start cutting those ties to meatspace". However, doing so makes me immediately think of Photoshop. If you started with Photoshop when it was version 1.0, and if you grew up with the gradual addition of features as they appeared in the many versions, then you're fine. Frankly, I weep for anybody who has to learn today's Photoshop with no previous experience of it. About a decade ago, Adobe itself realized it had this problem and came out with Photoshop Elements (and you can make the same argument with Premiere and Premiere Elements) as an intermediate step to get users acclimated to Photoshop paradigms without just throwing them into the deep end of the pool. (For those about to argue that Photoshop Elements was, instead, an attempt to tap into a "pro-sumer" and amateur market which was priced out of Photoshop... yes, it was that, too... but it wasn't all that, or else Elements would have just been Photoshop with a bunch of the powerful features taken out. Instead, Elements had a bunch of UI changes which made it easier to use; there was now a red-eye removal button, instead of having to lasso or magic-wand and then use a spot-healing tool or whatever. It introduced the user to being able to successfully manipulate pixels, without the learning curve being way too steep.)

So, that's what I think of when I see the calls for Apple to abandon skeu... that the ship is full of passengers and it's time to shove off, take those passengers to further shores, and leave the rest of the folk on the docks. And I think that's a departure from what Apple has always tried to be.

Lastly, I gotta say... I grew up with MS-DOS... did 8086 and 6502 assembly... nuts-and-bolts stuff. I hated Apple with a passion for years as being "foofy". Nowadays, however, when I play with my iPhone or iPad, I find all of the real-world metaphors in the UI to be very heartwarming. The stitching on the leather in the Notes app... I look at that and it's a little like sipping hot cocoa.

My biggest problem with OSX has always been the lack of Alt- sequences -- key-stroke sequences that allow you to access menu items without touching the mouse. As a power-user of at least two productivity applications (Word and Excel), I have forever avoided *unnecessary* mouse usage by memorizing my favorite sequences like Alt-e-s-t (Paste->Special->Formats). My use of these applications is, frankly, bewilderingly fast (pat, pat), in the eyes of users who use drop-down menus to access these same functions. If you have never seen someone use Excel without ever touching the mouse, you should: you will learn something about user experience and interface efficiency.
In *some* previous versions of OSX you could turn on Alt-sequences; in others, not -- I bought a used MacBook Pro in 2005 and couldn't figure out how to get these to work after ~10 hours of research, so I resold it a month later. I frankly don't use Macs enough to know whether it's easy to do this now, but from casual use I know it isn't available by default, which is silly, whereas on Windows it is. And thus Windows encourages developers to include these sequences, which is a real boon for every app where they work.
Mice are great, but they are slow! Why would you ever want to aim three clicks when you could type four letters? Imagine if you had to type text in Word, Excel, VS or Eclipse by clicking an on-screen keyboard with your mouse... you'd probably just give up and write with pen and paper (or a manual typewriter), and hire some low-wage laborers to do all that slow, boring clicking. That's how I feel when I use Excel on a Mac.