Blind, Deaf, Dumb & Broken Computer Metaphors

In Second Nature: Brain Science and Human Knowledge, neurobiologist and Nobel Prize winner Gerald Edelman theorizes that pattern recognition and metaphor are the basis for all thinking. When we discover something new, unknown, or abstract, we use metaphor as a bridge from the old, which we understand, to the new, which we want to understand. Metaphor, then, is not just a way to communicate or a way to teach; it may well be the very way we think.

Definition and Purpose

In mathematical terms, metaphor is the flat assertion that A = B.

In Romeo and Juliet, Shakespeare used the sun as a metaphor for Juliet:

But soft, what light through yonder window breaks? It is the east, and Juliet is the sun.

Now obviously, Juliet is not literally the sun; A is not literally B; so what do we gain by connecting these two seemingly unconnected things?

Metaphor makes an implicit comparison — an intuitive perception — of the similarity in dissimilars. It captures a key aspect of one thing by relating it to something else. Metaphor is so essential to our being that it is impossible to describe emotions, abstract concepts, or complex ideas without it.

The job of the metaphor, then, is to find similarities in unlike things, to teach us the new by building upon the old. Its primary purpose is to carry over existing descriptions to things that are so abstract that they cannot be otherwise explained.

A metaphor is a kind of magical mental changing room, where one thing, for a moment, becomes another, and in that moment is seen in a whole new way forever. ~ James Geary, “I Is an Other”

Computing Metaphor

The early computer makers recognized our need for patterns and created metaphors to help us bridge the gap between the old and the new; between what we knew about the world that we lived in, and what we needed to know in order to successfully navigate the new world of computing. The better the metaphor — the better the connection between the old and the new — the better the user experience.

BAD, BAD METAPHOR

This jet plane user interface is an example of a VERY bad metaphor because there is virtually no metaphor at all. The light switch, the radio switch, and the ejection seat switch look exactly the SAME and are located inappropriately close to one another, which is exactly the OPPOSITE of what one wants a metaphor to convey. One tiny mistake or slip of the hand and — whoosh!

DESKTOP METAPHOR

The computing metaphor that emerged from the ’70s and ’80s was the desktop metaphor.

Steve Jobs described it this way:

The desktop metaphor was invented because one, you were a stand-alone device, and two, you had to manage your own storage. That’s a very big thing in a desktop world.

Presciently, Jobs then added the following:

And that may go away. You may not have to manage your own storage. You may not store much before too long.

From 2000 until…well…until today, Bill Gates and Microsoft tried to discover the proper metaphor for the tablet.

— They had the right form factor. The tablet was, in fact, the future of computing.
— They were innovative in their use of the stylus as an input device.
— But they had the metaphor all wrong, wrong, wrong.

Microsoft persisted in imposing the PC desktop metaphor onto the tablet, long after it was painfully clear that it was the wrong thing to do. A desktop metaphor works well on a desktop and even a notebook computer. However, it is a disaster on a tablet.

Why? The metaphors of menus, scroll bars, and tiny buttons all work well on a desktop device because the mouse can easily locate and click on a single pixel. But on a tablet, a mouse is impractical and stylus input is clumsy at best.

TOUCH METAPHOR

Touch input removed an entire layer of abstraction from computing input, but it also required the creation of an entirely new user interface, built from the ground up.

The finger, unlike the stylus, was imprecise and hit multiple pixels at once. The finger, unlike the cursor, obscured one’s view of the screen.

What to do, what to do?

— Menus? Too, too small. Replace them with large buttons.
— Scroll bars? Too narrow. Replace them with swiping, up and down, left and right.
— Tiny buttons? Fugetaboutit. Replace them with huge, easy-to-see, easy-to-touch targets.
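The “replace tiny controls with big targets” remedy can be expressed as a simple design check. The 44 × 44 pt minimum tappable size below is the figure Apple documents in its iOS Human Interface Guidelines; the helper function itself is a hypothetical sketch, not any real framework API:

```python
# Minimal sketch of a touch-target audit. The 44 pt minimum comes from
# Apple's iOS Human Interface Guidelines; the helper is illustrative.
MIN_TOUCH_PT = 44

def touch_friendly(width_pt, height_pt, min_pt=MIN_TOUCH_PT):
    """True if a control is large enough to hit reliably with a finger."""
    return width_pt >= min_pt and height_pt >= min_pt

print(touch_friendly(16, 16))    # classic desktop scroll-bar arrow -> False
print(touch_friendly(120, 44))   # touch-era button -> True
```

A desktop-style widget fails the check precisely because it was designed for a one-pixel pointer, not a fingertip.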

Blind, Deaf, Dumb & Broken Metaphors

A bad metaphor creates a cognitive burden, a “tax” on the brain. A bad metaphor won’t kill a user interface, but it will maim it and cause it to limp painfully along. Computing should be enjoyed, not endured. The better the metaphor, the easier the computer is to use and the better the user experience. The worse the metaphor, the worse the user experience and the less likely it is that the device will be embraced by a mass audience.

PHYSICAL MIXED METAPHORS

(Windows 8 is) like driving a car that has both a steering wheel and a joystick. ~ Michael Mace questions Microsoft’s sanity

A physical mixed metaphor is so obviously a bad idea, so totally impractical, that it seldom exists outside of the lab.

VERBAL MIXED METAPHORS

A verbal mixed metaphor is a succession of incongruous or ludicrous comparisons. When two or more metaphors (or clichés) are jumbled together, often illogically, we say that these comparisons are “mixed.”

Stick these examples of mixed metaphors in your pipe and chew them over:

Mr. Speaker, I smell a rat. I see him floating in the air. But mark me, sir, I will nip him in the bud. ~ Boyle Roche in the Irish Parliament

The walls had fallen down and the Windows had opened, making the world much flatter than it had ever been–but the age of seamless global communication had not yet dawned. ~ Thomas L. Friedman

The moment that you walk into the bowels of the armpit of the cesspool of crime, you immediately cringe. ~ from Our Town, N.Y., cited by The New Yorker, March 27, 2000

And my favorite:

All along the untrodden paths of the future I can see the footprints of an unseen hand. ~ Sir Boyle Roche

Less Can Be More; More Can Be Moronic

Never assume the obvious is true. ~ William Safire

We all know that physical and verbal mixed metaphors are a bad, bad idea. Why then, is it so hard for us to recognize that mixed computing metaphors are a dangerous drain on our cognitive abilities too?

I guess it just seems obvious that two is better than one, that more is better than less. But when it comes to metaphors, nothing could be further from the truth.

There is great power in a consistent metaphor. In fact, it is worth sacrificing computing power (and its underlying complexity) if that power comes at the cost of the metaphor. This is the great paradox that tech pundits fail, over and over again, to comprehend.

Examples of Blind, Deaf, Dumb and Broken Metaphors

Universal Operating Systems: A Chimera, in Greek mythology, is a fire-breathing female monster with a lion’s head, a goat’s body, and a serpent’s tail. The word also means “a thing that is hoped or wished for but in fact is illusory or impossible to achieve.” That exactly describes and embodies the fantastical wish for a single operating system. A single operating system that runs on touch input devices (phones and tablets) as well as pixel specific input devices (notebooks and desktops) is as absurd as putting a lion’s head and a goat’s head on a single animal.

Conclusion

My advice is to focus on the metaphor first. If the metaphor is not intuitive to a 6-year-old, or a grandmother, or even a gorilla, it may be too complex.

Two final thoughts from sources as diverse as Sesame Street and Warren Buffett:

One of these things is not like the others,
One of these things just doesn’t belong,
Can you tell which thing is not like the others
By the time I finish my song?

Should you find yourself in a chronically leaking boat, energy devoted to changing vessels is likely to be more productive than energy devoted to patching leaks. ~ Warren Buffett

MY PREDICTION

It is a test of true theories not only to account for but to predict phenomena. ~ William Whewell

I’m putting my metaphorical money where my metaphorical mouth is, and flat-out predicting that NONE of the above hybrid operating systems and hardware options will go mainstream. Oh, they may well survive, but none will thrive. That’s my story and I’m sticking to it…

John Kirk

John R. Kirk is a recovering attorney. He has also worked as a financial advisor and a business coach. His love affair with computing started with his purchase of the original Mac in 1985. His primary interest is the field of personal computing (which includes phones, tablets, notebooks and desktops) and his primary focus is on long-term business strategies: What makes a company unique? How do those unique qualities aid or inhibit the success of the company? And why don’t (or can’t) other companies adopt the successful attributes of their competitors?

“a truly magical and revolutionary product.” -Steve Jobs. Yeah, he really said that. What’s a revolution without some magic? The next version will have voodoo.

“If you see a stylus, they blew it”-Guess Who??? Yes, we believe in fingers. No more knives and forks. If it can’t be eaten with fingers, it’s not worth eating.

“Android slavishly copied”-SJ Yet still manages to suck!

And let’s not ever forget…”Don’t hold it that way” -Do I have to say it? Because there are approved ways of using your fingers.

FalKirk

‘Now obviously, Juliet is not literally the sun’ –Nah… she’s not that bright! ~ klahanas

Mayhaps, Romeo was saying she was that hot.

klahanas

Read on a wall somewhere that if you go with her, you get burned… 🙂

Kizedek

Yeah, he always thought that the only thing that could keep them apart was an astronomical unit.

marcoselmalo

“Yes, we believe in fingers. No more knives and forks. If it can’t be eaten with fingers, it’s not worth eating.”

What food physically requires a knife, fork, or spoon for its consumption? The argument Jobs was making is that it is a mistake, a half measure, to require a stylus to operate a tablet. Thus, on an iPad, a stylus is an *optional* input device when more precision is desired (generally in drawing apps). Similarly, a keyboard is an optional input device to achieve greater speed and accuracy for those trained on physical keyboards.

klahanas

Soup!

jfutral

You haven’t lived until you’ve had a mug of hot pudding. No spoon required. Seriously.

Joe

marcoselmalo

You can drink soup straight from the bowl. Indeed, in many cultures this is quite normal. In the case of fondue, you are using the same instrument (the fork) to eat with as you are using to prepare it. You can quite easily remove the cheese-covered bread from the fork and eat it with your fingers. Porridge can be slurped from the bowl if thin and eaten with fingers if thick. Etc., etc. Eggs, same. Roasted marshmallows, same as fondue. Pudding? Apparently you’ve never enjoyed “licking the bowl,” which doesn’t even require fingers.

None of the foods you mention require an eating utensil to consume, if we set aside polite custom. If you can think of a food that absolutely requires a knife, fork, or spoon (or chopsticks), I’d be most interested to know. Please remember, we are separating custom from function.

I agree. I just loved it so much, I couldn’t help throwing it in. My bad.

Kizedek

Unless it brings some little touches to the mix, like your wastebasket becomes a set of bomb bay doors or something.

youbet

It’s sandpaper AND facial cream!

stefnagel

All along the untrodden paths of the future I can see the footprints of an unseen hand. ~ Sir Boyle Roche Sorry. Just had to repeat it. Ha.

FalKirk

So, so good. Just loved that one.

Mayson

“Why we should put ourselves out of our way to do anything for posterity, for what has posterity ever done for us?” Ibid.

stefnagel

Ha

klahanas

So you could be a “Sir” and still be a moron. There’s hope for the rest of us… People make their titles, titles don’t make people.

Kizedek

I tend to think that being British he was just displaying the national traits of subtle irony and self-deprecation: Elements of humor that much of the world completely misses, more often than not. We collectively apologize that these elements don’t come with a built-in facial expression, a biting tone of voice or other signal like an “s-tag” that immediately identifies them as irony — signals which identify the less subtle subset called sarcasm.

Two “faults” in one sentence (see/unseen as well as footprints/untrodden) make taking it at face value a little implausible, even if he is the most inbred of British royalty, don’t you think?

klahanas

I can definitely see that point of view. It’s certainly possible it was the case. My own disdain for pretense (“Sir”) strongly influenced my perception. I was thinking of Dan Quayle with a title… 🙂

Kizedek

Ah, that’s an image I’d want to forget. There’ve definitely been some good choices for conferred titles — like Sir Tom Jones and Sir Jony Ive.

klahanas

Personally, being a non-Brit, I have the privilege of only acknowledging knighted rock stars. 🙂

Jones, Geldof, McCartney, etc…

stefnagel

Good point. But it seems Roche was well known for his malapropisms. Here’s another beaut: “Why we should put ourselves out of our way to do anything for posterity, for what has posterity ever done for us?”

stefnagel

Metaphors are key to thinking about Apple products. Spot on.

Years ago, Ikujiro Nonaka did work on tacit vs. explicit knowledge, which included the use of metaphors, much like koans, used by Japanese companies, e.g., Panasonic. Digging around, I see Nonaka did a paper on Apple 1.0 and Canon back in 1991. Here’s the money quote: “… most important in the innovation process is the problem creation moment. That is, the positing of the correct problem, which allows the solution to be discovered …”

So what we solve is based on what we are looking at. Apple looks at reading, browsing, drawing, writing, visualizing, imaging, printing and comes up with Macs and now iOSmacs. Google and Samsung look at Apple and come up with Android and Galaxy. Huge dif.

klahanas

See, here’s where I get skeptical regarding innovation management. Yes, at some point it must be managed. At the end. Innovation and management are almost antithetical. One breaks rules (if done right) the other tries to impose them. Here’s where Jobs was justifiably successful and praiseworthy. Innovations controlled operations, not the other way around. Some of the most ridiculous things he did (painting machines white) reinforced who was the boss.

stefnagel

Apple’s long and successful history begs to differ. Nonaka believed Apple had learned to manage innovation by 1991, and the same principles hold today. Apple, as Jobs said it, worked to integrate technology with liberal education (writing, learning, communicating, etc.). Apple’s competitors work simply to integrate technology with Marketing 101.

klahanas

And how is that inconsistent with what I said about Jobs?

stefnagel

Right. I think Cook is proving that Apple’s ability to manage innovation extends beyond Jobs. When Leander Kahney writes the Cook book, we will find his, Cook’s, presence and guidance began well before Jobs’s retirement.

FalKirk

Thanks for that. I’ve clipped it for possible future use. 🙂

stefnagel

Hope you do. Nonaka’s work is worth unpacking.

James King

“Universal Operating Systems” “Keyboard on a tablet” “Touchscreen on a notebook or desktop”

John seems to be guilty of doing the very thing he is against: mixing metaphors.

Input devices serve not one but two roles, navigation and actual input of information. The problem is when those two roles are not properly delineated. An operating system can easily be designed to utilize each input method in its proper context, therefore creating no conflict when it is utilized over many different form factors.
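The delineation described above can be sketched as a simple dispatch table: each input kind is routed only to the role it is best suited for, so the modes never mix. This is an illustrative sketch of the idea, not any real operating system API; all names here are hypothetical:

```python
from enum import Enum, auto

class Input(Enum):
    TOUCH = auto()
    MOUSE = auto()
    KEYBOARD = auto()

# Hypothetical routing of each input method to its "proper context":
# touch and mouse handle navigation, the keyboard handles text entry.
ROLES = {
    Input.TOUCH:    "navigate: swipe and tap large targets",
    Input.MOUSE:    "navigate: pixel-precise pointing",
    Input.KEYBOARD: "input: enter text, drive universal search",
}

def role_for(kind: Input) -> str:
    """Return the interaction role an OS might assign to an input device."""
    return ROLES[kind]

print(role_for(Input.KEYBOARD))  # input: enter text, drive universal search
```

The design point is that no device is forced into a role it is bad at, which is the mode-mixing the comment objects to.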

Let’s look at the iPad:

1. Its touch targets are definitely large enough for effective mouse use. Indeed, iOS is even BETTER designed for pixel precise navigation because it has much larger targets. So any OS designed for touch is just as usable for pixel precise navigation. The advantage of touch, however, is speed. It’s much faster for navigation than a mouse or trackpad. But the relationship is not inverse by nature. Schemes designed for pixel precise navigation are not suited for touch, but that does not make the opposite true. In fact, very much the contrary. The much larger targets of a properly designed touch interface are actually BETTER suited for pixel precise navigation;

2. I don’t think anyone who uses a keyboard with the iPad would call it a degradation of the experience. To the contrary, in fact. What makes keyboard use on an iPad an exceptional experience is that the keyboard is used exclusively for the task for which it is best suited: text input. The modal nature of using a keyboard for navigation as well as text input is completely avoided on the iPad. Even still, the universal search on the iPad is particularly effective for keyboard navigation if one needs it. One of the most lauded features of the BlackBerry was its ability to utilize universal search simply by typing on the keyboard. With universal search, a keyboard becomes an effective navigation tool, particularly on the iPad;

3. If you use Dropbox, Box, SugarSync, etc. on an iPad, you have access to the very “desktop” metaphor that is utilized on the PC. Considering the popularity of those services, it’s tough to make the case that tablets eschew the desktop metaphor. To the contrary, I doubt the iPad would be as usable as it is without those surrogate file systems.

I understand John’s criticisms, but they are the result of exceptionally poor execution on the part of Microsoft and others, not inherent issues. The key isn’t necessarily finding the right metaphor, but not mixing them. Operating systems do a poor job of reducing modes and treating input devices as discrete mechanisms. Touch when touch is best, mouse when mouse is best, keyboard when keyboard is best, etc.

So I completely disagree that there cannot be a “universal” operating system and the facts don’t support such a premise, especially if the assumption is that a tablet can be an effective replacement for a PC.

klahanas

Brilliant. You know, I was very skeptical when Bill Joy said “The Network is the Computer”. It’s taken me a long time, but I’m starting to get the overall scope of that statement. On the other hand, I am still a huge fan of local storage over the cloud. For me, the cloud is my syncing drive.

James King

I agree re: local storage vs. the cloud. I’d like to see some better innovation in the NAS space. Prices could be better too.

steve_wildstrom

Western Digital’s MyCloud does a nice job of simplifying the NAS experience.

FalKirk

“Schemes designed for pixel precise navigation are not suited for touch, but that does not make the opposite true. In fact, very much the contrary. The much larger targets of a properly designed touch interface are actually BETTER suited for pixel precise navigation.”

Disagree. Swiping is unnatural for a mouse (though better with a trackpad). And touch buttons are a huge waste of real estate for a pixel specific device like a mouse or stylus. No one wants to use a mouse on a tablet or phone.

“I don’t think anyone who uses a keyboard with the iPad would call it a degradation of the experience.”

You just both missed and made my point. A keyboard increases the FEATURES available to a tablet at the cost of degrading the METAPHOR. A tablet with a keyboard is a lesser notebook. A tablet without a keyboard is capable of working without a surface to rest upon. All of Microsoft entirely missed that key truth, so I suppose you can be forgiven for missing it too.

“If you use Dropbox, Box, SugarSync, etc. on an iPad, you have access to the very “desktop” metaphor that is utilized on the PC. Considering the popularity of those services, it’s tough to make the case that tablets eschew the desktop metaphor. To the contrary, I doubt the iPad would be as usable as it is without those surrogate file systems.”

Wow, you just insist on not getting it. All of those services enhance and extend the usefulness of the iPad but they are all edge cases. Perhaps 1% of users take advantage of them. The other 99% are just fine without them.

Don’t let the tail wag the dog. What makes the tablet successful is how easy it is to learn to use. The edge cases are the exception to the rule, not the rule.

“So I completely disagree that there cannot be a “universal” operating system or effective PC “hybrids” and the facts don’t support such a premise, especially if the assumption is that a tablet can be an effective replacement for a PC”

The tablet is not replacing the PC. It’s unbundling the services provided by the PC. A knife can be used to eat with, but a fork does some of the work better. And a spoon is far superior when it comes to eating liquids. The PC, like the knife, was once upon a time the only tool we had so we used it for all things. But over time, we developed the spoon and the fork and the smartphone and the tablet. The tablet is not replacing the PC anymore than the fork is replacing the knife. It’s just doing a job that the knife used to do and doing it better.

James King

“Disagree. Swiping is unnatural for a mouse (though better with a trackpad). And touch buttons are a huge waste of real estate for a pixel specific device like a mouse or stylus. No one wants to use a mouse on a tablet or phone.” – Falkirk

Touch can easily co-exist with pixel precise navigation methods. That was my point and a refutation of what you implied otherwise. And if you understood anything about UI design, you would know why larger targets are better even for pixel precise navigation. The two are not mutually exclusive, one actually complements the other.

As for no one wanting to use a mouse on a smartphone or tablet, reductio ad absurdum.

“You just both missed and made my point. A keyboard increases the FEATURES available to a tablet at the cost of degrading the METAPHOR. A tablet with a keyboard is a lesser notebook. A tablet without a keyboard is capable of working without a surface to rest upon. All of Microsoft entirely missed that key truth so I suppose you can be forgiven for missing it too.” – Falkirk

I didn’t miss the point; it is a non sequitur. As a former lawyer, I’d expect a more cogent argument, but I guess you can be forgiven considering your background and lack of knowledge re: technology.

What makes a tablet with a keyboard “a lesser notebook”? The smaller screen size, not the keyboard. I guess you need to be an expert to understand the difference.

“Wow, you just insist on not getting it. All of those services enhance and extend the usefulness of the iPad but they are all edge cases. Perhaps 1% of users take advantage of them. The other 99% are just fine without them.

Don’t let the tail wag the dog. What makes the tablet is successful is how easy it is to learn to use. The edge cases are the exception to the rule, not the rule.” – Falkirk

My point was to show that the metaphors can be effectively applied without degradation of the experience, which is what I clearly showed. Your premise is incorrect. Simple as that.

As for my examples being “edge cases,” I’m sure your stats are completely made up but I’ll bite. When it comes to iPad usage, I’d venture that a much greater percentage than 1% use Dropbox or another similar service. We are transitioning to “the cloud” after all.

As far as tablets being easier to use, thanks for pointing that out. I would have never known that if you hadn’t made it clear. :/

“The tablet is not replacing the PC. It’s unbundling the services provided by the PC. A knife can be used to eat with, but a fork does some of the work better. And a spoon is far superior when it comes to eating liquids. The PC, like the knife, was once upon a time the only tool we had so we used it for all things. But over time, we developed the spoon and the fork and the smartphone and the tablet. The tablet is not replacing the PC anymore than the fork is replacing the knife. It’s just doing a job that the knife used to do and doing it better.” – Falkirk

And now we are back to the non sequitur: A tablet is a tablet and a PC is a PC and never the twain shall meet. The problem (for you) is that they have indeed met and I used your beloved iPad to make my point. There is NO degradation of the tablet experience when aspects of the traditional PC model are implemented into it when it is done correctly. You can still use an iPad effectively as a tablet, even when you use it with a keyboard and file system.

Your premise that a universal OS or hybrid PC is unfeasible simply isn’t supported by fact.

Slowmind

Looks like you are aligned with Microsoft’s thinking. The keyboard is just an add-on, the icing on the cake, not the cake itself. It is touch that makes the iPad so universal, so powerful! John’s touch metaphor is so spot on!

James King

Yea iPad! :/

steve_wildstrom

Microsoft tried really, really hard to make Windows 8 a universal operating system. The fact that they have failed–and by that I mean that the market has rejected it, not that I do not like it–makes a pretty good case for the impossibility of the thing. Microsoft has really good UI engineers and designers, and still Windows 8 feels like an awful compromise on both touch and WIMP systems.

By the way, iOS 7 has exactly the kind of universal search you are talking about. Swipe down from the middle of any home screen and you get a search box.

James King

“Microsoft tried really, really hard to make Windows 8 a universal operating system. The fact that they have failed–and by that I mean that the market has rejected it, not that I do not like it–makes a pretty good case for the impossibility of the thing. Microsoft has really good UI engineers and designers and still Windows 8 feels like an awful compromise on both touch and WIMP systems.” – Steve Wildstrom

This is a logical fallacy. The same reasoning could have been applied to the iPhone or iPad, that because the incumbents couldn’t do it, it couldn’t be done. That’s self-justifying.

My point was to show that Android and, particularly, iOS are already very close to being “universal” OSs. They can both be used effectively with keyboards and (theoretically) mice and file systems. There is nothing that is done using a PC that can’t inherently be done on a tablet just as effectively (provided the screen is large enough) using the same basic metaphors. If you want to be completely accurate, the iOS and Android primary UIs are just large grids of icons, not that much different than the paradigm of the last 30+ years, just sized properly for touch. Sinofsky’s thinking re: Windows 8 was based on the premise that obviousness is impossible under any circumstance and that all behaviour re: software is learned. Windows 8 reflects that backwards thinking.

BTW, I was already aware of the universal search feature in iOS 7, that’s why I used universal search as an example of an effective UI element that benefits from a keyboard. Not my intent to be snarky though, thanks for the info.

steve_wildstrom

I like iOS precisely because I don’t think it is anywhere close to successful as a universal OS. I use a keyboard when I am 1) writing something of length and 2) have a flat surface to hold both keyboard and iPad. But it is a seriously compromised experience because you cannot control the OS completely from the keyboard; it has to be a mixture of keyboard and touch, and it is awkward. Support for a pointing device (which Android has) would ameliorate this, I guess, but the fact is the touch iOS interface is, at best, extremely inefficient for pixel-precise control, and you still need an alternative for swiping.

The file system is another issue. I hope that at some point Apple comes up with a solution that maintains the security protections of iOS while offering greater interprocess communication. It is not an easy problem to solve, and, for the time being, we are stuck with a bunch of kludgy solutions. Communication, more than the file system per se, is the problem, and it is another major impediment to using iOS as a universal OS.

There is far more to the UI than the screens of icons. The Cocoa Touch APIs (and whatever their equivalent in Android is called) provide a rich set of tools that developers use to create the controls for their apps. All of these are optimized for touch, not pixel-precise controls.

James King

“But it is a seriously compromised experience because you cannot control the OS completely from the keyboard; it has to be a mixture of keyboard and touch, and it is awkward.” – Steve Wildstrom

I’m not sure this would be so much of an issue for someone who doesn’t use keyboard shortcuts. But it’s really not a point that can be argued.

“Support for a pointing device (which Android has) would ameliorate this, I guess but the fact is the touch iOS interface is, at best, extremely inefficient for pixel-precise control–and you still need an alternative for swiping.” – Steve Wildstrom

As long as the display is in close proximity, I don’t think there needs to be an alternative to swiping per se though the current way this issue is addressed in Windows 8 is definitely inelegant.

Touch targets are far larger than those for pixel precise navigation, which should yield noticeable improvements in speed and efficiency. If anything, using a mouse or trackpad on a touch interface for basic navigation should actually be more efficient compared to using a mouse on a UI designed for pixel precision (per GOMS: http://www.cs.umd.edu/class/fall2002/cmsc838s/tichi/printer/goms.html). This obviously does not include behavior like swiping, which is touch specific.
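The speed claim above can be checked with Fitts’s law, the standard pointing-time model behind GOMS-style analyses: movement time grows with the log of travel distance over target width, so a larger target is faster to hit. The constants `a` and `b` below are illustrative placeholders, not measured values:

```python
import math

def movement_time(distance, width, a=0.1, b=0.15):
    """Fitts's law, Shannon formulation: MT = a + b * log2(D / W + 1).
    a and b are device-dependent constants; these values are illustrative only."""
    return a + b * math.log2(distance / width + 1)

# Same 300 px of pointer travel, two target sizes:
tiny_widget  = movement_time(300, 12)   # pixel-precise desktop control
touch_target = movement_time(300, 60)   # large touch-first control

# The larger target has a lower index of difficulty, so it is faster to hit.
print(touch_target < tiny_widget)  # True
```

This is exactly the asymmetry the comment describes: shrinking targets always raises the index of difficulty, while enlarging them never hurts a pixel-precise pointer.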

“Communication, more than the file system per se, is the problem and it is another major impediment to using iOS as a universal OS.” – Steve Wildstrom

This is one area that Android is clearly superior to iOS and even Windows. For the most part, at least on Android, it is a non-issue.

“There is far more to the UI than the screens of icons. The Cocoa Touch APIs (and whatever their equivalent in Android is called) provide a rich set of tools that developers use to create the controls for their apps.All of these are optimized for touch, not pixel-precise controls.” – Steve Wildstrom

When you look at what technologies are missing in a tablet as compared to a PC, we are discussing comparatively little, especially if we are confining it to technology strictly related to the client side. For all intents and purposes, tablet and smartphone UIs are “10 ft” interfaces while Windows was originally designed when screen real estate was at a premium. The major difficulty is in reimagining Windows for a world in which screen real estate is abundant.

The reality is that there is no inherent technical limitation that prevents the creation of an OS that can be used effectively on both tablets and PCs. It wouldn’t even require a major overhaul; the current metaphors can be modified to accomplish the same end:

While this is only a concept, this gives an example of how current Windows paradigms can be unified in a way that would translate well to both tablets and PCs. It even works pretty well with a mouse.

I don’t agree with John that touch is strictly for tablets, keyboards are strictly for PCs and an OS can’t be designed that blends the two effectively. We aren’t dealing with a technical impossibility.

That being stated, I’d be surprised if such an OS were actually ever created, especially by Microsoft. Julie Larson-Green is credited with creating the Windows 8 UI but, based on her history, she has no bona fides as a UI designer. I think UI designers at Microsoft are second-class citizens and engineers are the stars of the show. I think Microsoft’s culture is what is preventing a major improvement in the Windows UI, not any technical limitations.

Space Gorilla

I use my iPad with a hardware keyboard all the time. It’s not awkward at all, it’s great, quick, comfortable, I prefer editing/writing this way now.

FalKirk

Metaphors, like women’s breasts, make men stupid when they come in pairs.

jfutral

o_O

Joe

James King

😉

Neil Anderson

Our local bar held a wet T-shirt contest, and one woman won first and third.

If you cannot answer a man’s argument, all is not lost; you can still call him vile names. ~ Elbert Hubbard

JoeS54

That wasn’t name calling…it was a summary of your article. And frankly, every article of yours I’ve ever read. Look at it as an attempt to inspire you to branch out from: “Apple is the greatest company ever. Everything they do is right and good. One day soon Apple will rule the entire world!”. Because that’s what every one of your articles ends up being. Your writing is not bad, the problem is that you have nothing to say except the aforementioned Apple rooting. I challenge you to write an article critical of Apple. Surely even in your view there must be something to criticize about them.

Kizedek

Oh, Apple isn’t the greatest? 😉

Seriously, I think John did a good job of explaining why the iPad strikes a resonance with people. And it does sell some 20M per quarter, so Apple must have gotten something right with it.

On the other hand, since you have a couple of cool Surface anecdotes, and since I didn’t realize it was merely poor marketing and poor timing that utterly sank the Surface and initiated a write-down, I think I had better totally re-evaluate MS as a candidate for greatest company ever.

Maybe. But if I learned anything from Usenet, it was that one person may comment but hundreds or thousands may read that comment. Most counter-commentary should be made with an eye to not leaving ridiculous assertions unchallenged. Even if you have no hope of changing a fool’s mind (or, in the case of a troll, where the point is decidedly not to inform or make a point), dumb ideas and erroneous claims should be seen to be challenged.

jfutral

I’ll call your “maybe”. Maybe, but one thing I learned from Usenet is that most of those comments, in and of themselves, never really changed anyone’s mind, and ridiculous assertions really do speak for themselves. If someone doesn’t recognize an assertion as ridiculous, no amount of countering (no matter how well reasoned) will alter that view. Trying to counter the irrational with the rational is having two completely different conversations.

Alternatively, I think John’s article (as the instigator) already challenges the dumb and erroneous response. Anything more is just beating a dead horse.

IMHO, Joe

Nangka

How can that be, when the first instance of “Apple” I searched for is by rationalchrist in the comments?

You’d be more right if you said John is anti-MS.

Kizedek

I think it is the buyer of the Surface that generally ends up crying.

JoeS54

I got a Surface Pro 2 in November, and it’s the best technology purchase I’ve made in years. Everyone I speak to or read who has one feels the same way. There even appear to be quite a few people who really like the regular (RT) Surface 2. Microsoft’s biggest problems at this point are marketing and being late to the game. The designs need ongoing improvement, but they’ve got a direction that works.

Nangka

Good for you! But not for MS, with the five of you using Surfaces.

And it doesn’t look like you, along with MS, understand the crux of what John is getting at in this article.

rationalchrist

Technically speaking, the failure of Microsoft was laid in 2000 with Windows NT, a monolithic spaghetti descendant of DEC’s VMS. Since then, Microsoft has been unable to adapt the underlying OS to platforms other than x86 or to reuse its code bases. Instead, Microsoft has opted for a superficial unification of its OS branding: Win98 vs. WinNT, WinCE vs. WinXP, Windows Phone vs. Windows whatever, Windows RT vs. Windows 7. They even called it Surface; true, in a hilarious way. It is all on the surface. They are all called Windows, but they are all different underneath. The UIs are slow and laggy, features are lacking, and there are no apps, because each system has to start from scratch. Nothing learned on one system can be applied to another, and no code is reused. Microsoft could afford three or four iterations to catch up in the PC era, but not in the mobile internet era. Man, it is too fast. Apple and Android, instead, went the route of Unix: modular, componentized, agile systems. They unified the underlying OS across platforms, yet can easily offer different presentation layers. Whatever the next big thing is, they morph to it. Look at how quickly Android morphed from a BlackBerry copycat into an iOS copycat, with HTC and Samsung putting their own surfaces on top of it. Nimble. The revenge of Unix. Every time I look at Bill Gates touting WinNT, I get a kick out of it.

klahanas

It may have Unix roots, but the very notion of having the manufacturer as a non-owner sysadmin is Unix anathema. I do agree with the other parts of your comment, though.

Thorntondw

I don’t think there is anyone at Microsoft, from Bill Gates on down, who would understand a metaphor if they were hit over the head with one. They are all geeks and don’t understand metaphors.

FalKirk

Are you saying that Bill Gates and Microsoft never met-a-phor they understood? 😉

Thorntondw

Actually, they think a metaphor is a bit of metadata.

jfutral

You guys are so meta.

Joe

Thorntondw

I do believe that by merging the functionality of Android into Chrome OS, Google would have a real Windows killer. Unfortunately, Google has shown no ability to “consumerize” its innovations. They always leave that to others.

FalKirk

“I do believe that by merging the functionality of Android into the Chrome OS, Google would have a real Windows killer.” – Thorntondw

I think the opposite. Android and Chrome are incompatible and Google knows it.