The High Cost Of Cool

New bells and whistles on your favorite technology are exciting, but sometimes they're just noise.

The pace of evolution in modern computing, which can trace its roots back to the 1940s, has slowed in most dimensions in recent years. Sure, we've got solid-state drives, multi-core processors, DDR3 DRAM, gigabit Ethernet, and more, but the fundamental nature of the computer hasn't changed much at all in decades--and that's a good thing.

In the 1970s we debated all manner of computer, network, and multiprocessor architectures; everything was possible then. But we've now converged to a core set of IT intrinsics, and the computer itself is no longer the hallmark of innovation it once was. And that's good because IT buying is now mostly risk-free.

Except in one key dimension: user interface. We've gone from punch cards to teletypes to smart terminals to WIMP (windows/icons/mouse/pull-down menus) to--with tablets, handsets, and some notebooks--gestures and voice. In some cases the transition from one UI model to the next has been easy, but rapid evolution, especially in the last few years, is becoming increasingly difficult for many to swallow--and, quite frankly, most of this change is totally unnecessary.

Why? Because much (if not most) of the ongoing training and support cost that IT departments must bear goes toward one simple goal: helping users be effective and productive with a given device. Making routine tasks easy and reliable. Enabling secure, transparent, and accurate data management.

All too often, however, these important goals are secondary to those of a cool new user interface. And the goal of that cool new interface? Product differentiation--providing an incentive to buy a cool new product and, all too often, cool for cool's sake. Think iPad and you'll see what I mean. Sure, the iPad improves in the hardware domain with each new edition--as we saw with Apple's recent announcement of a product so cool it needs only one name (more confusion afoot there, I think)--but it's still a big iPod Touch, and often inconvenient for even simple enterprise data-manipulation tasks. I never thought iTunes would be a business necessity, and I still don't think it should be.

I can't tell you how many IT shops I've visited that are still using that clunky, slow, and soon-to-be-unsupported Windows XP. Why? Well, apart from all that clunkiness and slowness, it works. But most importantly, XP retains market share because of latent pushback from the Windows Vista fiasco and fundamental user familiarity--and thus productivity. Why change the user interface of a given operating system or device just to do the same tasks as before, only differently?

The only real benefit here accrues to the suppliers of that coolness, who are, after all, in it for the money and need to continue to sell new stuff, needed or not, to keep the cash tumbling in. IT organizations get stuck with new training and support costs that they really can't afford, and overall productivity is impacted as users learn new ways to either do what they did before, or screw up in the process. It's bad enough that underlying implementation details and features of new operating systems change, forcing IT to re-evaluate, with each new release, such subtleties as reliability, integrity, application compatibility, and security, but forcing users to change for change's sake is simply going too far.

Please note that I'm not arguing against progress. I know there are still users of WordStar on CP/M out there, and that's by no means what I'm advocating. If there are real, demonstrable benefits to new user interfaces, and their cost can be successfully amortized, then, well, great--let's have them. But my personal appeal is for a little less "progress" here.

I switched to a Mac around the time of the aforementioned Vista fiasco, and, while Apple has issued numerous updates to OS X over the five or so intervening years, the essential integrity of the user experience remains intact. Do I like the Mac UI? No, not particularly--but it gets the job done, and I'm satisfied that security, integrity, and other requirements are being properly addressed in our IT operations. And that, and not coolness, must be the bottom line for any enterprise.

Craig Mathias is a Principal with Farpoint Group, a wireless and mobile advisory firm based in Ashland, MA. Craig is an internationally recognized expert on wireless communications and mobile computing technologies. He is a well-known industry analyst and frequent speaker at industry conferences and trade shows.


Craig - changing the UI has to do with the evolving theory of how people work and an attempt to have them work more efficiently. People buy things based on how they look - if the design is appealing, they'll buy it, even if it's the most cumbersome device/program/product in the world to use (I have to wonder if this is the psychology behind high heel shoes, but I digress). I would also have to say that Vista can't necessarily be considered a failure based simply on the redesign of the UI - there were a lot of other issues at play, and the lack of end-user familiarity with the UI exacerbated the "I hate Vista" sentiments. I just hope that Microsoft doesn't have a repeat of the Vista event with Windows 8's Metro interface.

Sam - IT shops are using Windows because it's prevalent and that's what the vast majority of the end-user application base is written for. While most experienced IT professionals would know that you can get the same functionality out of OpenOffice on a Linux system as you can with Microsoft Office on a Windows system - try explaining that to an end user with a limited knowledge and/or fear of the system that they're forced to work with on a daily basis. Sure, it can and does function in the same or a very similar way, but change is seen as a bad thing. With regard to your comments about Linux costing less, I would have to point you to the various RoI white papers/calculations out there. Switching an entire company (much less an enterprise-scale organization) from Windows to Linux introduces a number of costly issues - who's going to train the end users? Who's going to support them? Who's going to train the support team? Whose budget gets impacted when end users suddenly see their productivity drop off dramatically due to their inability to use the new system? There are a LOT more factors involved in enterprise OS selection than simply licensing fees.

Aren't IT shops only using Windows because it used to be the shiny, cool product in the 90s when they standardized on a PC OS? I agree that UI is not important for user productivity, but then why not use Red Hat EL or some other no cost Linux distro as a desktop OS instead of Windows? It provides a perfectly navigable UI and is much less costly than Windows. Most people do most of their enterprise application work in the browser anyway.