The End Has Been Nigh for Some Time

Justin Rattner over at ZDNet recently posted a blog entry about the end of applications. Mr. Rattner is right, yet incredibly wrong. He is absolutely right in that "all of the interesting applications have been written." In fact, I am on the record as stating that it is virtually impossible to create a single new application that could not be done (however inefficiently) with other means and tools.

That being said, Mr. Rattner's analysis of where the future lies is misguided at best. He believes that the future lies in reducing operating costs and increasing manageability. I am all for reducing operating costs and increasing manageability, but usability is the most important missing piece. Computers are completely unusable. Even cell phones are unusable. Most cell phone consumers pay big bucks for phones where the only feature they use is the same one they were using 10 years ago: making phone calls. Why? Because the interfaces and operating systems on cell phones are atrocious.

Mr. Rattner's viewpoint is that of a hardware person. This is understandable; he is the CTO of Intel (amongst other technical titles). Hardware people have never understood end users terribly well. Not that application developers do such a great job of it either, but at least they make the product that the customer is actually using. Even worse, Intel specifically understands neither application development nor end users. They make general purpose CPUs designed to disappear into the user's life. Intel has never once done anything to help me as a developer. When Intel started pushing HyperThreading and dual core CPUs, did they put anything out to help teach the average programmer how to write programs that perform well on both single core and dual core CPUs? No. Did they put any effort into helping Open Source Software projects optimize their performance to take advantage of the new hardware? No. Has Intel ever put any effort at all into writing code that end users see? Nothing except for a few drivers and system tray applications for their NICs and video chipsets. So why would I think that anyone at Intel has any understanding of the developers, the development process, or what end users want?

If you are an IT manager, would you prefer to go through yet another round of hardware purchasing to consolidate a few servers and reduce the power bill... or would you prefer to initiate a usability improvement session on your existing software? Most IT managers will choose to buy the hardware. I do not care how much hardware you have or how powerful it is. If the users cannot use it, it is wasted money. It is like owning a Corvette but driving it on the Long Island Expressway to work: the power is wasted. On the other hand, increased usability, by making interface (and some underlying logic) changes, yields rich results with the hardware and applications we have today. For example, a website that performs a round of usability improvements can expect to see a dramatic increase in sales.

We have been playing Intel's game for the last 20 years, and it has not helped the end users. IT does not exist for the sake of enriching the wallets of hardware vendors, application developers, or consultants. Let me rephrase that. IT should not exist for those reasons. But it does. As a result, users hate their computers. Companies spend a fortune on IT, buying hardware to compensate for sloppy programmers and software that is "the version you have been waiting for your whole life" but still does not address the underlying flaws in all of the previous versions; it has a few new features buried in a menu somewhere, and it now lets you skin the interface.

I have been a computer user for the last 20 years. I have owned a personal computer for at least 15 of those years. Yet, at the end of the day, my usage habits have been frozen since about 1994. Here is what I use a computer for (in no particular order):

* Checking email
* Chat/instant messaging
* Consuming content, primarily text, but sometimes audio/visual
* Creating content (95% text, 5% visual)
* Programming
* Playing video games
* SSH'ing to *Nix servers

I was able to do all of this, in one form or another, on a 486. I was able to do most of it on a 286. Indeed, my 286 with that 2400 baud modem was able to feed me remote text (my primary target of consumption) about as fast as my current PC does over a broadband connection. Why? Because it was not dealing with a Web server or having to parse HTML, and then fire up a JavaScript interpreter just to make a menu pop up!
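The 2400 baud figure holds up to a quick sanity check. Assuming standard 8N1 serial framing (roughly 10 bits on the wire per byte, a fact of the framing, not an assumption about my setup), that modem delivered text at about reading speed for an 80-column terminal:

```python
# Sanity check on the 2400 baud claim above: with 8N1 framing
# (8 data bits + 1 start bit + 1 stop bit = ~10 bits per byte),
# the line pushes about 240 characters per second -- roughly three
# full 80-column terminal lines of text every second, which is
# plenty for a human reading plain text.

BAUD = 2400
BITS_PER_BYTE_ON_WIRE = 10   # 8 data bits plus start and stop bits
TERMINAL_COLUMNS = 80

chars_per_sec = BAUD // BITS_PER_BYTE_ON_WIRE
lines_per_sec = chars_per_sec / TERMINAL_COLUMNS

print(chars_per_sec, lines_per_sec)  # 240 chars/s, 3.0 lines/s
```

In other words, for pure text over a terminal session, the bottleneck was never the wire; it was the reader.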

And sad to say, the actual usability of the computer has gotten worse, not better.

I postulate that while achieving a minimum level of proficiency with a computer is now much easier, the difficulty and time needed to become a power user has increased substantially. For example, learning the basics of WordPerfect 5.1 was a bear. But once you had that little template laid over your F keys, you could do anything you wanted to a document with a few short keystrokes, without ever taking your hand off of the keyboard. Compare this to Word 2003. Word does not have many word processing features that WP5.1 did not have. Its extra features are on the periphery, in the form of things like Smart Tags, the "Research" system, and so on. Yet, doing the same tasks that took fractions of a second in WP5.1 now takes many seconds! Take my hands off the keyboard, reach for the mouse, highlight the text, go to the menu (because there are more items than keyboard combinations), scan through the endless cascading menus looking for the task I want to perform (instead of glancing at the template laid over the F keys), sometimes taking a minute, then walk through some dialog or wizard. All to create the exact same table that took under 10 keystrokes and five seconds in WP5.1.

If anyone considers this "progress," they are mistaken.

Mr. Rattner is correct in that the vast majority of "improvements" really fall under the eye candy category. RSS is eye candy. HTML is eye candy. AJAX and "Web 2.0" are irrelevant eye candy. Granted, my definition of "eye candy" is rather broad; much of what these technologies deliver is added convenience and even the occasional efficiency to the end user. But the value of the content or the application itself is not increased by much at all. None of these technologies truly improve the human/computer interaction. There is nothing that these technologies deliver that could not be done 20 years ago. But the hardware vendors love these technologies. Want to sell faster CPUs? Get programmers to ditch flat files for tabular data in favor of XML. Want to sell memory? Get the users onto AJAX'ed systems ("JavaScript is the COBOL of the Web" to quote Roger Ramjet from ZDNet TalkBacks) and off of natively compiled code. Pushing horribly cluttered interfaces is a great way to get people to buy gigantic monitors and the video cards that go along with those higher resolutions. Convincing people that it somehow makes more sense to watch a movie on a 19" monitor instead of their 35" TV is another smart move.

So while I think that Mr. Rattner's starting premise is right, I think that his analysis of the situation in general is not correct.