The X Window System Must Die!



I happen to believe that efficiency and clarity in software are good
things in their own right -- many other benefits accrue simply as
side-effects. And usually, "efficient" is synonymous with "small".

Most of the great software-engineering gurus
throughout the years (people like Fred Brooks, Rob Pike, Ken Thompson, and
Dennis Ritchie) would agree with the above statement. Yet the history of
software is one of insidious bloat. Systems that started out lean and fast
(Unix's original kernel comprised about 10,000 lines of code!) grow
into lumbering monstrosities with codebases so frightfully complex that
maintenance is almost impossible. It's probably true that no
useful program ever got *smaller* over time -- programmers are notorious
for sticking in new features rather than fixing old bugs.

I had occasion to think of this issue recently when
a reader of one of my previous essays wrote to me and rhapsodized about
how efficient Linux is. I heard much the same thing from proponents of
FreeBSD.

Well, I hate to burst your bubble, folks: Linux
(and even FreeBSD, to a lesser extent) is just as overweight as most
other operating systems these days. The compiled kernel alone can
take up anywhere from 700K to 1.2MB, depending on the configuration and
whether drivers are compiled in. The GNU C library is likewise huge,
nearly twice the size of the comparable BSD C library.

But the most egregious offender in this area, the
reigning champ of bloat, is the X Window System.

X Must Die!

I have always been exasperated that Unix systems
chose to standardize on the X Window System. X was a decent solution in the days
when computers used low-color unaccelerated frame-buffers for graphics,
and needed to communicate over low-bandwidth links; but X's time is long
past. It survives mainly because no one is willing to throw it away --
there's too much software now that depends on it.

X11, even with the yeoman work done by the
XFree86 group, is a huge rat's nest of code that can't compete with most
other graphics systems performance-wise. It can use vector fonts only with
great difficulty, and anti-aliased fonts (which have been staples of both
the Mac and Windows for years) are still a dream on X.

But X cannot live on its own: it needs a Window
Manager and a widget toolkit to do anything. More millions of lines of
code, layered on top of an already overweight system. GNOME, KDE, even
Motif exact a huge performance hit and add further complexity. Worse yet,
no one in the Unix community seems to be able to agree on any kind of GUI
standard, so you have an explosion of window managers, desktops, file
managers, and widget toolkits, most of which cannot interoperate with one another.

I had high hopes for GNUstep, an Open Source port
of NeXT's OpenStep environment; it built on the excellent and coherent
NeXT interface. However, it seems that this project (like GNU's HURD
kernel) is stuck in eternal limbo. GNOME and KDE are now fighting for
primacy and neither GUI really addresses X's weaknesses.

It is this problem more than any other that makes
using Unix and unix-clones a trial for many people, and it is why I
hesitate to write software for a platform I consider both elderly and
unreliable. I contend that it is impossible to write a truly robust X
application because X itself is such a gruesome hack job.

Kill X. Now. No more Band-Aids, redesigns, or
updates. Port GTK and Qt (the toolkits used by GNOME and KDE,
respectively) to a new graphics architecture. It's long past time.

Is It Impossible?

Here's a challenge for the Linux community. Consider
it a gauntlet thrown down.

Pack the following elements onto a *single* 1.44MB
floppy disk: a current kernel (with TCP/IP stack), PPP *or* network card
driver(s), a basic X Window setup (*with* a window manager), a graphical
text editor, a lightweight web browser, and a basic shell (ash would be
fine). Feel free to use whatever means of magic necessary to pack
everything onto a single floppy disk.
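As a rough way to keep score, here is a sketch of the arithmetic involved. A 1.44MB floppy holds 1,474,560 bytes (1,440 KiB), and every component must fit inside that budget. The component sizes below are purely hypothetical placeholders -- substitute the byte counts from your own build (e.g. `wc -c < bzImage`):

```shell
#!/bin/sh
# Byte capacity of a 1.44MB floppy: 1440 KiB * 1024 bytes.
FLOPPY_BYTES=1474560

# Hypothetical component sizes in bytes -- replace with real
# measurements from your own stripped-down build.
KERNEL=716800      # ~700K compressed kernel with TCP/IP and PPP
XSERVER=409600     # minimal X server plus a window manager
EDITOR=102400      # graphical text editor
BROWSER=153600     # lightweight web browser
SHELL_BIN=61440    # ash

# Total the candidate file set and compare against the budget.
total=$((KERNEL + XSERVER + EDITOR + BROWSER + SHELL_BIN))
echo "Total: $total of $FLOPPY_BYTES bytes"
if [ "$total" -le "$FLOPPY_BYTES" ]; then
    echo "Fits, with $((FLOPPY_BYTES - total)) bytes to spare."
else
    echo "Over budget by $((total - FLOPPY_BYTES)) bytes."
fi
```

With these made-up numbers the set squeaks in with about 30K to spare -- which is exactly the point: the margins are brutally thin, and a multi-megabyte X server blows the budget by itself.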

What do you want to bet that no one takes me up on
this?

"That's impossible!" some might say. Not so. QNX
has done it; you can find their demo floppy at
http://www.qnx.com/demodisk/. The first time I tried it, I was
flabbergasted; they somehow managed to shoehorn not only the core OS and
windowing system onto a floppy, but also a small web browser! (Dan
Hildebrand, the genius behind this effort, tragically died of cancer in
1998. He was a truly godlike hacker and will be sorely missed.)

I'd never realized how *huge* Linux was until I saw
the QNX demo. And I don't mean "huge" in terms of the number of packages
installed; I mean *huge* in the sense that Linux's kernel takes up almost
as much space as the *entire QNX operating environment*! Linus may not
like microkernel architectures, but the QNX demo pretty decisively shows
the benefits of a small and modular approach versus the monolithic one.
The Photon MicroGUI is not only vastly smaller than X, it also does vector
fonts and anti-aliasing, which is beyond X's capacity at present.
Further, it has native Unicode support, which X lacks.

The R&D Edge

The Open Source software library (Linux, the
*BSDs, and so on) consists mainly of *reactive* technologies -- they
adopt what has already been done in the industry, and duplicate it in free
software. This free software is occasionally enhanced, but usually the
Open Source product is a workalike (with bug-fixes) rather than an
innovation.

A weakness of the Open Source model is that it is
very difficult to do Research and Development. It's not surprising that
Linux has had such a hard time getting a modern desktop GUI; until fairly
recently, there was no coordinated effort to produce one. And these GUIs
(GNOME and KDE) are mainly copies of Windows; they introduce very little
that is new.

Apple, on the other hand, has truly extended the
GUI with their upcoming Mac OS X. The graphics system contains a plethora
of new stuff: a PDF-based 2-D imaging model, accelerated OpenGL, and the
"Aqua" GUI layer. Love Aqua or hate it (Mac folks seem divided on the
issue), you can't deny that it's interesting: its use of icons, colors,
animation, and alpha-transparency are pretty novel (although not truly
new). It took the Linux world about ten minutes to whip up a "skin" for
the popular window managers to make them look like Aqua. Much the same
thing happened with the NeXTStep look-and-feel.

It's interesting to note that no one in the Open
Source community *invented* any GUI of note; they simply built on someone
else's R&D. FVWM was a pretty straightforward rip of the Motif Window
Manager. AfterStep and WindowMaker are copies of NeXTStep. IceWM is a copy
of Windows 95/98. GNOME and KDE are also Windows look-alikes. There are
copies of Open Look (invented by Sun), the Mac, the Amiga, and so on, but
nothing really original. Enlightenment comes the closest to an "original"
GUI, and it's an utter train-wreck from a usability standpoint.

The phrase "user testing" is unknown to Open Source
coders, who will pick gee-whiz graphics over usability any day. Apple has
spent millions over the years in focus groups, user meetings, and
biometrics studies to figure out exactly how GUIs should look and behave.
They don't always succeed (just look at QuickTime Player 4 for an example
of a lousy interface), but in many ways the Mac is still the gold standard for GUIs.

This tendency of Open Source programmers to copy or
re-engineer rather than invent is a good thing overall (it saves a lot of
wasted effort and traveling down blind alleys), but it does have a
downside: the codebases of such programs, unless carefully controlled,
rapidly degenerate into huge piles of spaghetti code.

Conclusion

Unix in all its incarnations is probably a lost
cause in this sense. It's extremely unlikely that X will be replaced
anytime soon. Oh, sure, X will (hopefully) gain features like
anti-aliasing and native handling of vector fonts -- but the codebase will
become ever more complicated and harder to maintain.

I frankly don't see much light at the end of the
tunnel. X is too deeply embedded in the Unix world to be easily dislodged,
and the lack of a GUI standard on the platform doesn't appear likely to be
resolved. Maybe the embedded space can offer some salvation -- programmers
will *have* to forgo X to run on smaller devices, and perhaps this will be
the wedge that gets X out of all our lives.

Let's hope so.

Postscript

For an interesting take on how and why most X
window managers suck, check out the First Principles essay at
Asktog.com. It was written by Bruce "Tog" Tognazzini, a UI guru of
considerable repute. It's an eye-opening essay, and should be read by
more UI designers.

Monty Manley is 33 years old and has been
programming for more than half his
life, on everything from a Vic 20 to an IBM 390
mainframe. He forms part of the
hated power elite of white males who hold down
programming jobs. He currently
lives in Minnesota with his wife, two cats, two
rats, and a mole who decided to
make a home in his front yard.