"Desktop computing technology has evolved considerably since the first graphical user interface was developed by researchers at Xerox's Palo Alto Research Center in 1973. Microsoft's Aero Glass, Apple's Quartz 2D Extreme, and Sun's Project Looking Glass are all poised to transform the way that users interact with computers. Also at the forefront of innovation, the Linux community has some prodigiously impressive new user interface technologies of its own. Beryl, a new open-source window manager for Linux, features compelling visual enhancements like support for transparent windows and elaborate window animations."

Yet I failed to make it run at even remotely usable speeds on a GeForce 6200 TurboCache card (128 MB, PCIe). Yes, I know about the GLX problems with TurboCache cards, but that knowledge doesn't make it work :/

"It runs fine in a live environment on my 5 year old AMD 1.3GHz Thunderbird with 256MB RAM with a GeForce II Ultra graphics card (also 5 years old)"

We all know how fast technology evolves in the IT world, but 'old' is still a relative term. Most people have no clue how fast an AMD (or Intel) 1.3GHz CPU really is, what an incredible amount of work it chews through each second. And some call that slow...
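Some back-of-the-envelope arithmetic makes the point. The numbers below are illustrative (1.3 GHz clock, CD-quality audio rate), and the one-instruction-per-cycle assumption is deliberately pessimistic for a superscalar CPU:

```python
# Rough budget: how many CPU cycles fit between two audio samples?
clock_hz = 1_300_000_000      # 1.3 GHz AMD Thunderbird
sample_rate_hz = 44_100       # CD-quality audio, samples per second

cycles_per_sample = clock_hz // sample_rate_hz
print(cycles_per_sample)      # roughly 29,000 cycles per sample

# Even at a pessimistic one instruction per cycle, that is tens of
# thousands of instructions available for every single audio sample.
```

Call that slow, indeed.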

Heck, I once programmed a real-time software oscilloscope on an MSX2 computer. That is: read audio samples, calculate the location in video memory, and talk to the video chip to set single pixels on the screen, several thousand times a second. That was on an 8-bit, 3.58 MHz CPU, using just kilobytes for code and data.
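The per-sample work is tiny, which is the whole point. Here's a hypothetical sketch of that inner mapping (in Python for readability; the original would have been Z80 assembly). The 256x212 screen size matches a common MSX2 bitmap mode, but the function names and sample range are my own illustration:

```python
# Illustrative sketch: map one audio sample to a screen pixel,
# the core calculation of a software oscilloscope.
SCREEN_WIDTH = 256    # typical MSX2 bitmap mode
SCREEN_HEIGHT = 212

def sample_to_pixel(x, sample, sample_min=0, sample_max=255):
    """Map an 8-bit audio sample at horizontal position x to (x, y).

    Larger samples plot higher on screen (row 0 is the top).
    """
    span = sample_max - sample_min
    y = (SCREEN_HEIGHT - 1) - ((sample - sample_min) * (SCREEN_HEIGHT - 1)) // span
    return x % SCREEN_WIDTH, y
```

On the real machine the (x, y) pair would then be turned into a VRAM address and written via the video chip's ports; a multiply, a subtract, and a wrap per sample is all the CPU math it takes.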

An AMD 1.3GHz CPU runs at roughly 360 times that clock rate, and each cycle does far more work, so in practice it's thousands of times faster. And 256MB of RAM... remember, that's really just for code and variables. For background storage there are hard disks.

If *any* user interface doesn't run silky smooth on such a setup, then that software is crap, not the machine.

These days, it seems even a dual-core multi-GHz CPU and gigabytes of RAM are nothing... Bah. The way I read it: the art of programming has somehow been lost along the way.