What hardware do you use?

Three monitors: two 24-inch Dell Ultrasharp 2407WFP monitors (rotated 90 degrees so they are "tallscreens" @ 1200 x 1920) on either side of a 27-inch [Samsung 275T+][syncmaster-275t-plus] @ 1920 x 1200.

Emacs 23 full-screen on the center (widescreen) monitor. I have relatively few Emacs customizations, but the big one is ido-mode, which I just discovered this year (thanks, Emacs subreddit).

Nightly build of Chromium full-screen on the right-hand (tallscreen) monitor.

That's it, really. There's other stuff that got installed by default, but I'm not attached to it, I rarely use it, and I don't remember what it's called. Most of my daily "applications" are really web pages.

Gmail (actually Google Apps For Your Domain) for my personal email. I have a script that runs once a day and backs up my mail to maildirs on my local hard drive.
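The script itself isn't shown here, but the maildir half of such a backup can be sketched with Python's standard `mailbox` module. This is a hypothetical sketch: the IMAP fetch, credentials, and once-a-day scheduling are all omitted, and the names (`deliver_to_maildir`, the fake messages) are illustrative, not from the actual script.

```python
import mailbox
import tempfile
from email.message import EmailMessage

def deliver_to_maildir(raw_messages, path):
    """Append raw messages to a maildir at `path`, creating it if needed."""
    md = mailbox.Maildir(path, create=True)
    count = 0
    for raw in raw_messages:
        md.add(raw)  # mailbox picks unique filenames and handles tmp/new/cur
        count += 1
    return count

# Two fake messages standing in for a real IMAP fetch.
msgs = []
for subject in ("hello", "world"):
    m = EmailMessage()
    m["From"] = "me@example.com"
    m["To"] = "backup@example.com"
    m["Subject"] = subject
    m.set_content("body")
    msgs.append(m.as_bytes())

backup_dir = tempfile.mkdtemp()
n = deliver_to_maildir(msgs, backup_dir + "/INBOX")
print(n)  # number of messages written
```

The appeal of maildir for backups is that each message is its own file, so an incremental daily run only has to write messages it hasn't seen before.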

Google Reader for keeping up with feeds. I formerly used a custom script to publish my own mini-"planet" site, but I've since moved to Google Reader because it has integrated recommendations for when I want to expand my subscription list.

Pandora for most of my music listening, and a variety of random Internet radio stations for the rest. I have a Logitech Squeezebox in my dining room hooked up to my Pandora account, and I use it to listen to music during dinner.

Google Docs for my (very light) spreadsheet needs. Mostly Christmas lists that my wife and I collaborate on once a year.

There are no CDs or DVDs anywhere in my house, except some originals in the attic. My kids watch movies on an AppleTV set-top box and play games on a soft-modded Wii. All the Wii games we've bought have been "ripped" to an external hard drive, and the Wii boots into a custom launcher. I listen to digital music on my computer, via the network-enabled Squeezebox in the dining room, or on an old iPod in the living room that I hooked up to a Klipsch iGroove.

I'm a three-time (soon to be four-time) published author. When aspiring authors learn this, they invariably ask what word processor I use. It doesn't fucking matter! I happen to write in Emacs. I also code in Emacs, which is a nice bonus. Other people write and code in vi. Other people write in Microsoft Word and code in TextMate or TextEdit or some fancy web-based collaborative editor like EtherPad or Google Wave. Whatever. Picking the right text editor will not make you a better writer.

Writing will make you a better writer. Writing, and editing, and publishing, and listening -- really listening -- to what people say about your writing. This is the golden age for aspiring writers. We have a worldwide communications and distribution network where you can publish anything you want and -- if you can manage to get anybody's attention -- get near-instant feedback. Writers just 20 years ago would have killed for that kind of feedback loop. Killed!

And you're asking me what word processor I use? Just fucking write, then publish, then write some more. One day your writing will get featured on a site like Reddit and you'll go from 5 readers to 5000 in a matter of hours, and they'll all tell you how much your writing sucks. And most of them will be right! Learn how to respond to constructive criticism and filter out the trolls, and you can write the next great American novel in edlin.

People then ask what format I use to write my books. That's a more interesting question. It's still completely orthogonal to becoming a better writer, but I'm a text wonk, so I'll tell you. I wrote the original Dive Into Python in DocBook XML. Then Dive Into Greasemonkey, then documentation for Universal Feed Parser, all in DocBook. I self-published "Dive Into Python" in HTML, PDF, Word, and plain text. For years, there they sat, a list of downloads in different formats. Then I looked at my logs and realized that very few people ever downloaded it at all, and those that did mostly downloaded the HTML version.

This was an epiphany. I publish my work in HTML, people primarily read my work in HTML, so it makes sense to write in HTML too. Writing in one format and converting it to HTML is not worth the mental and technical overhead. HTML is not just one output format among many; it is the format of our age. This epiphany was one of the reasons I got involved in HTML5.

Dive Into Python 3 was my first major work that I wrote entirely in HTML. (I had to convert the entire book to Microsoft Word format as part of the print publication process. That was... unpleasant.) My next book, Dive Into HTML5, is also written in HTML, and my editor tells me that they will handle the nasty business of converting it into suitable formats for print. This may become a factor in choosing a publisher for future books: the ability to avoid Microsoft Word altogether.

What would be your dream setup?

I have an Apple IIe in my attic. My parents bought it in 1984. We used it exclusively for five years; I wrote my first program on it, I wrote my first poem on it, my mother ran her first business on it. We sold it to a family friend in 1989, and she used it as her primary computer for 10 more years, until 1999. A few years ago, I paid her to ship it back to me. The damn thing still works -- color monitor, 80-column card, original disk drives, everything. Most of my 25-year-old 5.25-inch floppy disks still work. Of course there's no software being written for it anymore (except Silvern Castle, God bless you), but what it could do in 1984, it can still do just as well in 2009.

I've had my current desktop for a little over two years. I want to continue using it for another 20. I mean that literally: this computer, this keyboard, this mouse, these three monitors. 20 years. There's no technical reason the hardware can't last that long, so it's a matter of whether there will be useful software to run on it.

First, there's the operating system. People throw away computers every day because they're "too slow" to run the latest version of their preferred operating system. Linux (and open source in general) is not immune to this, but I think it's more immune than proprietary operating systems. Debian only recently dropped official support for Motorola 68K machines; that's stuff like the Mac IIci that I bought off the clearance rack at Microcenter in 1992. The latest version of Debian still runs on my old PowerPC "G4" Apple laptop, even though the latest version of Apple's operating system doesn't. Commercial vendors have a vested interest in upgrading you to the latest and greatest; supporting the old stuff is unglamorous and expensive. Commercial open source vendors aren't really much better than commercial proprietary vendors in this regard, but community-led Linux distributions can afford to have different priorities.

Next in the software stack is drivers. Everything from the network card to the graphics card to the sound card needs a working driver. Linux has the most comprehensive driver support of any operating system, ever. Yes, I'm including Windows in that statement. People think Linux driver support sucks because newer hardware sometimes only works with proprietary Windows drivers. That's true, but there's a lot more old hardware in the world than new hardware, and Linux has superior support for older hardware because the community writes and maintains its own drivers. People throw away computer accessories every day because they upgrade their operating system and can't find functioning drivers. (Will that scanner you bought in 1999 still work on your shiny new 64-bit Windows 7 machine? I wouldn't bet on it.) All of my hardware is supported today by open source drivers, which removes one of the primary reasons that people throw away working hardware. Again, I'm not saying Linux never drops support for older hardware, but the cycle is longer and the incentives are different.

Next up is applications. Open source has the clear advantage here, because communities can recompile and redistribute other people's software for multiple platforms. I'm currently running a 64-bit operating system on 64-bit hardware. With few exceptions, all of the software I run can also be recompiled to run on 32-bit operating systems. This is so common now that we take it for granted, but it's really quite remarkable. No doubt there will soon be 128-bit hardware available at reasonable prices, and then other advances after that. And Linux distributions will take advantage of newer hardware, but they can also continue supporting older hardware for much longer than proprietary operating system vendors, which rely on individual developers to support each platform. So if there's an operating system that still runs on my hardware 20 years from now, I'm pretty sure I'll be able to run Emacs on top of it.

Where my 20-year plan will most likely fail is not at the operating system or driver level, nor with the existing crop of applications. At some point we will invent an entirely new class of application, like the web browser was an entirely new class of application 20 years ago. This new class of application will naturally be targeted at the "current" hardware of the day, and nobody will bother to backport it to the hardware I have now.

Chromium is actually a good example of this, only shifted a few years. It contains a dynamic JavaScript compiler (V8) which requires explicit support for each hardware architecture. There is no Chromium for PowerPC, even though it's open source, because a central piece of the application only works on x86 and AMD64 architectures. There's nothing stopping anyone from writing a PowerPC version of V8, but it's unlikely to happen unless some super-genius hobbyist decides to take it on.

And browsing the web and using web-based "applications" accounts for 90% of the time I spend in front of a computer. (Writing doesn't actually take that long. It's the long stretches of procrastinating that take up most of your time.) So it's a safe bet that in the next 20 years, there will be an entirely new class of application that doesn't exist now, and I'll want to use it, and my hardware will be so far behind the curve that none of those new applications will support it. Then I'll have to upgrade.

But hey, you asked for my dream setup. That's it: one computer for 20 years.