By 2004 I could already see the writing on the wall: PC technology had started commoditizing, and at a brisk clip.

When I got my first K6 processor, it was not just the overclockers who cranked up that 66 MHz front-side bus. Anyone who bought a computer had to know some of the essential differences between MMX and 3DNow!. A lack of knowledge, in those days, would set the up-and-coming computer user back hundreds of dollars if he wasn't careful.

Whether you agree with Karl Marx or Karl Rove (or anyone in between), a telltale sign of commodification is when the manufacturer stops focusing on tangible aspects of the product and starts pushing less tangible selling points. This often happens when competing products are too similar, or at least indistinguishable from the purchaser's point of view.

Where have we seen this before? Well, my HTC Hermes did everything the iPhone did a year beforehand, but I'm pretty sure Apple sold a whole lot more iPhones. Look at today's motherboards: any manufacturer will tell you its all-solid capacitors are better than the next guy's. And don't even get me started on the memory industry ...

I remember the exact instant when computer hardware became a commodity. Steve Jobs got up in front of a hundred journalists, and in less than sixty seconds a million Apple zealots went from ardent Intel naysayers to hardened Intel devotees. In that moment I realized it didn't much matter which CPU was actually better; it only mattered what Steve Jobs told everyone to think.

Other signs of the death of the PC enthusiast are littered across the Internet like the tattered remains of a kite breaking up on re-entry: the birth and demise of AMD's Quadfather, the widespread lack of support (or interest) for quad-GPU graphics, failed physics processors, and inconsequential sales of "killer" network cards.

In a recent conversation with Jon Stokes, we both agreed that while PC tech has seen some great growth over the last few years, this growth is not keeping pace with the Internet as a whole. PC technology, as a journalistic discipline, has unfortunately become about as niche as muscle cars.

This leads me to answer the question I started out with: the PC industry, as a whole, just isn't as fast-moving or interesting anymore. Attempting to debate the merits of largely intangible technology topics is a discussion more akin to politics than science.

You bet I'm excited about CPU-GPU integration and new OLED technology, but another unfathomably high frequency bump in the sea of JEDEC memory timings completely fails to pique my interest. Analysis of Google Keywords indicates that these more mundane markers of progress in the PC industry fail to capture even the smallest demographics on the Internet as well.

That does not discount the importance of the tech enthusiast. Those of us who grew up debating the merits of CPU architecture in the 1990s are the pioneers of virtual discussion. We are what the majority of consumers will become over the next decade, as new and broader forums come into being.

Don't worry, I'm still the first person in Taiwan with Intel's next-generation roadmap. However, as this industry withers and new ones blossom, I encourage you all as pioneers and enthusiasts to look beyond the chips and bits once in a while.