In 2004 I already saw the writing on the wall: PC technology had started commoditizing, and at a brisk clip.

When I got my first K6 processor, it was not just the overclockers who cranked up that 66 MHz front-side bus. Anyone who bought a computer had to know the essential differences between MMX and 3DNow!. A lack of that knowledge, in those days, could set the up-and-coming computer user back hundreds of dollars if he wasn't careful.

Whether you agree with Karl Marx or Karl Rove (or anyone in between), a telltale sign of commodification is when the manufacturer stops focusing on tangible aspects of the product and starts pushing less tangible selling points. This often happens when competing products are too similar, or at least indistinguishable from the purchaser's point of view.

Where have we seen this before? Well, my HTC Hermes did everything the iPhone did a year beforehand, but I'm pretty sure Apple sold a whole lot more iPhones. Look at today's motherboards: every manufacturer will tell you its all-solid capacitors are better than the next guy's. And don't even get me started on the memory industry ...

I remember the exact instant when computer hardware became a commodity. Steve Jobs got up in front of a hundred journalists, and in less than 60 seconds a million Apple zealots went from ardent Intel naysayers to hardened Intel devotees. In that moment I realized it didn't much matter to anyone which CPU was better; it only mattered what Steve Jobs told everyone to think.

Other signs of the death of the PC enthusiast are littered across the Internet like the tattered remains of a kite breaking up on re-entry: the birth and demise of AMD's Quadfather, the widespread lack of support (or interest) for quad-GPU graphics, failed physics processors, and inconsequential sales of "killer" network cards.

In a recent conversation, Jon Stokes and I agreed that while PC tech has seen some great growth over the last few years, that growth is not keeping pace with the Internet as a whole. PC technology, as a journalistic discipline, has unfortunately become as niche a beat as muscle cars.

This leads me to answer the question I started out with: the PC industry, as a whole, just isn't as fast-moving or interesting anymore. Debating the merits of largely intangible technology topics is an exercise more akin to politics than science.

You bet I'm excited about CPU-GPU integration and new OLED technology, but another unfathomably high frequency bump in the sea of JEDEC memory timings completely fails to pique my interest. Google keyword analysis suggests those more mundane markers of progress in the PC industry fail to capture even the smallest of demographics on the Internet as well.

That does not discount the importance of the tech enthusiast. Those of us who grew up debating the merits of CPU architecture in the 1990s are the pioneers of virtual discussion. We are what the majority of consumers will become over the next decade, as new, broader forums emerge.

Don't worry, I'm still the first person in Taiwan with Intel's next-generation roadmap. However, as this industry withers and new ones blossom, I encourage you all, as pioneers and enthusiasts, to look beyond the chips and bits once in a while.

Comments


I happen to view Vista as a much bigger step up from XP than XP was over 2000. And since my computers run Vista just as well as, if not better than, they ran XP (including an older 3.2 GHz P4), I wouldn't call it "bloated" by any means. Yes, a machine barely capable of running XP won't run Vista Premium, but a reasonable machine (I've had 2 GB of RAM for a few years now on all my machines) will do just fine.

To answer some of your points:

4) I've heard so much fear-mongering over Vista's DRM schemes, but I have not actually encountered any problems related to it in almost a year of usage (yes, I bought a copy of Ultimate on day 0) -- I wish people would stop blowing smoke about this.

5) Vista 64 exists and has solved the 32-bit memory addressing issue. The problem is not that MS hasn't pushed it hard enough; it's lazy third-party software writers who haven't bothered to update their software for 64-bit compatibility. And how could Vista SP1 solve this issue? If MS dropped 32-bit support entirely, they would lose millions of customers who would complain that MS is trying to "force an unwanted upgrade on them" -- sound familiar?

You may want to actually spend some time trying to understand the issues you post about.

I constantly wonder why reviewers insist on running 32-bit Vista. Especially with a game like Crysis that supports 64-bit and needs all the resources you can throw at it. Don't bitch about how you can only use about 2.5GB of RAM in Vista 32-bit when 64-bit Vista is available and no more expensive.
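For what it's worth, that ~2.5 GB ceiling isn't a Vista bug -- it's plain address-space arithmetic. Here's a minimal C sketch of the math; the 1.5 GiB reservation for device mappings is a hypothetical figure for illustration, since the real amount depends on your GPU and chipset:

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* A 32-bit pointer can address at most 2^32 bytes = 4 GiB. */
        uint64_t addr_space = 1ULL << 32;

        /* GPU apertures, PCI MMIO, and firmware get mapped into that same
           4 GiB window; whatever they claim is lost to RAM. The 1.5 GiB
           here is a hypothetical reservation, not a measured one. */
        uint64_t mmio_reserved = 1536ULL << 20;

        printf("32-bit address space: %.1f GiB\n",
               addr_space / (double)(1 << 30));
        printf("RAM left for the OS:  %.1f GiB\n",
               (addr_space - mmio_reserved) / (double)(1 << 30));
        return 0;
    }

On a 64-bit build, pointers are eight bytes wide and the addressable range dwarfs any installed RAM, which is why 64-bit Vista sidesteps the ceiling entirely.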