Allow me to do a quick post and say that this was an interesting perspective that logically makes sense and I can relate to on a personal level.

Our family's first computer was a Tandy, and when we wanted to play games there was no need to really know what was inside it. We just knew the games would run. Later down the line, our second computer ran CD games; as long as we knew the operating system, it was no problem. Today our house is flooded with different types of computers, and mine is that retired computer boasting a GPU and a quad-core processor while my parents use a laptop.

Computers have come a long way, but if you really want to invest in programming, movie editing, or playing video games efficiently while multitasking, you gotta get that top-of-the-line machine to run those programs. Everything else does not require much at all. If anything, people are using computers more, but what they actually need them for is minimal now: check e-mail, write a document, browse the web.

Very nice article, with the unfortunate exception that "sink[ing] $200 into the latest pixel-accelerating toaster oven" has (at this point) crept up to $400-500. Assuming you only want one card (and not 3).

The most bizarre comparison is that the 'everybody' computer is now cheaper than a high-end GPU ($299, and 'good enough' for internets, email, and word processing).

But I want to mention this: if you look at the advancements in technology today and where some companies (Intel especially) are going, it appears as if it's all coming full circle. Intel, along with NVIDIA and AMD, is working on improving integrated GPUs. Last I heard, Intel had plans to replace its integrated GPUs with Larrabee, and while Larrabee might not be up to snuff with newer cards from NVIDIA and ATI when it launches, the fact is that if it works out, then computers that use a Larrabee integrated GPU will at least be able to run the games.

You could also blame processors. Intel and AMD concentrated on the MHz race, making their processors faster. As games required more power, fast single-core procs just wouldn't cut it; now, with fast multicore processors, software rendering could very easily make a comeback. The Core i7, for example, is able to run Crysis without a GPU. Granted, it's at the lowest settings and with a horrendous framerate, but it can run it.

I think it could have been avoided if the marketing of graphics cards didn't get so out of hand. In the beginning it was Voodoo 2, 3, 4, etc., as Shamus has mentioned. But then marketing got involved in naming the chipsets. Next thing you know, you could buy a GeForce 3 and it would actually be WORSE than your GeForce 2, because you bought the crippled GS version, or whatever the tag is they came up with that week. That's when the market became unnavigable. You couldn't just say "I need a better card" and find one with a bigger number than the one you already had. You had to research stuff so you didn't get hoodwinked by marketing. To NVIDIA and ATI, I say a big "Fuck You" and good riddance.

Now I just wish consoles and their games would natively support keyboard and mouse input...

Almost Lulz worthy when you consider how many people play WoW and such.

Let me put it this way. When the gaming industry went online, a lot of producers realized that for a similar investment of money and time they could take your average epic RPG with its grind, cutscenes, character customization, etc., put it online with multiplayer, and then charge the $50 asking price for the base software plus membership fees (mitigated by the cost of running servers). Even a fly-by-night MMO with a decent amount of initial hype could make more money than a purely single-player game.

Thus, you saw most of the big single-player game developers go over to MMOs. Joe Ybarra, Richard Garriott, and people like that are all involved with MMOs now, and they were THE designers for single-player games.

What's more, with the time spent on MMOs, your typical gamer isn't going to have time to play many other games, which means they aren't going to buy them. I only play WoW fairly seriously and have time to shallowly play a lot of other games because I'm disabled. When I was working this was hardly the case.

Your MMO player is not likely to go back to single-player games; rather, when they get truly burned out on their MMO of choice, they look for ANOTHER MMO.

Console gamers are admittedly a different bracket, and overlap a bit with MMOers, because shallow shooters and such are something you can do in between raids. Heck, some hardcore people I know keep portable gaming systems, or even a full-fledged console hooked up to another TV next to their PC, for those occasional 45-minute-plus raid preparation moments (which admittedly are less common now than they used to be).

It's not GPUs that "killed" PC gaming, it's MMOs that changed it. How many players does WoW have? Okay, and then think about how many are on all of the second- and third-tier MMOs, and even playing those "free" Korean MMOs (for which you can buy clothes-shop point cards now at Best Buy).

Will things ever go back? Perhaps. But right now the face of the industry has changed. Not wanting to constantly upgrade computers to meet the new generation of games was one element, but do not underestimate what MMOs did, which I think was a much, much more significant development for gaming in general.

The same thing could happen to consoles if someone actually finds a way to make a really good MMO truly function in that environment (many companies have talked about it; few have even tried, it seems). Then you'll see most of the console gamers playing the MMOs instead of buying individual games, and then the console market will have a dilemma.

I have a Voodoo card in my room... I wouldn't say it's the exact problem; there are many indie, browser, and older games that will happily run on your onboard GPU. My Dell Inspiron laptop will run Knights of the Old Republic perfectly. But I know it's not the same. Who wants to play old games when you could be playing CoD or STALKER? But I still don't think the PC will ever die as a platform, mainly because I'm a PC elitist and can't stand consoles at all.

I remember Voodoo cards... things were simpler back then. The price of competition when it comes to technology seems to be complexity, sadly, with each side vying to create the most powerful (and consequently complex) tech they can to dazzle those (few) who understand it, and leave those who don't confused by what this new must-have thing is.

While Shamus makes a damn good point, and I largely agree with him, there is still a decent segment of gaming that relies exclusively on computers. How many millions of players is World of Warcraft up to? Thirteen? Fourteen? And not a console in sight.

Just because our beloved GTA games (and others) tend to treat the PC version with a level of disdain roughly equivalent to what Yahtzee feels for Quicktime events doesn't mean that PC gaming is dying. More like mutating. True, we're likely to get some yellow redneck supermutants and severely overgrown homicidal cockroaches, but we may also get some friendly and humorous (if hideous) ghouls out of the mix.

We're also seeing consoles (particularly the 360 and PS3) move closer and closer to PC-level functionality: playing movies, connecting to the internet... I've even heard of people installing Linux on their PS3, although that probably sets off the "Nerd!" alarm installed in most brains. And from several reports, the 360 performs seppuku with the same high frequency as any modern PC.

True, the entry requirements to mainstream PC gaming are high, arguably even higher than those to console gaming, but we (The PC Gaming Master Race) have to distinguish ourselves from those ancient grandmas and yapping kiddies who spend their time flailing Wiimotes around their living room... Right?

It'd be interesting to know how many PC gamers own a console, and vice versa. A lot of my mates who I know through LANs and I all own both a 360 and a gaming PC. For people who are into gaming and can afford it, it makes sense that you can get the best of both worlds if you own both.

Provided people don't shift from PCs to consoles entirely, and in the long run a population of PC gamers is maintained, then the PC gaming market won't dwindle - it will just be dwarfed by the console market as they become cheap and widely available enough for all to own, much like the mobile phone. Then PC gamers will have the niche developers, which will be far more likely to produce better products than those designed for the console-hordes.

Except The Sims. Like acne scars, it will always infect the world of the PC.

I think that the price of all that super hardware is the main thing that prevents most people from getting into current PC gaming. The components of my current computer ran me up 2500 dollars, and then a hard drive failed so now I have to use the Mac mini.

An aspect that was not mentioned: console makers routinely take a loss on console sales (or so I believe?). They're drawing on licensing fees based on a sort of monopoly over development for the system. (And perhaps also making a strategic investment in future market share.) The diversity and interchangeability of PC hardware more or less precludes hardware revenue from licensing, putting it at a pricing disadvantage. (Granted, coprocessors such as graphics cards are a part of that.)

Also worth mentioning may be that consoles offer reduced development costs, as there are only a few hardware configurations per target console. (Which weighs against the licensing fees...)

There are also hard-drive costs, which, while not massive (assuming we don't switch to some more expensive solid-state scheme?), may play an increasingly important role as electronic distribution of games and DLC grows. And in particular if a console ever wants to fully replace a PC.

I largely agree with Shamus on the point that the GPU and the rising barrier to entry helped in large part to lower the PC's level of dominance, but it's hardly drooling in a retirement home. The more accurate perspective is to see that consoles have risen rather than that the PC has declined.

Sewblon: I think that the price of all that super hardware is the main thing that prevents most people from getting into current PC gaming. The components of my current computer ran me up 2500 dollars, and then a hard drive failed so now I have to use the Mac mini.

Unfortunately, that's ONLY the case if you buy the best on the market, or you buy an overpriced Dell or Alienware. Going for parts with the best price/performance ratio over the absolute best available is significantly cheaper. For example, a computer with a Core 2 Quad, a 1GB 4870, and 4GB of RAM can cost as little as $700 and will perform only slightly worse than a computer with the best on the market (assuming you don't go multi-GPU with dual-GPU cards). Truth be told, it's cheaper to buy a gaming rig now than it's ever been, even compared to back in the day when all you needed was a computer to run the game (back when a computer would easily run over $1000).

threesan: yes, 2 of the 3 current consoles (the Wii being the exception) were losing money on each console sold. Microsoft was losing around $100 when the system launched, if I'm not mistaken, while the PS3 was losing as much as $400 per system sold (I believe it originally cost $1000 to produce, though it might have been $800, can't remember). It's through game and peripheral sales that they hoped to make up for the loss. And even then, using the controller to browse the web isn't very friendly, especially when typing. And it's like this for EVERY console browser, from the PSP to the DSi and Wii.

However, consoles are FAR from replacing PCs. Using my PS3 as an example here: the internet browser the PS3 has is barely adequate. While it does support Flash 9, many sites that use Flash have trouble loading, if they load at all, and even many sites that don't use it have major issues. The loading times for pages can be very long, and it has a habit of stalling out and even crashing the system frequently. And using Linux on the PS3 doesn't work well as a replacement either. Linux on the PS3 has jack shit for software compatibility, for the simple reason that you're using Linux, the OS for which the least amount of consumer software is developed, on a system that uses a PowerPC-based processor. You're pretty much limited to what comes pre-installed with the OS and extensions of said software. One of Linux's biggest attractions, Wine, which allows users to run Windows applications on the OS, CANNOT even be installed, due to the PowerPC processor and the issues associated with running x86 code on a PowerPC machine (the fact that it's impractical without a VERY powerful computer, aka a supercomputer). Installing drivers for devices is also difficult, again because of the PowerPC vs. x86 processor. It's a neat little thing to try out, but it's hardly a useful replacement for a PC.

Final point: while there are multiple configurations for PCs (both hardware and software), with newer APIs (DX10 in particular) and Microsoft's movement toward unified driver models, both of which are aimed at increasing compatibility and lessening the burden on developers, that is changing. DX10 was completely rewritten, and while that in and of itself may not seem important, it's the changes under the hood that MS made that in my opinion will have people thinking DX10 was a very good thing in the future. DX10 switched from split pixel/vertex shaders to unified shaders. Prior to this, developers had to code their games for different cards with different configurations of pixel/vertex shaders and play tug of war getting it to work: 10 pixel and 10 vertex on one card, 15 pixel and 5 vertex on another. Say you need 15 pixel and 5 vertex: on card 1 you have to go and code around it, while that's not required on card 2. With unified shaders, developers don't have to code this way; they only have to use what they need, and the game will allocate the shaders as it needs them. Because the shaders are unified and not fixed, developers won't need to put as much work into their games in the future as they have in the past.
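The fixed-vs-unified tug of war can be pictured with a toy model (just illustrative pseudologic, not real graphics code; the unit counts are the 10/10 and 15/5 example from above):

```python
# Toy model: fixed pixel/vertex shader units vs. a unified pool.

def can_run_fixed(card_pixel, card_vertex, need_pixel, need_vertex):
    """A fixed-function card must satisfy each requirement separately."""
    return card_pixel >= need_pixel and card_vertex >= need_vertex

def can_run_unified(card_total, need_pixel, need_vertex):
    """A unified card only needs enough total units for the combined load."""
    return card_total >= need_pixel + need_vertex

# The game needs 15 pixel + 5 vertex units.
print(can_run_fixed(10, 10, 15, 5))   # False -> card 1 needs a special code path
print(can_run_fixed(15, 5, 15, 5))    # True  -> card 2 runs as-is
# Under the unified model, both cards are just "20 units" and either works.
print(can_run_unified(20, 15, 5))     # True
```

The point being: once every card is just a pool of interchangeable units, the per-card special cases disappear and the driver can do the balancing instead of the developer.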

Therumancer: [...] It's not GPUs that "killed" PC gaming, it's MMOs that changed it. How many players does WoW have? Okay, and then think about how many are on all of the second- and third-tier MMOs, and even playing those "free" Korean MMOs. [...] Not wanting to constantly upgrade computers to meet the new generation of games was one element, but do not underestimate what MMOs did, which was a much, much more significant development to gaming in general, I think. [...]

This is so remarkably silly.

I... I don't even know where to begin picking at this.

Okay, so yes, WoW has a lot of subscribers. And we're talking a lot, true enough. First of all, though, you're confusing cause and effect here. The main reason that MMOs are actually able to enjoy such tremendous popularity on the PC is precisely because they circumvent Shamus's point - MMOs realise that the vast majority of PC gamers don't have state-of-the-art hardware. This is why there have been more copies of WoW sold than X360s - and also why Crysis, for instance, sold so poorly in comparison.

WoW is successful because it appeals precisely to those hundreds of millions of people that already own PCs, even if they're not gamers.

The assumption that PC gamers aren't buying more games because they're all too busy playing WoW is the silliest thing I've heard all month. I'm sorry, but it just doesn't work that way.

MMO players play MMOs. But - a. it doesn't prevent them from playing single-player or competitive multiplayer games on the PC if they've got the hardware to, and b. you know, not all PC gamers play WoW >.>

I really don't have the numbers here, so I hope someone will back me up.

But this is essentially the same as arguing that the X360 is selling poorly because everyone's too damned busy playing the Wii, which is ridiculous. The two have massively different audiences. Like the Wii, WoW gets such a large player-base because it's so damned good at ensnaring non-gamers and casual gamers. It's not a case of gamers who would otherwise be interested in playing, say, Gears of War or Oblivion or whatever, deciding that they don't need to look for a gaming fix past WoW - just as it's absurd to say people don't buy Halo more because they're content with waggling their Wiimotes in Wii Tennis.

9of9: The assumption that PC gamers aren't buying more games because they're all too busy playing WoW is the silliest thing I've heard all month.

All month?

Anyway, why is this a silly assumption? Or: why is it any sillier than assuming that MMO players and players of, I guess, all other PC games (is this really what you're saying? really?) constitute two completely exclusive groups?

9of9: Like the Wii, WoW gets such a large player-base because it's so damned good at ensnaring non-gamers and casual gamers.

You're right: it would be good to have some numbers to back this up. I'd be very interested to see the test used to determine the categories a representative sample of WoW players fall into.

The console is simply too simple. It's too damn efficient. You don't have to worry about this, that, and the other thing: RAM, or cards. You just pop the disc in and play that sumbitch till it red-rings or finishes your grilled chicken (in the case of the PS3).

Now that the gaps between consoles and PCs are getting smaller, the problem is that the PC is too complex a monster for a world that's largely not computer-fluent. I know I can't put a PC together. I've got one, but my sixty-two-year-old dad knows more about doing it than I do.

Shamus Young: The sad thing is, I don't see how this could have been averted. What was NVIDIA going to do, not sell graphics cards? Should gamers have not bought them? Should developers have just ignored the advanced hardware? Everyone acted rationally. Everyone moved forward because they wanted the PC to prosper. And yet, the changes they brought about ended up building a wall around PC gaming.

I'd say it is the same problem that showed up in almost every other area, too. Companies went on to offer highly proprietary APIs, chipsets, and drivers to differentiate themselves from other vendors. It was exactly at that point that you had to start comparing whether the game supported the ATI or the NVIDIA stuff.

If they had kept to standards (e.g. OpenGL) and tried to develop those, things would have gone a lot better for PC gamers.

Now I still use my PC for gaming, but it's only for the old stuff that I missed (or miss again, like Fallout). The rest happens on the PS3 and I don't intend to change that in the near future.

To put it in analogical terms, the life of the PC is like an RPG. Towards the beginning of the game, most characters can do most things with a passable efficiency. Warriors can hurl spells for decent damage, a Mage can take a number of hits before finally dying, and an Archer can deal just as much damage in melee as he does with his bow and arrow.

That was roughly 10 years ago for the PC, and now the PC is at the higher levels of said RPG where only the Warrior who's put points into Strength and Endurance can take hits without dying quickly, the Mage is only useful if he's casting a spell, and the Archer would get laughed at for even considering unsheathing his sword. You can build some powerful computers that can do some impressive things, but if you haven't invested in your magic stats, then you better not be casting Fireballs.

I guess the influence of the PC depends on where you live, in Germany it still gets the most shelf space out of all systems.

That modern low-end PCs can't run games at all is more the fault of the games, which require shaders and such that integrated chipsets often don't have. Besides, those PCs can still run Flash games on the internet, which are a pretty massive market right now.

ratix2: threesan: [...] and even then, using the controller to browse the web isn't very friendly, especially when typing. And it's like this for EVERY console browser, from the PSP to the DSi and Wii.

However, consoles are FAR from replacing PCs. Using my PS3 as an example here: the internet browser the PS3 has is barely adequate. [...]

You describe a software deficiency, which, while real and blocking console evolution, isn't the same as what the hardware is fundamentally capable of doing. Maybe the Xbox 1080 will sport significant in-box storage (assuming blazing-fast internets doesn't replace this, as _some_ are predicting) and run Windows 8. Then maybe you've got a general-purpose, uniformly mass-produced computer in a sealed box, which you could offer at the console price reduction (fueled by licensing revenue and lowered development, and manufacturing?, costs) for the system. Maybe consumers with le money and le know-how would prefer to assemble a system à la carte, but developers could experience massive market pressure to migrate toward developing for the few big-market console-computers, with ports into the inhomogeneous olde-computer market being unrealistic.

I suppose I'm raving now, but maybe one of these big-market console-computers would follow more in the Linux line, and up the price of the box (at the cost of market share?) in favor of license-free development...

(Addendum. Two assumptions I'll make explicit: 1) Driver-model unification can help, but will never achieve a complete abstraction of the underlying hardware. And 2) operating systems aren't cheap, but you'd need one regardless of your choice of olde-computer versus console-computer, and as such the price is safe to ignore. Actually, that might be a dangerous assumption. A significant portion of the market may not want or need an OS in their console.)

Possibly unseen angle: one advantage/disadvantage (depending on your point of view) of the PC over the console is that pirated games are more readily available from torrents etc. This variable, which doesn't affect the console industry, arguably creates a loss of revenue for PC game developers, who have less and less incentive to create games for the PC, especially when, as others above have said, the average PC owner can't play most of the new-gen games. Piracy isn't really a problem for console developers.

As a PC gamer myself, I find myself looking further and further into the past, pre-2006, for good games that I may have missed out on playing and that I know will run on my PC nowadays.

Bete_noir: The console is simply too simple. It's too damn efficient. You don't have to worry about this, that, and the other thing: RAM, or cards. You just pop the disc in and play that sumbitch till it red-rings or finishes your grilled chicken (in the case of the PS3).

Now that the gaps between consoles and PCs are getting smaller, the problem is that the PC is too complex a monster for a world that's largely not computer-fluent. I know I can't put a PC together. I've got one, but my sixty-two-year-old dad knows more about doing it than I do.

That's a pretty good point. To hook up an Xbox, you don't need to know a thing about what's inside it. When you buy a console, you know that provided the name on the console matches the name on the game, you're in business. Plug the power cable in, plug the AV cable in, good to go. I imagine it also makes games a little easier to develop, because you know the exact system you're going to be working with and what specifications to cater to. I imagine PC development is a crapshoot of where to place the bar... "here"... for system requirements.

I find Therumancer's counter-argument somewhat amusing: WoW is in part so popular because you can play it with a very plain machine. You don't need a fancy gaming computer to handle the ol' World of Warcraft.

dnadns: I'd say it is the same problem that showed up in almost every other area, too. Companies went on to offer highly proprietary APIs, chipsets, and drivers to differentiate themselves from other vendors. It was exactly at that point that you had to start comparing whether the game supported the ATI or the NVIDIA stuff.

If they had kept to standards (e.g. OpenGL) and tried to develop those, things would have gone a lot better for PC gamers.

Erm...I'm not sure what rock you've been living under, but we haven't seen games that *require* ATI or *require* Nvidia specifically to run for years now. This is all thanks to that magical little thing called DirectX that Microsoft spent a lot of time flogging a while back, if you'll recall. Sure, there have been "plays best on..." labels slapped on games for a while, but that's mostly marketing hype. You might have a bit of a point if you're talking about Linux gaming, as ATI cards have basically not worked on Linux for years, but let's be honest, the state of Linux gaming has been in sorry shape from day one, and we can hardly lay all the blame on the manufacturers for that one.

Me, I place the blame for the segmentation of the market on Intel, and their awful, awful, AWFUL integrated graphics chips. They were selling their chips cheaper than anybody else, and bargain PC makers bought them by the boatload. Thus, the baseline system wasn't keeping pace with what game devs needed to produce a decent-looking game.

;-) (though that doesn't really have anything to do with gaming, as there is no such thing on Linux)

I do know that driver issues and DirectX support changed over time, but the issues back then were enough to still keep me from going back to hardware-demanding PC gaming. But I was actually referring to the time Shamus mentioned and not the here and now.

The problem described in the article is that segmentation already took place somewhere in the past. I was merely trying to point out that all was well as long as there was basically just one thing to look at: the number after "Voodoo". Others tried to offer their own solutions, too, but were not really successful until ATI got into the ring. (I still remember owning a Matrox M3D accelerator card which was supposed to be the best of the... whatever. Sadly, the only supported big title was Tomb Raider 1, and I can't even recall another game for it.)

Even if DirectX offers a more abstract layer for programmers, it wasn't all just marketing back then, and developers didn't seem to have the nerve to support several different routines to achieve the same level of eye-candy.

I am not that much into the whole thing anymore, but I am sure that there are games which do need a certain chip to run on maximum settings. I would be really surprised if NVIDIA, ATI & co. did create intelligent hardware that makes specific shader programming using proprietary commands obsolete. So either developers program the same routine for different chipsets by now or the card manufacturers went back to only competing via GPU speeds instead of proprietary extra algorithms.

But maybe the whole thing comes from a wrong perspective altogether and PC gaming declined with the rise of HDTV resolution and online gaming for consoles.

Good article, but I'm not sure I completely agree with it. It seems you're saying that the PC "killed" itself, rather than being "killed" by consoles, and that's the bit I don't agree with.

I'm a PC gamer through and through, and I doubt I'll ever switch to consoles (at least not until they start shipping them with mice), but even I have to admit that consoles are cheaper, more accessible, and more robust and stable than PCs. It's also easier to develop games for consoles, and much easier to QA them. I personally don't much like console controllers, but I can see how many people, especially kids, would prefer them to a keyboard and mouse, especially for certain types of games (fighting games and modern variations on "the platformer" spring most readily to mind). And IMO none of these things lose any of their significance when I factor into the equation the fact that PCs have become divided into "normal" and "gaming" rigs since the invention of the GPU. I'm not saying that the invention of the GPU played no part, but I really don't think you can point to it as the primary cause (let alone the sole cause) of the PC's downfall.

Some more (somewhat related) comments, if I may: I said I'd never switch to consoles until they start shipping with mice, but that's an oversimplification. Although I'm not keen on (current) console controllers, my real problem with consoles is actually the games themselves. I think I'm not alone in this, given the number of PC gamers who describe console games (and console ports) as "dumbed down". But consoles have only (relatively) recently risen to the forefront of gaming, and gaming itself has only recently risen to the forefront of the entertainment industry, so now we enter an interesting time: the console gaming generation is growing up, and I wonder whether they will grow out of games or not. If not - meaning, if a significant number want to carry on playing games into adulthood - then we may see a shift among consumers towards console games that aren't so "dumbed down", and then it would only be a matter of time before the industry responded to that shift. If and when that happens, I just might be tempted to make the jump to console land.

HeartAttackBob: While Shamus makes a damn good point, and I largely agree with him, there is still a decent segment of gaming that relies exclusively on computers. How many millions of players is World of Warcraft up to? Thirteen? Fourteen? And not a console in sight.

Just because our beloved GTA games (and others) tend to treat the PC version with a level of disdain roughly equivalent to what Yahtzee feels for Quicktime events doesn't mean that PC gaming is dying. More like mutating. True, we're likely to get some yellow redneck supermutants and severely overgrown homicidal cockroaches, but we may also get some friendly and humorous (if hideous) ghouls out of the mix.

We're also seeing consoles (particularly the 360 and ps3) move closer and closer to PC level functionality: playing movies, connecting to the internet, I've even heard of people installing Linux on their PS3... although that probably sets off the "Nerd!" alarm installed in most brains. And from several reports, the 360 performs Seppuku with the same high frequency as any modern PC.

True, the entry requirements to mainstream PC gaming are high, arguably even higher than those to console gaming, but we (The PC Gaming Master Race) have to distinguish ourselves from those ancient grandmas and yapping kiddies who spend their time flailing Wiimotes around their living room... Right?

LOL, good post. But I have to point out (sorry) that "Quicktime" (capital Q, one word) is a bit of software made by Apple, and "quick time events" is what you meant to say. I figured that (as a member of The PC Gaming Master Race) you'd want to know. ;)

A nice article and some good points. Seems to fit pretty well with my own experiences. My computer is most definitely a "regular" computer, and I am for the most part a console gamer because I know my 360 will be able to run the games I want without fuss and they'll look pretty. When I play on my computer it's because I want to play a game that either hasn't been released on a console or simply won't play well without mouse and keyboard, mainly strategy games (unlike a lot of PC gamers I have no problem with console FPSs).

onelifecrisis: My real problem with consoles is actually the games themselves. I think I'm not alone in this, given the number of PC gamers who describe console games (and console ports) as "dumbed down". But consoles have only (relatively) recently risen to the forefront of gaming, and gaming itself has only recently risen to the forefront of the entertainment industry, and so now we enter an interesting time: the console gaming generation is growing up, and I wonder whether they will grow out of games or not. If not - meaning, if a significant number want to carry on playing games into adulthood - then we may see consumer demand shift towards console games that aren't so "dumbed down", and then it would only be a matter of time before the industry responded to that shift. If and when that happens I just might be tempted to make the jump to console land.

Interesting idea. However, I'd like to point out that a great many console owners are already adults. For whatever reason, consoles are now the platform of choice for the mainstream gamer, which means that by their nature consoles offer games that are simpler and easier to get into (not necessarily a bad thing - it depends on your tastes). So I'm not sure how relevant the age demographics of console gamers are. I'd also point out that the complexity of a game's interface and input commands, and therefore the complexity of the game itself, is necessarily limited by the fact that most console gamers will always play their games with a simple console controller. That's a factor that doesn't depend on intentional "dumbing down". There's still no excuse for not releasing Shadow of Chernobyl on consoles, though. Grrr.