I’ve been expecting this, but it still came as a disappointment when I saw it. My graphics card (NVIDIA 6200) is below the minimum requirements. Looks like the game requires a $120 graphics card just to run. I’m sure for $120 the game will be choppy and look worse than games that came out four years ago. If you want a good play experience, and if you want the game to look anything like the preview movies they’ve been cranking out, you will most likely need to spend about $350. Meh. I don’t even care anymore. Even if $350 fell into my lap right now, I just don’t have the desire to keep going on this stupid graphics card treadmill.

The rhetorical question: Is there some law that says we have to crank up the system specs each and every year? My computer can run Doom3 without a hitch, and the game still looks fantastic. So why do we need eight times more computing power? I’ve seen the preview movies, and while Bioshock does indeed have charm, I can’t say it looks eight times better. Game developers: Give it a rest already. Just perfect the technology you’ve got in front of you. Bring the load times down. Keep the framerate smooth. Minimize how much screwing around the user will have to do under “video settings”. Squash the bugs. Stop trying to re-invent the wheel. Nobody cares about your Trifiltering Bling-Maps and your Derasterizing Shine Buffers.

Even more troublesome is the last item on the list. I haven’t forgotten the time I picked up Half-Life 2 on release day, and spent two hours jumping through Steam’s hoops. I’m not eager to repeat that process. For a while I wondered if it wouldn’t make more sense to buy a game, leave it unopened, and get the hassle-free warez version online. Then I realized that would just be a different sort of hassle.

Days are short, my gaming dollars are limited, and I have better things to do than beta-test buggy, unfinished games while performing their useless anti-piracy calisthenics. I don’t care about the game’s lineage, or the design team, or the fact that it’s the “spiritual successor” to my all-time favorite PC game. PC games have been getting more expensive, more troublesome, more buggy, and less fun for about the last seven years.

Once in a while I hear people crying about the death of PC games. They talk about casual gaming, aging fans, and shrinking shelf space. But nothing is “killing” PC games. PC games are committing suicide.

(Also, I know lots of people are going to suggest “get an XBox 360”. That’s pretty tempting, and I may do that at some point, but I can’t bear the thought of playing an FPS on a console. I’ve tried it. It’s like trying to dig a trench with a spear: You can probably do it if you really want, but the whole time you’ll wish you had the right tool for the job.)

On the upside, I discovered a battered and overlooked copy of Rise of Nations (2003) for $5 in the bargain bin and I’ve been having a blast with that. I’ve also been playing a lot of Playstation 2 games. They’re hit-or-miss, but I have to give them credit for being free of bugs and anti-piracy hassle. Recently I had a great time with Jade Empire, but of course that was a port of a successful two-year-old XBox title.

It’s a shame that the only platform that ANYONE can publish ANYTHING on without looking to dark overlords (Sony, Nintendo, and MS) for permission is pretty much sunk.

As for FPS on console, I agree that getting a handle around the gamepad is a pain in the ass at first, but having spent a few months with my XBOX 360 and games like Call of Duty, Prey, Halo, and even 3rd-person actioners like Crackdown, I can say it’s well worth it.

No more carpal tunnel syndrome risk, for one. And I find that the “auto-aim” and wide margin for shot error actually makes me a BETTER shooter than on PC…

I think we’ve hit the law of diminishing returns on graphics processing. It’s not just at the consumer level. On the development side, it’s just as bad. Once upon a time, 2x the effort would yield 2x better graphics. Now… it takes 10x more effort (and 10x better hardware) to yield 2x better graphics.

I don’t think the industry knows quite yet how to get off that track. They’ve been on it for decades, and it’s just How Things Are Done. Consumers used to just lap it up, but now even they vote with their wallets, and say, “Screw it, I’m getting an XBox.”

I actually don’t mind not playing the newer games. My computer can’t play anything published in the last 3 years, but why would I want to? Every game that I love was made at least 3 years ago, and my computer plays all of them flawlessly.

Starcraft, Fallout, Deus Ex, Neverwinter Nights, NOLF, Diablo.

I made a conscious decision not to try to keep up with the modern games as, for the most part, they aren’t worth it. Still, that’s just my opinion.

But then again, I’m not willing to spend money on a lousy videogame ever again. People (Linux people especially) develop better free stuff that runs faster by virtue of not having annoying graphics (I’m thinking of roguelikes here).

And on top of that, graphics are a pain to produce. Those artists deserve every penny they get, though they probably don’t get paid nearly as much as I imagine they do.

Videogames aren’t that important anyway, especially when they focus on their graphics. But I will miss BioShock. Time to make a roguelike version of it! I’m working on it, but I’m not that good yet :/.

“And I find that the “auto-aim” and wide margin for shot error actually makes me a BETTER shooter than on PC…”

That’s pretty much why I prefer PC shooters. I enjoy the increased importance of accuracy. I like the trade-offs in accuracy between lining up shots and taking a snap shot after a rapid turn. If you’re using joysticks for aiming, you’ve pretty much got two options: make aiming easier (eliminating an aspect of FPSers I enjoy) or become slower and less accurate, because a joystick is an inferior interface for pointing. You can make great FPSers for consoles, but in adapting to the strengths and weaknesses of a joystick interface you create a different type of game. A friend crowed about Gears of War. I’ve played some of it and it is fun. But it’s a different type of fun from an FPS played with a mouse. It’s like being a fan of sprawling, simulation-heavy western RPGs like Oblivion and being told that more story-heavy eastern RPGs like Final Fantasy are somehow an acceptable replacement. Both are great, but they’re not acceptable substitutes for someone who really prefers one branch. (We’ll see if the Wii can provide me with the same pleasure. Unfortunately I believe my options at the moment are limited to Red Steel, which has been pretty universally panned.)

I’m also with Jack; desktop computers are the one truly open platform. A single man with a vision can crank out a game and ship it without needing anyone else’s permission. A development team doesn’t need to pay anyone for the privilege of shipping their game. They can make any sort of game they want, no matter how gory or sexual, and not worry about a large business shutting them down. (The government might shut them down, as there are some lines you can’t cross, but that’s a different problem.) Admittedly, if they’re unwilling to pay off retailers they’ll only have online sales, but at least it’s freedom for developers.

I’m definitely with Shamus on online activation. I occasionally break out 10 or even 15 year old games to replay them. When I break out Bioshock in 2020, will the company still be around? If they are, will they have kept the activation system around, or might it have been killed after several mergers, with the resulting board of directors axing it as “non-profitable” (“Why are we spending money to support a 13 year old game we don’t even sell anymore?”)? Will the game be able to authenticate over IPv6, since IPv4 may be dead by then? Compromises are possible. I dislike Valve’s Steam, but at least you can back up an authenticated copy. But it still sucks, since part of why I bought the game at retail was to get professionally pressed CDs with a long shelf life.

As for the upgrade treadmill, it doesn’t bother me so much. It’s part of the two-edged sword that is PC gaming. I tend to buy mid-range cards every few years. Such a card tends to fare well on cutting edge games for about a year, then limit me a bit for another year. When I upgrade, I go back and get the games I previously missed (usually cheap!).

I basically stopped PC gaming, other than MMOs, about a decade ago. Recently I signed up for a GameTap subscription, primarily to get the Sam and Max adventures, Season 1. But a ton of stuff has come out in the last decade that I never played, and I’ve been sampling it.

And right now, a 10 year old game looks a little dated, but still quite good. A 5 year old game looks great on current hardware, and typically runs great. I think this is just about the first time in history you could say that. In 2000, games from 1990 looked like crap.

So much worse is when you’ve ponied up the $49.99 for the new game, and spent $200 – $300 for the MID RANGE graphics card, and maybe $100 for more RAM… only to have the game freeze, lock up, and crash. Of course you can’t get a refund; for all they know, you might be pirating the game.

Go Shamus! Don’t make the game do more, make the game do well. Don’t give me a cell phone that makes videos, surfs the web, and makes coffee; give me a phone with good audio quality that holds a signal. Etc. etc. etc.

I’ve been on the PC HW leveling treadmill since I first bought a 386-20 to play the original Wing Commander back in 1990. It’s gradually sped up over the years as PCs have grown exponentially faster (or however rapidly they’ve been progressing), but it’s basically been there since at least the birth of VGA graphics. I can pretty much guarantee that every PC upgrade I’ve bought since then has been due to some game which strained the limits of my previous PC.

These are things I’ve been thinking for quite a while now; nice to see someone else has as well, and put words to them. I remember an article in PC Gamer not all that long ago about the rising cost of game development for the PC. They mentioned things like larger teams, better technology and hardware, and advertising, among other things like publishers being greedy and the consumers (I have no clue about that one either). At the moment they’re busy drooling over Crysis. I was too, since it looks to be an absolutely amazing game, until I read a quote from the developers. They stated that the graphics and other doodads were so advanced that we’d have to wait two years, that’s right, two years, after its release before the kind of hardware they’re making the game for would even be available.

The graphics on the Wii pale next to those of my laptop, but who cares? The Wii is FUN. Sony and the PC developers had better stand up and take a good look at what Nintendo did. They made a fresh game system that is fun to play, works, and isn’t outrageously expensive. Maybe that’s why Wiis are STILL hard to get while unsold PS3s are clogging the aisles.

The last new game I bought was NWN2. It has some improvements over NWN, but it is so buggy that I just gave up on it.
I’ve been playing the first NWN since it came out (and there is more free content to play than I’ll ever be able to get through in a lifetime). I’m thinking about dropping back to that.
I just picked up Civilization 3 for $10. I love sim games but I’ve never played any of these, so I’m looking forward to that.
Every once in a while, I pull out Homeworld (and Cataclysm) – they still look fabulous and play as smooth as glass on my laptop.

Eh. I usually just don’t play anything new. Let someone else work the bugs out and ride out the patch cycle, play through the games, and let you know which ones were good. Then, 3 years down the track, buy their used graphics card off them when they upgrade and pick up a copy of the best games for $20 instead of $70.

System Shock was the deal. I can’t even remember which one was which anymore, but I do remember the music winding me up til I would empty precious clips of ammo at dark corners. And those damn screeching open-skulled monkeys _still_ send shivers down my spine and make my wife jump halfway across the room if you just mention em. Bout time to dig those up and play em again…

Wait…I lied. I didn’t read that page very closely. That’s 2 8600 cards. I have a single 8800 for around the same price, with 640 MB of RAM. Less noise, less power consumption and less hassle than SLI setups, and very near identical performance. That 8600 card is really stripped down.

I’ve been a die-hard gamer since the age of six (which happened to be 1981). I started feeling the fatigue that you describe, well, let’s see, it must have been when I got married. ;)

I have so many other more important things to sink money into than the latest so-fast-it-makes-your-ears-bleed computer system. Interestingly, it was this fatigue as much as anything that drove me into the tabletop RPG world about two and a half years ago and I’ve been happier ever since. Of course there are all the splat books and the minis and battlemats and let’s not forget DICE(!!) that are out there begging to be purchased. But I’ve got 5 full sets of dice now and enough books to keep me happy (and enough to realize that I don’t really need those splat books). Anyway, the video card requirements for my imagination to make D&D enjoyable are pretty reasonable. Trifiltering Bling-Maps were thoughtfully included at time of purchase.

I also spend some time playing Rise of Nations from time to time because they actually made a really really great RTS and didn’t worry overmuch about the graphics. Gameplay! Who’da thunk it?

Anyway, thanks again for the fine expression of my own feelings. Good one.

We’re hitting the ceiling as far as graphics are concerned, here. Personally I’m getting a bit tired of it, too. The computer I have now used to be a high-end gaming rig, but it’s beginning to show its age. Which means I’ll have to upgrade again pretty soon. At least I’ve saved up for it (actually, it’s the money I set aside for the last upgrade, which turned out to be very cheap). I don’t like upgrading. It involves at least a day of plugging equipment into the right slots and trying to get everything to agree with everything else.

Of course, my experiences might be coloured by my father. He is a technologically adept man for one reason: everything around him breaks. This includes computers, which left me with the impression that installing new hardware or software, or even changing an option, is like descending into the fiery pits of hell.

The thing is, I’m pretty happy with the level we’re at now. I’m playing games that are a year old, or older, and I’m having more and more trouble seeing the age. When we were still at the level of the original Half Life, with about 2 polygons per model, yes, I admit there was room for improvement. But right now, that room is getting pretty scarce. Maybe if they used it for something, but all I’m getting is the next warzone environment. I’ve seen those. We can do that. We’re good at it.

What I’d rather see is that I can take my rocket launcher and blow up a wall. We did that years ago, but somehow all these nearly photorealistic games still have the same old scorch-mark.

And *that* really takes me out of the experience, and it gets worse the better it looks.

This is loosely linked to this thread, but I wanted my own little rant. I use a Mac Mini for internet stuff, so I don’t do any gaming on it. I do own a PS2 though, which does very nicely when I get a chance to play it.

One game I like is the Raw vs Smackdown series. I play it solely for the create-a-wrestler feature, where you can create your own selection of wrestlers, with their own costumes and unique moves, then take them through season mode.

This game comes out every year (usually around Christmas). Obviously they have to think of ways to “improve” on what is basically the same game, with an updated roster of wrestlers and a new set of story-lines.

So what do they do? Well, they sit down and think: wouldn’t it be nice if the wrestlers got sweaty as they wrestled, and if the crowd had more realistic faces, and if the locker room was customisable so people could make their own locker room the way they make the wrestlers? Oh yes, let’s change the controller layout so all the buttons do totally different things.

The result?

You have to relearn how to play due to the new (not improved, just different) controller layout.

The locker room. Looks nice first time you go in and look around, but then you realise it takes a minute or two to load. All you actually do there in the game is hit save, spend your experience increasing your attributes and occasionally read a message you have received. Is it a menu? No. You have to walk to various areas of your locker room to do each different thing. Several of which have another loading screen to see them up close.

The match. Looks great (though I thought the old game looked great too), but you have to wait two to three minutes between hitting “start match” and actually starting a match.

Season mode roughly follows the following pattern:

Go to Locker room (wait while it loads your shiny new paint job).
Go to pre-match cutscene (wait while it loads those extra facial details).
Play match (wait while it loads those stupid sweaty wrestlers & dumb crowd). Finish match (wait while it loads the next accursed cutscene).
Back to Locker room and repeat.

Each of the above “waits” is 1-2 minutes, so to play one turn of season mode and one match I have to sit through at least 5 minutes of irritating looped music, staring at a picture of a random wrestler (you can change it so it just shows the Divas, but even that gets boring after a while).

Moral: Forget the useless crap. Get rid of the stuff that annoys the players. I will not remember this game for glistening skins or realistic crowd faces. I will remember this game for too much of my precious time spent staring at a loading screen.

Just one point; I’m not sure if people have pointed this out yet. You can get a card that is easily beyond those specifications for about $60 on newegg right now. The 6600 chipset is over 3 years old at this point, and hardly expensive or cutting edge. I don’t think it’s terribly unreasonable to expect people to have a graphics card that new; at some point they need to move ahead graphically. I remember not too long ago that a card more than a year old was essentially unusable on newer games; at least with current cards you can get a fairly cheap card that’ll be usable for several years, and they’re actually a bit cheaper than cards used to be as well. Not to say that it isn’t irksome, but I’ve had good success buying a low-to-mid-range card every 2 years or so for $150, and settling for turning the graphics down on all the new games.

It’s all a matter of priorities. Some people have the disposable income that they’ll continue to dump into hardware upgrades to support the games they want to play. At some point, however, developers and publishers will likely realize that their target demographic is shrinking, and that fewer and fewer people will continue on the upgrade grind they’re currently pushing us through.

I refresh my hardware on occasion, but usually only if something goes bad, not for a game. I used to do that though, before kids, before mortgage, basically before responsibility. I had the cash, so why not? Oh well, all good things…

I’m currently using a GeForce 6200, and it’s good enough to play what I want to play. If I have to turn the graphics down, so be it. Who cares that I can’t see all the grass blowing in the wind, or lifelike shadows? I can still enjoy the game and play it for all it’s worth. All that matters is that I enjoy my game experience.

Not a big fan of the hardware treadmill either. Having the latest and greatest in graphics doesn’t guarantee good gameplay — And to the extent that the graphics requirements edge out lesser machines, or divert development resources, they hurt gameplay.

I haven’t had time to play many games lately, but I’m perfectly happy with say, Unreal Tournament, GalCiv2, Starcraft, Warcraft III, and so on. These games are a few years old but still great fun, especially when doing LAN games, which usually means at least one hand-me-down computer that’ll be a liability with any newer games.

I hit this upgrade wall pretty much the day I began gaming. I’m not rich enough that I can buy these upgrades every x years and I’m not into the hassle of it either.
So I use my PC as a machine to try out all these classics and indie hits everyone raves about (stuck on the first level of Deus Ex with no clue where to go) and use my 360 to experience good graphics for a convenient investment of price and effort. Well, convenient by the standards of a PC. And I still don’t understand why a 360 game costs AU$20 more than the exact same PC game. Does this price difference exist in America too?

I upgrade my PC once every 3-4 years and feel perfectly good about it. Currently, I have an Athlon X2 4200 with a GeForce 6800 (which cost me about $150 at the time I bought it) and I’m able to run all the modern games I’d like to play at a good FPS. OK, those aren’t shooters or action-RPGs or anything else graphics-heavy: mostly arcade games like RoboBlitz or Bus Driver (a game which I enjoyed SOOOO much!) and turn-based strategies.

I never ever owned a console and never will. I hate consoles with a passion :) (I remember arguing with my friend that the ZX Spectrum is loads better than the NES, because it has a keyboard and I can write programs with it).

Games from big companies are mostly stillborn anyway. But there are always indie products that are worth looking at: Eschalon: Book I from Basilisk Games and The Age of Decadence are good examples. They’re not out yet, but they are the only games I’m eagerly awaiting now (Fallout 3 being mutilated by Bethesda).

I agree with everything you said, Shamus (for a change). I still am playing (and enjoying) games that are 5+ years old. The only two newer games I have are Civ IV and UFO: Afterlight. The rest (Fallout, Deus Ex, Starcraft, Homeworld, …) I replay numerous times and enjoy more than any of the new perfect-graphics/trashy-everything-else titles.

Although I must admit that there is one more title that is new yet I enjoy. Considering that it’s freeware, and made by (so to speak) ordinary people, it’s a whole different category. I am talking about Battle for Wesnoth, of course.

Poor fellas, unable to play this, what looks to be an amazing and groundbreaking game, all ’cause you ain’t bothering to play on a decent rig.

Moore’s Law sorta guarantees that y’know…

Youse mugs remind me of people who use 50cc Go Karts on a race track, then whine that the track is not designed for them, that it’s too big, and that those big cars that are lapping them are just a fad. Then you go and buy a 150cc Engine for your Go Kart, and feel gypped when those cars still lap you.

PC gaming has always been about pushing the limits of the new, simply because in no other field has there been so much growth in power.
33 Million CPU cycles –> 4 Billion+

Shamus, you say the game doesn’t look eight times better, so why does it need eight times the CPU? It isn’t running on a 16GHz machine now, is it? Or were you running Doom3 on a 400MHz one?
I suppose the difference goes to the amazing AI (running a wee bit better than the old Doom II imps), the beautiful water effects, and graphics that look at least (oh, to pin an arbitrary number on a perspective) between three-and-a-half and four times better than dark ol’ Doom3. And that’s not the gimped version I bet you’d play it at; I mean all settings maxed on a nice big screen at good framerates.

The FPS has been used since ye olde Pentium I to benchmark “cutting edge” graphics and CPU performance. Don’t believe me? It took a Pentium 133 to be able to play Doom with everything at max and many dozens o’ beasties with no slowdown. And there were pooh-pooh-ers like you fellas back then too, muttering about how bad things were and how they’d stick to consoles, in this case the Atari 2600 and good ol’ NES.

Quake II was used for quite a while as a benchmarker, and when Quake III came out, Zounds! Smooth Curves! Particle Effects! It’s the Zenith of CPU gaming!

Only it wasn’t. Not even close.

Doom3, Far Cry, FEAR, Half Life 2, all these and more have been pushing boundaries that people like you guys never even saw. All the while, silly naysayers like you have stood and griped. And failed to enjoy the game at its best.

Yes, games are buggy, but then most times they are released before they are ready (by marketing’s decree, most times), and they are a wee bit bigger than games that ran on floppies, or CDs for that matter. When things get big, and marketing has the final say, games will be buggy.

I personally find the minimum requirement of 1 GB of RAM the most ridiculous of it all. Yes, RAM might not be hugely expensive, and I’m certainly not an expert on it, but it sounds like overkill. No amount of graphics can excuse these requirements. For those who don’t believe it, look at the original Unreal Tournament (1999, was it?).

Actually, the 6600 is only $70 at Newegg. Right now I’m using a 7600 GT, which is selling for about $100 and does very well on most modern games. If you don’t have AGP, that version is $124 or so.

Right now RAM prices are pretty decent. DDR2 is actually cheaper than DDR, but 1 GB of DDR400 is only $50 for Wintec, which won’t get you the respect of your nerdier friends like some Corsair would, but which I’ve found to be fairly reliable and quite overclocking-tolerant.

The trick, really, is to do your research and keep about a year behind the latest generation or so. Usually that’s where prices are best, and you’re not paying $70 for a card that should cost $30 just because the retailer is trying not to lose their shirt on the steep cliff that is electronics pricing.

The 7600 GT is very worth it. I was pleasantly surprised to find I could run most of last Fall’s games at near full detail. The only thing that was really holding me back was my 768 MB or so of RAM, so I upgraded to 1.5 GB and it’s smooth sailing now.

Pricewatch lists a Radeon x1650 512 MB for $72.49 as the lowest price, and you can get it from newegg for $74.99.

Now, if you want to go for a graphics card that should last you another 2-3 years (not being the ‘top’, mind you, but comparatively much better than the 6200 you have), then it’s $146.59 for an x1950GT 512 MB, or $144.99 from newegg (although not currently available).

By comparison, that 6200 you have only costs $27.99 now. I really don’t think you should be terribly surprised that something that old (and very low-end when it came out) cannot run the latest games.

For about anything on Windows Vista, a gig of RAM is probably a good minimum.

I’m using a Radeon 9800 Pro. I bought it at around the 128 price point (though I needed to add an additional fan to the graphics card to make it run right, and an additional fan to the case as well, which bumped the price up). I haven’t updated myself on price/performance since then, but the card lets me play World of Warcraft and solitaire … ok, I don’t have that much time for either right now, but if I did …

But there is nothing wrong in playing games a year after initial release. You save yourself a lot of frustration and hassle.

Now I need to try Homeworld, given all the positive comments, and I should be able to enable all of its features.

But much of the hardware seems wasted. I played the modern port of Privateer. The actual game ran very well on a 486SX. My system (2 gigs of RAM, etc.) has a bit more power than that system did, but choked a little. The difference was all consumed by engine differences, with no improvement in play or actual graphics.

On the other hand … there are lots of games with impressive improvements in appearance and play. But I think you’ve really hit the point when you comment that getting them when they are a year or two old is the way to go.

Herr Schmidt: I think you’re missing the point, which is that the ever-increasing cost of upgrading a game’s graphics is making games expensive, bug-ridden, often difficult to play, and less enjoyable.

Seriously, it is time that developers took a step back to see that the bigger picture is more than just a picture, it’s gameplay and storyline, and replayability, and sound, and dozens of other things.

Shamus, I feel your pain on the PCIe front. I shared a similar feeling of anger and frustration at having to constantly upgrade my PC. I’d gotten by for almost 4 years without doing so, and I felt the only thing I really needed was a new graphics card, and maybe an extra gig of RAM. 400 bucks tops.

I could have gotten an AGP graphics card, but the ones available weren’t that powerful, and it felt like maxing out the upgrade path of my existing hardware was a bad investment. So instead I bit the bullet and spent about 1100 bucks on all new hardware. I needed a new motherboard so that I could have PCIe, which meant I also needed new RAM, because the old stuff was incompatible. So was the processor, so I got a new dual core. Then of course I needed a bigger power supply to handle the new graphics card, and of course the card itself. The only thing I got that I probably didn’t strictly need as part of the upgrade was two new hard drives for a RAID array.

I’m very happy with my purchase though, and I’ll tell you why. I kind of disagree with your point about the never-ending escalation of the PC hardware arms race. A card bought at the top of the line today will fall below minimum spec for games about 4 years from now. That’s about the same as the life of a modern console system. Maybe a year shorter.

Developers DO standardize on a platform; it’s just that we’re in the process of switching to DX10 right now, which means a new class of hardware and a new class of obsolete stuff. If you buy DX10 hardware now, you’ll meet recommended specs until at least 3 years from now. 4 years from now the DX11 hardware will just be starting to come out and you’ll be on the edge of obsolescence again.

It’s aggravating, I agree, and maybe unnecessary, but I guess I don’t find it as unreasonable as you do.

The good news is that the industry is not looking for a replacement for PCIe yet, which means in 3.5-4 years it will only be a 150-300 dollar upgrade to get playable again. This cycle was compounded by the fact that the old generation was not just obsolete on the graphics front, but on the chipset side as well.

I generally avoid the upgrade treadmill. I almost always only upgrade when I am getting a whole new computer, which I do every 4-5 years or so.

The only time I have broken that rule was to upgrade my video card in order to play Splinter Cell 4. I have loved all of the earlier games in the series and felt this would be worth the upgrade. While the game itself is great, it was a royal pain to get working. I had slowdown, graphic irregularities, and other similar problems. I actually went out searching for fan-tweaked drivers in order to get a decent experience, which then broke some of the other games I was playing.

So it once again taught me the silliness of upgrading. No more upgrading until the next computer, which is still several years away. Instead, we picked up an Xbox 360. That will keep us in new games for quite a while without any upgrade costs.

Is it just me or are fan made mods of old games the way to go lately? You don’t need anything better than what the old game requires, and chances are, good fan made mods are just as good as commercially released games.

I tire of the treadmill as well. We get to the point where a game will run, but with horrendous framerates, even on fairly robust hardware. Sigh…

I just finished Red Steel on the Wii. There are definitely flaws in the game (Why can’t I shoot that guy? I have a gun. He has a sword. He’s 100 yards away. Shooting him makes way more sense than swordfighting him, and yet here I am, forced to draw my sword. Why? Oh, right, because the plot says I have to.) But it shows that FPS can really work well on a console. And to reload, I simply moved my hands like I was putting in a new clip; I love that. Alan De Smet, there’s also Call of Duty 3 and Far Cry available, and Metroid Prime 3: Corruption should be released soon.

Eh. I’ve tried consoles, and discovered that while a keyboard and mouse can injure you, so can a console. Damn gamekeeper’s thumb. At least keyboards and mice come in all sorts of different configurations to help stave off injury, and you can easily customize them.
But good luck, and I’ll let you know how Bioshock plays on a less-than-stellar video card (mine’s a 256 MB NVIDIA something-or-other). I know it meets the minimum specs.

Retlor: As I said, it’s not advancing graphics that makes games bug-ridden and less enjoyable; the real factor is time. Marketing pretty much overrides the programmers, which means games are released when it’s economically viable, not when they are ready or bug-free.

Developers like money, and (as EA is ‘famous’ for) will release the game when they think (aka Marketing Says) it’s time to do so.

The other big factor is that when making a PC game you have to support a large variety of hardware: two major CPUs, at least two major video card vendors, and many, many variations of motherboards. Code can do strange things on hardware set A, yet work fine on set B.

Consoles, on the other hand, are very close to identical. That makes it easier to develop for them: everyone who owns console X has the same hardware, so code can be specialized for it.

It’s easy to say, Retlor, that a game needs “gameplay and storyline, and replayability, and sound, and dozens of other things” without thinking about the foundations all of those things are built on: hardware and software.

Well, that and how shallow many people are. You may demand great inner workings in your games, but if you’ve ever known someone who buys an EA sports game (2000, 2001, 2002, 2003, et al.) EVERY SINGLE YEAR, you’ve seen the developers’ target: the masses who must be catered to.

A final riddle for everyone, and please consider your response carefully.

Think about how many game companies have released brilliant games, innovative games, cult classics (not sports games or FPSes, mind you): zeniths of gaming that had poor sales or too small a niche market, and whose failure eventually led to the end of the developer.

Hold that number in your left hand, so to speak.

Now think about how many game companies release sports games, FPSes, rehashes, and games based on movies. Think about how shoddy and poorly made those games have to be before people won’t buy them.

Put that number in your right hand.

Weigh these numbers, and then tell me it’s graphics or CPUs that are destroying the game industry.

I grew up primarily on console games, so playing an FPS on a console is no problem for me. However, playing an RTS game on a console is just plain annoying.

I also find myself going back and playing older games as opposed to shelling out more and more cash just to upgrade my system or get the newest, shiniest game. For example, right now I’m playing the Diablo II expansion and having a ball (my Assassin just pimp-slapped Diablo)!

If you think it’s bad now, just wait till the DirectX 10 games start coming down the pike. None of the AGP cards will run them; the systems will require the same processor (P4) or higher, 2 GB of RAM minimum, and, of course, Vista. All this for a graphics improvement the average gamer will never notice. DirectX 10 will force people to either upgrade their computers or use a different platform for gaming. I just upgraded my system so I’m not worried (Core2Duo, 8800GTS 640, 2 GB RAM), but for people who don’t want to spend that kind of money, a console will be the way to go for a lot less. If you want cost-effective, buy a PS2, which has a huge library, or if you want next-gen, get a Wii. Heck, for less than half the price of a computer upgrade you can get both and a bunch of games.

Ahh, AGP. I’m on that myself, although I bought an X800XT PE off ebay a while ago for not too much money. I’m going to be buying a complete new system in a couple of months and moving out of that.

But even so, it is possible to get reasonable graphics cards for good prices – Rich has already posted the link.

I think your real problem is that you paid x to buy the 6200, but if you’d just paid x + y you could have gotten a card that was easily 2-3 times better, with y being less than 2x.

With the current series of DX10 hardware, the price/performance sweet spot is the 8800GTS with 320 MB of RAM, which is $289 on Pricewatch. With that, you should be able to run most games at medium-to-high settings at large resolutions for the next two or so years, and at medium quality for the two years after that, if past performance is any indicator.

Back in the days of the 9800-series Radeons, you really had to go with a 9700 Pro or 9800 Pro to get sensible performance for your dollar. But nowadays, with all the previous-generation cards available, it is quite possible to pick up some decent cards at good prices – the top end of the previous generation or the middle of the current generation usually offers very good value.

If you think it's bad now, just wait till the directX 10 games start coming down the pike. None of the AGP cards will run them, the systems will require the same processor (P4) or higher,

Unless you’ve heard otherwise somewhere, I doubt it. None of the game companies are going to shoot themselves in the foot that badly by making their games DirectX 10-only, especially with Vista making such a small impact. Games have always maintained some modicum of backwards compatibility with older DirectX versions; I could run almost any game (albeit horribly) on my old GeForce 2 MX400 until about three years ago, when it just wouldn’t work with some games, so I moved up a little to a Radeon 9250. Now that I can actually afford to spend $400 on a ‘last year’s new-hotness’ upgrade every year (new/more RAM, new mobo, mid-range CPU, mid-range vid card), I find gaming to be much more visually enjoyable.

Oh, my friend, we would like nothing better. That’s why you see a lot of sequels out there, you know? The first game is done, the bulk of the tech work is finished, now we can concentrate on polishing everything to a gleaming supershine. Why didn’t you polish the original game, you say? Because the publisher said we had to ship it by X date.

Madjack: This is probably one of the reasons sequels sell so well: They are going to be stable and polished, as opposed to prototype-ish.

I’d be totally happy with a new game that used 1999-level technology. Another user said something similar above. We turned an important corner around ’98–’99, and the advances since then have followed the law of diminishing returns.

There do seem to be two kinds of development houses:

1) Those who get pushed to release too much, too soon, by the publisher.

Shamus: Myself and plenty of other devs love the idea of low-tech games, stuff that focuses on gameplay and can be run on cheap rigs with old tech. Case in point, the fantastic Dwarf Fortress! You should check that game out for sure.

But not all of the public is game-savvy, and lots of people with big wallets are focused on “hot graphics, dawg.” Any studio that willingly steps away from the cutting edge, that says “We’re not going to keep racing up the tech tree, we’re going to plant our flag here and make awesome games,” becomes an anachronism before the words finish leaving their mouths. At best, you can look to portable systems for that style of game because, as you pointed out, they are notoriously difficult to fund.

As for PC gaming, some of the best games in recent memory wouldn’t have happened on 1999-era rigs. Ignoring graphics, the intense physics of titles like Half-Life 2 became an integral part of the game: combat, puzzle solving, the general stuff of fun. Company of Heroes, Dawn of War, Supreme Commander – those games have pushed the RTS genre waaay past any place Command and Conquer or Dune was going to take it.

Not every game raises the bar in a meaningful way, true. But that might apply to any time period in gaming. It is super easy to just make more Derasterizing Shine Buffers, Megahurtzes and other silliness, but not so much to really make the technology work for you across the board.

For those who are using this piece as an invitation to push the “consoles all the way / consoles pwn / consoles FTW”-side of the console vs. PC war, I just have one thing to say. Consoles are NOT immune to upgrade creep, so this is not an advantage over PC gaming. EVERYONE who enjoys electronic games is feeling this pinch.