So I've been thinking that we gerbils at TR sometimes lose track of reality when recommending or purchasing GPUs. I fully understand the need to leave headroom for future games that are more graphically demanding, but very few of us can see into the future of GPU needs, especially with new consoles on the horizon. TR does a good job of choosing games, resolutions, and detail levels appropriate for the cards they're testing, but they can only benchmark so many games and settings. Furthermore, not everyone plays the most graphically demanding AAA titles that lend themselves so well to graphics reviews, and not everyone needs to play games at Ultra/Extreme/Ultimate detail settings with high levels of AA.

Let's have everyone chime in with their CPU+GPU, game, video quality settings (including resolution), and FPS from FRAPS. You can either watch the number on your screen or, if you want to go all out and do it TR-style, use the FRAPS Bench Viewer. If anyone is using Lucid software to pump up their framerates, please disclose that. Also, keep V-Sync off; it caps framerates at 60fps.
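If you'd rather crunch the numbers yourself than eyeball the overlay, the frametimes log FRAPS writes during a benchmark run is easy to process. Here's a minimal sketch in Python, assuming the usual two-column "Frame, Time (ms)" CSV with cumulative timestamps, that reports average FPS plus the TR-style 99th-percentile frame time:

```python
# Minimal FRAPS frametimes analysis, assuming the two-column
# "Frame, Time (ms)" CSV layout with cumulative millisecond timestamps.
import csv
import sys

def analyze(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        times = [float(row[1]) for row in reader if row]

    # Per-frame times are the deltas between consecutive timestamps.
    deltas = [b - a for a, b in zip(times, times[1:])]

    avg_fps = 1000.0 * len(deltas) / (times[-1] - times[0])
    worst = sorted(deltas)
    pct99 = worst[int(0.99 * (len(worst) - 1))]  # 99th-percentile frame time

    print(f"frames rendered: {len(deltas)}")
    print(f"average FPS: {avg_fps:.1f}")
    print(f"99th percentile frame time: {pct99:.1f} ms ({1000.0 / pct99:.1f} FPS)")

if __name__ == "__main__":
    analyze(sys.argv[1])
```

Averages hide stutter, which is why the percentile number is worth posting alongside raw FPS.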

I think people can get by pretty well with a 4850 still. The main problem there is the typical 512MB of RAM being inadequate, so you need to reduce texture-related settings or you get a lot of stuttering from PCIe texture transfers. I don't have any numbers, but I play games on a 4850 rather often. It's certainly well above the popular Intel HD Graphics junk, so it will probably remain useful as long as developers don't abandon it in light of AMD's dramatically reduced driver maintenance.
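For a rough sense of why 512MB runs out: a full mip chain adds about a third on top of a texture's base level, and the numbers stack up fast. A back-of-the-envelope sketch (the texture counts and sizes here are illustrative guesses, not measurements from any particular game):

```python
# Back-of-the-envelope VRAM math for why 512MB gets tight with modern
# textures. Counts and sizes are illustrative assumptions, not measurements.
def texture_mb(width, height, bytes_per_texel, mipmapped=True):
    base = width * height * bytes_per_texel
    # A full mip chain adds roughly 1/3 on top of the base level.
    return base * (4 / 3 if mipmapped else 1.0) / (1024 ** 2)

# Uncompressed RGBA8 is 4 bytes/texel; DXT5 compresses that to 1 byte/texel.
per_2k_dxt5 = texture_mb(2048, 2048, 1)          # ~5.3 MB each
per_2k_uncompressed = texture_mb(2048, 2048, 4)  # ~21.3 MB each

# Sixty unique 2K materials, before framebuffers and render targets:
print(f"60 x 2K DXT5:         {60 * per_2k_dxt5:6.1f} MB")          # ~320 MB
print(f"60 x 2K uncompressed: {60 * per_2k_uncompressed:6.1f} MB")  # ~1280 MB
```

Anything that doesn't fit gets paged over PCIe mid-frame, which is exactly the stuttering described above.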

On the other hand, it's nice to have something beefier to play the latest attempt at photorealism. So I have a 6950 2GB. No plans to upgrade that in the near future.

A 4650 still plays AAA titles at modest detail levels. We're not all that far away from Intel's IGPs being enough for decent gaming; Haswell certainly looks promising for a "free" GPU.

I got a 5870 a little more than a year ago. Played one game on it and realized it might be the last game I ever played. The 5870 did a flawless job, though. Anyone who wants it, PM me and let's talk. Shame that such a card never gets used for gaming.

I upgrade GPUs and CPUs on a revolving cycle; generally I keep my previous GPU when I upgrade my CPU, and vice-versa. Last year I bought a Core i5-3570K (which I have humming along at 4.4GHz), so I'm still running my prior MSI N460GTX Cyclone 1GD5/OC card. I sort of broke my own rule, though, and picked up a second N460GTX 1GD5/OC card on eBay for just $50. Running both cards in SLI at 810MHz core, each is essentially the same thing as a GTX 560 (336 Fermi cores, 256-bit memory), so I typically tell people I'm running "GTX 560 SLI" to keep things simple.

In SLI, as long as I don't have SLI-specific problems or run out of video memory, I can run basically anything at 60Hz at 1080p. These two cards crush Far Cry 3, demolish Borderlands 2, annihilate Skyrim (even with the HD texture pack), and destroy Battlefield 3. They laugh at Blacklight: Retribution and sneer at Dark Souls (with DSFix) and anything less.

The curious part is that recently, my second N460GTX has become kind of flaky in SLI, so I put it in my wife's machine (where it works fine, hmm), and even in single-card mode, my old N460GTX -- nearly 3 years old now -- still runs more or less everything fine at 1080p. I get ~70FPS in Blacklight: Retribution, and while I had to disable ENBSeries, Skyrim still runs vsync'd too. (Using my ENBSeries preset, which I could run at 60Hz vsync with SLI, does drop my FPS into the ~20s in single-card mode, though.)

In fact, this single card (at 822MHz, since it runs cooler being the only GPU) is so fast that I've begun to reconsider my recent urgency to upgrade. I panicked a little when my second GPU "died", and I was ready to plunk down $350 for a 7950 or GTX670, but I think I might wait further still and pick up a 700-series or 8000-series card, especially given the recent driver foibles with the 79xx and the gimpy memory bandwidth on Kepler.

tl;dr I run everything maxed-settings-1080p @ 60Hz Vsync'd on a single GTX 460.

1280x720, and before you laugh - that's more than most PS3 or Xbox 360 gamers usually get.

I sold that laptop a few months back, but even with crappy GDDR3 it was pumping out almost 60fps in Diablo 3, Mass Effect 3, and Bulletstorm without having to hit minimum details. Additionally, Borderlands 2 and BF3 ran acceptably, whilst lighter games such as Mark of the Ninja and Orcs Must Die 2 ran like butter.

I do have a 2560x1440 screen hooked up to an HD7950 but honestly, graphics are overrated and quality gameplay trounces even the prettiest sparkles.

I have a GTX470, and the only game where I can feel a slight lag is Crysis 2 at DX11 Extreme with the high-res texture pack. I only notice the lag because I played it extensively before the DX11 patch came out, and it was snappier then. Other than that, everything runs swimmingly. I probably spent a lot more time on Homeworld anyway.

I don't use the highest settings in Far Cry 3, but then again there's no need to, because it looks so good already. The funny thing is, when I first bought the card I played every FPS I'd missed from 2005-2011 on it, but now I just see that as a huge waste of time. I would use it to fold, but folding makes the GUI laggy.

I think this is a great idea. I'll try and make some time to benchmark both my current main machine (sig) and my old machine (E8400, 4GB DDR2-800 RAM, 4870 1GB, WD Caviar 640GB) on some of the same games and post the resulting data.


Heh, I just realized the point of this thread was actually to post benchmarks, not discussion. Whoops.

I'll post up some benchmarks later, although I want to comment:

DPete27 wrote:

Also, keep V-Sync off; it caps framerates at 60fps.

Assuming your monitor is a 60Hz display, then yes, it caps at 60fps. But given that your monitor can't display more than 60 frames per second (since it only updates 60 times per second), and given that most games' input routines are no longer tied to the renderer framerate (meaning higher framerates don't equal smoother or faster input), is there any real point in disabling Vsync? Shouldn't "reaching Vsync" be the goal, in the end?
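For anyone curious what "input not tied to the renderer framerate" means in practice, here's a toy sketch of the fixed-timestep loop most modern engines use in some form; the function names are placeholders, not any engine's real API:

```python
# Toy fixed-timestep game loop: logic (including input sampling) ticks at a
# fixed rate, while rendering runs as fast as the GPU/V-sync allows.
# poll_input/update/render are illustrative placeholders, not a real API.
import time

TICK = 1.0 / 120.0  # fixed 120Hz simulation step, independent of render FPS

def game_loop(poll_input, update, render):
    accumulator = 0.0
    prev = time.perf_counter()
    while True:
        now = time.perf_counter()
        accumulator += now - prev
        prev = now
        # Run as many fixed-size simulation steps as real time demands...
        while accumulator >= TICK:
            poll_input()
            update(TICK)
            accumulator -= TICK
        # ...then draw one frame. V-sync only throttles this call, so a
        # 60Hz cap doesn't make the simulation or input any less responsive.
        render()
```

With a loop like that, capping render() at 60Hz changes nothing about how often input gets sampled.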

You are absolutely correct, but the fact is that FRAPS will report framerates higher than 60fps even if the monitor isn't displaying the extra frames. Yes, V-sync is almost a necessity in some games to eliminate tearing and create an enjoyable gameplay experience, but I'm looking for raw framerates. If you leave V-sync on, a 7950 could appear (to an undiscerning reader) to offer the same performance as a GTX460, for example. IMO, if you're churning out framerates much above 60fps, you should probably turn up some video settings so you're not wasting your GPU muscle.
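A handy sanity check for logs posted in this thread, by the way: with V-sync on at 60Hz, nearly every frame-time delta in a FRAPS log lands on a multiple of ~16.7ms. A rough sketch of that test, with the tolerance and threshold being arbitrary guesses on my part:

```python
# Heuristic for spotting a V-sync'd frame-time log: at 60Hz, synced frames
# land on multiples of ~16.7 ms. Tolerance/threshold values are arbitrary
# assumptions, not any standard.
def looks_vsynced(deltas_ms, refresh_hz=60, tol=0.05, threshold=0.90):
    interval = 1000.0 / refresh_hz
    def near_multiple(d):
        n = max(1, round(d / interval))
        return abs(d - n * interval) <= tol * interval
    hits = sum(1 for d in deltas_ms if near_multiple(d))
    return hits / len(deltas_ms) >= threshold

# A run pinned at ~16.7 ms with the odd skipped refresh reads as synced:
print(looks_vsynced([16.7, 16.7, 33.4, 16.7, 16.6, 16.8]))  # True
```

If a submitted log trips that check, the "raw" framerate probably isn't raw.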

I haven't written down my settings and framerates in all the games I've played (I will post some when I get a chance to play again), but I can say that my 6850 (roughly equivalent to a 7770) is still doing an acceptable job with the half-dozen games (including a fair number of modern AAA titles) I've played in the last year or so. I'm running 1080p and medium-high detail settings pretty regularly and getting roughly 45+ fps.

auxy wrote:

Heh, I just realized the point of this thread was actually to post benchmarks, not discussion. Whoops.

It can be both, but I'd prefer hard numbers so others can use this thread as a supplemental reference to GPU review benchmarks from TR and other sites.

I owned a Core 2 Duo E8400, 4GB of DDR2, and a Radeon 4870 with 512MB, plus a Samsung TFT at 1920x1080. Performance was adequate with most games, and I could have made this system last longer, but I upgraded in June 2012. My current system is a Core i5 3550, 16GB of DDR3, and a Radeon 7870 with 2GB, on the same monitor. Games feel much smoother, and I don't think you need more CPU or GPU power right now, unless you want to go to resolutions higher than 1920x1080.

I think people can get by pretty well with a 4850 still. The main problem there is the typical 512MB of RAM being inadequate, so you need to reduce texture-related settings or you get a lot of stuttering from PCIe texture transfers.

My old 4850 512MB was still surprisingly adequate with textures turned down a bit (best computer-related $99 I ever spent, just before the 5xxx series came out), but I wanted prettier, faster graphics for Skyrim and less power/heat (the 4xxx, while performant, eats electricity like crazy), so now I've got a 2GB 7850.

The 7470 in my work PC, though, is only adequate for low-detail games like FTL and UT2004. Saints Row III needs every graphics option turned down /and/ to be run in low-res mode (1280x720?) for remotely playable framerates.

Before I upgraded GPU+monitor I had a 6670 and played at 1024x768. Usually medium to high settings, could bump it up some but rarely noticed a difference. The only games that ever gave me trouble on those settings were Deus Ex 3 (T&L pipeline issues with all the nighttime scenes) and Mechwarrior Online (because it's not optimized at all).

I don't have FPS data but it was always good enough that I didn't notice it. Which is the point, at least for me.

Rocking a 6770 on an i3 system. Max resolution for my monitor is 1680x1050, and if it can't do medium-type detail at that res, I drop to whatever other 16:10 option there is, like 1440x900 or 1280x800. I'm a pretty casual gamer now, and giant, hot, noisy, power-sucking GPUs are not for me. I'm not interested in anything that requires my desktop to have a >350W PSU nowadays.

You could build a system with an i5-3570K and a fast mid-range GPU like a GTX 660 that wouldn't draw more than 250W max (not overclocked), and that's under unrealistic loads like Prime95+FurMark. In gaming, such a system would probably pull 150W-200W.
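The math is easy to stack up yourself from the rated TDPs plus some slack for the rest of the platform. A rough sketch (the 50W platform figure is a guess, and components rarely all hit full TDP at once, so real wall draw lands below the sum):

```python
# Rough system power budget from rated TDPs plus a guessed platform overhead.
# The 50W "rest of system" figure is an assumption, not a measurement.
budget_w = {
    "i5-3570K (77W TDP)": 77,
    "GTX 660 (140W TDP)": 140,
    "board/RAM/drives/fans (estimate)": 50,
}
total = sum(budget_w.values())
print(f"sum of worst cases: ~{total} W")  # ~267 W, never hit in practice
print(f"typical gaming:     ~{int(total * 0.6)}-{int(total * 0.75)} W")
```

Which is why a quality 350W PSU covers that class of system with headroom to spare.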

Anything beyond that is just making the graphics look better, not increasing what can be done with the game. Since photorealism is out of reach, hardware enthusiasts have trained themselves to 'like' increasing levels of AF or resolution or framerate.

With that attitude, yeah, it is. I'll just go play something like BF3 in the meantime, and enjoy the fact that more powerful hardware offers me a better, more immersive experience.

As you can see, where my hardware lacks is in the physics area, because my E5200 (even at 3GHz) is no comparison to your 3570K. But with the 650 Ti the overall performance isn't bad, especially when you consider that this benchmark really renders at 2K resolution, from what I've heard.