It shouldn't be any worse than running any other GPGPU app on the card. I've been running BOINC science apps on my GPUs since the GTX 260 (generally 2-3 years per card) when I'm not using them for gaming. Out of 5 cards I've had one fan failure and one card that failed in an OS-crashing way; both occurred after >2 years of use. A few months of hard use shouldn't be enough to cause problems unless you're keeping your cards well beyond the point they become obsolete.

Minor mistake / correction: the top table lists a GTX 740 and GTX 730, but they're low enough in the stack that Nvidia doesn't give them GTX branding. They're GT. A "GTX 740" also shows up in one of the paragraphs.

It's 5GB, and that is a total for the game's RAM and the GPU VRAM combined. The rest goes to the operating system and isn't available to the game. It's more likely that 4GB is the peak usable VRAM for a console port, and so far all console ports have needed 2GB for their console-quality graphics and only required more for the special PC graphics. Thus it's not a driving force for GPUs; PC gaming is the driving force for more VRAM, and it will increase as it always does from generation to generation.

There are plenty of 60Hz 4K gaming monitors available; there are even v-sync versions coming in the autumn. As for Brightcandles' "so far all console ports" statement: what console ports? There's only been one so far that was primarily a next-gen build, and that was Watch Dogs; it used around 3.5GB at 1080p. The unified RAM in the next-gen consoles does leave us with the possibility that next-gen ports may be RAM hogs unless they're reined in. But nothing is known for sure, as we've only had one example so far.

What console ports? There's only been one so far that was primarily a next-gen build, and that was Watch Dogs; it used around 3.5GB at 1080p on the PC. The unified RAM in the next-gen consoles does leave us with the possibility that next-gen ports may be RAM hogs unless they're reined in when ported. But developers are lazy, and as we saw, Watch Dogs was an example of a port that wasn't properly optimized for PC. Nothing is known for sure, though, as we've only had one example so far.

The difference between a PC and a console is that a console has a shared memory pool, while a PC has dedicated memory for specific devices.

Essentially, with a console, that 8GB of memory will *never* be used completely for graphics duties; the GPUs are simply far too underpowered for that. Meanwhile, something like 3-3.5GB of RAM is used exclusively for the OS and other tasks, which leaves 4.5-5GB of RAM for rendering, AI, caching, sound, networking, you name it.

A modern high-endish PC will have a 3GB+ video framebuffer with 8-16GB of system RAM; usually the OS will gobble up roughly 2GB of that system memory, with the game's non-graphics assets chewing up another 4GB+ in a demanding scenario. The only time 3GB should be an issue is with poorly made console ports or when running at 1440p and higher resolutions.
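The memory-budget arithmetic in the two comments above can be sketched out quickly. The OS-reservation figures below are the commenters' estimates, not official platform numbers:

```python
# Rough memory-budget arithmetic from the comments above. The OS-reservation
# figures are the commenters' estimates, not official platform specs.

def game_budget_gb(total_gb: float, os_reserved_gb: float) -> float:
    """Memory left over for the game after the OS reservation."""
    return total_gb - os_reserved_gb

# Console: one unified 8GB pool shared by CPU and GPU workloads.
console_game = game_budget_gb(8.0, 3.0)    # 5.0 GB for everything: rendering, AI, audio...

# PC: separate pools. System RAM for game logic, dedicated VRAM for graphics.
pc_system_game = game_budget_gb(8.0, 2.0)  # 6.0 GB of system RAM for the game
pc_vram = 3.0                              # plus a dedicated 3 GB framebuffer

print(f"Console budget: {console_game} GB shared")
print(f"PC budget: {pc_system_game} GB system RAM + {pc_vram} GB VRAM")
```

The point of the comparison: the console's game budget has to cover graphics *and* everything else, while the PC's VRAM is a dedicated pool on top of system memory.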

What? First, the consoles SHARE that memory between the system and the GPU, so 8GB is nothing. Second, the consoles can't do 4K at all. They can barely do 1080p at 60FPS. Third, 4K gaming is not "taking off". The cards required to run 4K games at decent quality are too expensive.

I won't be gaming at 4K in the foreseeable future; still, with texture sizes growing, more VRAM is probably nice for future-proofing in that regard. The other aspect is GPGPU, where more RAM hardly ever goes amiss. So I'd very much appreciate more VRAM as a default on all future cards, both desktop and mobile...

VRAM requirements are going to continue to rise for reasons other than improved visuals; namely, it's cheaper for devs to require more VRAM than it is to work on any targeted optimization. Just look at the last generation with 512MB of total RAM and the relatively low quality of ports on PCs with 4GB+ of RAM and 2GB+ of VRAM.

There was essentially a fire sale on eBay for R9 280/R9 290 cards about a month ago, thanks to coin miners selling off. I managed to snag a 280X (with custom cooling and a factory overclock) for $150. One hundred and fifty.

It was apparently used, but it looked brand new out of the box. Works perfectly. I wish stuff like this happened more often.

The price/performance approach of this article is a little narrow-minded IMHO. Of course AMD will 'win', but as a long-term AMD owner I must say: driver quality actually matters! Hence I would suggest AnandTech take the 'price/performance and quality' route. Or has the deal with AMD (you know, the 'AMD Center' is paid, right?) blinded the writers?

The price/performance approach is not so much narrow-minded as it is quantifiable. You can't quantify "quality" the way you're implying.

I do not have a problem with AMD drivers, so where you might give AMD an "F", I might grade them an "A". I would give Nvidia drivers an "A" as well, though they are not perfect either.

If you have the knowledge base to make a better-informed buying decision for your needs and Nvidia fits your bill, then there's nothing wrong with that. For others who are looking for a simple measure of "bang for buck" that doesn't require in-depth knowledge, AMD wins... for now.
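The "bang for buck" metric this thread keeps debating is just price per frame, lower being better. A minimal sketch, with made-up placeholder prices and FPS numbers rather than real benchmark results:

```python
# Price-per-frame: the quantifiable "bang for buck" metric from the thread.
# Card names, prices, and FPS figures below are hypothetical placeholders,
# not measured benchmark data.

def dollars_per_fps(price_usd: float, avg_fps: float) -> float:
    """Cost per average frame delivered; lower is better value."""
    return price_usd / avg_fps

cards = {
    "Card A": (330.0, 60.0),   # (price in USD, average FPS in some benchmark)
    "Card B": (550.0, 75.0),
}

# Rank the cards by value, cheapest frame first.
for name, (price, fps) in sorted(cards.items(),
                                 key=lambda kv: dollars_per_fps(*kv[1])):
    print(f"{name}: ${dollars_per_fps(price, fps):.2f} per frame")
```

Note what the metric deliberately ignores, which is the crux of the disagreement above: driver quality, power draw, and features don't appear anywhere in the ratio.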

Exactly. There is a reason AMD cards are cheaper. If they had better cards, they could sell them for more. Price per FPS is a terrible benchmark. You can strap a rocket to a golf cart and make it faster than a Corvette, yet no one would ever recommend the golf cart over the Vette.

I'd have to agree. I switched from Nvidia because of driver problems and BSODs back in the UT2004 days.

Been using AMD ever since (currently running a 270X, 265, 7850, and 6870 in my gaming PCs), and my kids play lots of titles on Steam and Origin without problems. A few months ago a beta driver broke Minecraft, but their next beta fixed it. That's the only problem in recent memory that affected us.

Dstar, you got me for a couple of minutes until I realized you were referencing a roughly 15-year-old card. :) Brings back fond memories of my SLI Voodoo2 8MB cards in a P2 300 Gateway computer, where I had to add two 80mm fans to the side panel to keep them from locking up from heat. I also remember how it was such a big hit at a Warbirds convention that we had to run to the local Circuit City in North Carolina and buy every 80mm fan and grill they had, along with a stop at a hardware store for a drill and a jigsaw. I think I did 10 side-panel jobs, and around 20 other people ended up finding more fans and doing their cases too. Ahh, the good old days.

The 7790, aka the 260X, is a decent living-room gaming card. Before a spare 720p TV moved out of my current arrangement, I was getting ~220fps in SSFIV. A few minutes in DXHR and Ass Creed seemed perfectly smooth, though neither game really grabbed me.

Moved the computer to a larger 1080p TV and had to bump SSFIV down to 2xAA from 4x or 8x, but I'm still getting over 120fps. And what I thought was ~40fps in Saints Row 4 (all maxed except SSAO off and shadows on medium) was actually 25-30fps, and it shockingly felt "smoothish".

I disagree with the guy who said 4K gaming is taking off and new cards should have 8GB. 4K is just beginning to hit the mainstream and will probably remain prohibitively expensive for at least another couple of years. I think, whether people like it or not, 1080p will remain the dominant resolution used by the masses at least until the end of the current console cycle (another 6-9 years). As to Ryan's question, I can say that the whole Watch Dogs 3GB-for-ultra thing has definitely made me wait for Nvidia's next offering. I had wanted to get a 780, but now I don't feel I'd be future-proofed for 2 years with the 3GB standard. I think for me 4GB is now the minimum and 6GB is the max. The consoles can't utilize more than 6GB, so I think you'd be hard-pressed to ever be VRAM-bottlenecked with 6GB... at 1080p. Which, again, is what I think MOST gamers play at.

In theory I wouldn't mind buying a new GPU for 24/7 Folding@Home to replace my GT 430 in a low-end system that can't deal with much power draw... but what's astonishing is that the GT 430 looks like it's still basically the same card as at least one version of the GT 730! Now that's some serious rebadging, LOL.

Seems like there ought to be a Maxwell part that massively outperforms my Fermi-based 96-core GT 430 in the same power envelope... I hope.

VRAM amount is why, before this past Christmas, I had to upgrade my 3+ year old 2600K @ 4800MHz system's SLI'ed 1GB EVGA 560 Ti SC cards to a pair of 4GB EVGA GTX 770 Classifieds. They perform outstandingly, and I am presently saving up for the 34" LG UM95 21:9 3440x1440 ultra-widescreen IPS monitor. I am going with this resolution because it is only 2.4 times the pixels of 1080p, not 4 times like a 4K UHD panel, and I also love the 21:9 widescreen format for far more peripheral vision along with tons of desktop real estate.

I read the review of the monitor here on Anand; the backlight problems are being addressed, and if you purchase one, make sure the date says at least June production. The April models have the backlight problem. You never know, there might be a better/cheaper 34" 3440x1440 monitor out by the time I can afford it, but as of now my heart is set on LG's UM95.

For those looking for 34" of ultra-widescreen bliss who cannot push as many pixels, LG also has the UM65, a 34" 2560x1080 panel, for around $300 less ($599 vs $899). Who knows, I could be happy with the lower-resolution model, but I would prefer the 1440p one.
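The pixel-count comparison in the comment above checks out; a quick sketch (the resolutions are the standard ones, the ~2.4x figure is the commenter's claim being verified):

```python
# Quick check of the pixel math above: 3440x1440 vs 1080p and 4K UHD.

def pixels(width: int, height: int) -> int:
    """Total pixel count of a display mode."""
    return width * height

p1080 = pixels(1920, 1080)   # 2,073,600 pixels
uwqhd = pixels(3440, 1440)   # 4,953,600 pixels
uhd4k = pixels(3840, 2160)   # 8,294,400 pixels

print(f"3440x1440 is {uwqhd / p1080:.2f}x the pixels of 1080p")  # ~2.39x
print(f"4K UHD is {uhd4k / p1080:.2f}x the pixels of 1080p")     # exactly 4.00x
```

So the ultra-widescreen panel demands roughly 60% of the fill-rate and VRAM load of 4K, which is the commenter's rationale for choosing it.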

So... what about the Titan Black? That is hands-down the single-GPU king. I can understand leaving out the Titan Z, as the price of that thing is just utterly ridiculous, but I should think the Titan Black should be on the list, especially if you absolutely do not want (or, due to space requirements, don't have the room for) more than one card.

The Titan Black won't offer significantly better gaming performance than the 780 Ti. The main differences are more double-precision floating-point compute power and more memory (6GB vs 3GB), neither of which usually matters much in gaming.

Some of us do actual work using graphics cards. Not a single word on the rendering capacity of these cards. If you own Adobe CS6 or below, AMD is worthless. If you use Blender, CUDA is your only option... and Fermi trumps Kepler.

How about a graphics card roundup from a perspective of people who use these graphics cards for work or rendering video for small business. Not all of us can afford a $3000 graphics card.

To be honest, I am sick and tired of all the talk about graphics cards and gaming. People have been doing video, modelling, and other types of high-quality rendering for a number of years now, yet they have been completely ignored except for a benchmark or two.

Have you watched YouTube lately? How many rendered videos by the average Joe are there? There have to be millions. Yes, I do agree that gaming is a huge market. But with AMD and Nvidia releasing what seems to be 5-10 cards every time they do some little update (even just a fake number bump), one would think they would tweak one of their cards to strengthen video/modelling-type rendering and market it as such.

A dude/dudette who does even small wedding/event-type videos (of which I am not one) would benefit from such a video card... and likely can't afford a workstation card like a Quadro.

You're a fringe share of the market; there are publications dedicated to just this, and that is where you should roam. Combining, comparing, and contrasting rendering with gaming in the same article is like Road & Track magazine comparing the performance of a Porsche 911 with that of a Ford F-250. It would be whiplash...

"Finally, the GTX 780 Ti in SLI is also going to be a viable alternative here. From a performance perspective it will trail the AMD setups by 5% or so at 4K, so while it can’t match the AMD setups hit-for-hit it doesn’t significantly fall behind, making it practical to get similar performance in the NVIDIA ecosystem."

What's all this, then? Unless you pick games with that ridiculously heavy AMD favor (which are jokes), 780 Ti SLI beats 290X CF every time. If you use non-reference cards, it still wins. On reference, it still wins. The only thing that beats 780 Ti SLI per number of GPUs is the 295X2, which actually costs around 125% as much as a 780 Ti SLI configuration and has a mess of other downfalls, for like a 1-6% performance advantage.