Well, you're essentially buying a computer on a card these days: a high-performance GPU with high-performance, pricey RAM, all of which needs high-quality power components to run. GPUs are now computers inside of computers.

I think it's simply that GPUs can't get cheaper to the extent that CPUs have, since the die sizes are so much larger. I certainly wouldn't say they're getting MORE expensive - I paid $370 for my 8800GTS back in early 2007, and $400 for a 6800 in early 2005 before that.

High-end GPU chips are much larger than high-end CPU chips nowadays. The GF110 has 3bn transistors. For comparison, a quad-core i7 has only 700m and a 6-core Athlon 900m, so you get 3 or 4 times as many CPUs from a wafer as you can GPUs. The quad-core Itanic and octo-core i7 are both around 2bn transistors but cost more than most gaming rigs for just the chip.
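To put rough numbers on that dies-per-wafer argument, here's a quick sketch. It assumes, as a simplification, that die area scales directly with transistor count (in reality, process node and transistor density differ between these chips, so these are ballpark ratios only):

```python
# Transistor counts (in millions) quoted in the comment above.
transistors_m = {
    "GF110 (GPU)": 3000,          # ~3bn
    "quad-core i7": 700,          # ~700m
    "6-core Athlon": 900,         # ~900m
}

gpu = transistors_m["GF110 (GPU)"]
for name, count in transistors_m.items():
    # If die area ~ transistor count, dies per wafer scale inversely.
    ratio = gpu / count
    print(f"{name}: ~{ratio:.1f}x as many dies per wafer vs GF110")
```

This lands right where the comment does: roughly 4.3x as many quad-core i7 dies and 3.3x as many 6-core Athlon dies per wafer as GF110 dies, i.e. "3 or 4 times as many CPUs from a wafer".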

GDDR3/GDDR5 memory is also significantly more expensive than the much slower DDR3 used by the rest of the computer.

Don't forget that in a graphics card you're getting a larger chip with more processing power, a board for it to run on, AND memory. 1GB+ of ultra fast memory and the tech to get it to work with the GPU is not cheap.

So your comparison needs to factor in CPU + mobo + memory, and even then that combination doesn't have the capability to process graphics at the needed rate.

Generic processing that is slower at certain tasks will always be cheaper than specialized, faster processing that excels at said task.

High-end graphics cards were always very expensive. They're for enthusiasts, not the majority of the market. I think prices have come down for the majority of consumers. Mostly thanks to AMD's moves, budget cards are now highly competitive and offer acceptable performance in most games at acceptable quality. I think the high-end cards just aren't as necessary as they were 'back in the day', but then, maybe I just don't play games as much as I used to. To me, it was always the case that you'd be paying an arm and a leg to have an upper-tier card, and that hasn't changed.

I don't understand why so many cards still cling to DVI. Seeing that Nvidia is at least including native HDMI on their recent generations of cards is nice, but why, in 2010, on an enthusiast-level graphics card, are they not pushing the envelope with newer standards?

The fact that AMD includes DVI, HDMI, and DisplayPort natively on their newer lines of cards is probably what's going to sway my purchasing decision this holiday season. Something about having all of these small, elegant, plug-in connectors and then one massive screw-in connector just irks me.

Aside from Apple, almost no one uses DP. Assuming it wasn't too late in the life cycle to do so, I suspect the new GPU used in the 6xx series of cards next year will have DP support so Nvidia can offer multi-display gaming on a single card, but only because a single DP clockgen (shared by all DP displays) is cheaper to add than 4 more legacy clockgens (one needed per VGA/DVI/HDMI display).

Market penetration starts by companies supporting the "cutting edge" of technology. DisplayPort has a number of advantages over DVI, most of which would be beneficial to Nvidia in the long run, especially considering the fact that they're pushing the multi-monitor / combined resolution envelope just like AMD.

Perhaps if you only hold on to a graphics card for 12-18 months, or keep a monitor for many years before finally retiring it, the connectors your new $300 piece of technology provides won't matter to you. If you're like me and tend to keep a card for 2+ years while jumping on great monitor deals every few years as they come up, it's a different ballgame. I've had DisplayPort-capable monitors for about 2 years now.

I invested just under $1000 in a 30" professional 8-bit PVA LCD back in 2006 that is still better than 98% of the crappy 6-bit TN panels on the market. It has been used with 4 different video cards and supports DVI, VGA, Component HD, and Composite SD. It has an ultra-wide color gamut (113%), great contrast, and a matte screen with super deep blacks and perfectly uniform backlighting, along with memory card readers and USB ports.

Neither DisplayPort nor any other monitor on the market offers me anything new or better in terms of visual quality or features.

If you honestly see an improvement in quality spending $300 every 18 months on new "value" displays, then I feel sorry for you; you've made some poorly informed choices and wasted a lot of money.

It is somewhat disappointing. People with existing screens probably don't care, and the cheap TN screens still pimp the DVI interface, but all of the high-end IPS panel displays include either HDMI, DP, or both. Why wouldn't a high-end video card have the matching outputs?

Input lag depends on the screen's controller; you're thinking of pixel response time. Yes, TN is certainly faster than IPS for that. I still wouldn't get a TN though; IPS isn't far enough behind in response time to negate the picture quality improvement.

Due to the rarity of HDMI 1.4 devices (needed to go above 1920x1200), replacing a DVI port with an HDMI port would result in a loss of capability. This is aggravated by the fact that, due to their sticker price, 30" monitors have a much longer lifetime than 1080p displays, and their owners would be even more outraged at being told they had to replace their screens to use a new GPU. Mini-DVI isn't an option either, because it's single-link and has the same 1920x1200 cap as HDMI 1.3.

Unfortunately there isn't room for anything except a single mini-HDMI/mini-DP port to the side of two DVIs; installing it on the top half of a double-height card, as ATI has done, cuts into the card's exhaust airflow and hurts cooling. With the 5xx series still limited to 2 outputs, that's not a good tradeoff, and HDMI is much more ubiquitous.

The fiasco with DP-DVI adapters and the 5xxx series cards doesn't exactly make them an appealing option for consumers either.

That makes good sense too; you certainly wouldn't want to drop an existing port to add DP. I guess it really comes down to that cooling-vs-port-selection problem.

I wonder why ATI stacked the DVI ports? Those are the largest ports of the three and so block the most ventilation. If you could stack a mini-DP over the mini-HDMI, it would be a pretty small penalty. It might even be possible to mount the mini ports on edge instead of horizontally to keep them all on one slot.

"...Whereas the GTX 580 took a two-tiered approach on raising the bar on GPU performance while simultaneously reducing power consumption, the GeForce GTX 470 takes a much more single-tracked approach. It is for all intents and purposes the new GTX 480, offering gaming performance..."

Seriously? We're still going to preach on this topic? I was one of those in disagreement with the way they handled the launch of the AMD 68XX series cards, but let it die already. This is a LAUNCH article, and it deals with the design of the card and the performance of the reference card. As such it should not contain comparisons to OC'd cards, neither AMD's nor NVIDIA's. In a follow-up article, however, it should be compared to non-reference designs from both camps.

If, when the AMD 69XX series cards come out, they include OC'd Nvidia cards, THEN you can rant and rave. But I can guarantee you there is no way they would do that after the fallout of the previous launch.

I was just wondering why there are no StarCraft II performance figures in the review. Understandably, there is no benchmark feature implemented in the game, benchmarks are annoying and time-consuming to run, and of course the card can handle it. But it is the only game some of us play, and the figures might help us decide if it's "worth it".

Any idea why the 580 SLI takes such a huge dump going from 1920 res to 2560 res? It loses half its framerate! It has 1.5 gigs of memory vs the 5870's 1 gig, yet the 5870 CrossFire goes from 50 fps at 1920 to 37 at 2560, while the 580 SLI goes from 72 fps at 1920 to 36 at 2560.
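For perspective on why that drop looks anomalous, here's a quick back-of-envelope check. The figures are the ones quoted in the comment above; the resolutions are assumed to be 1920x1200 and 2560x1600, the usual review settings:

```python
# Pixel counts at the two (assumed) test resolutions.
px_low = 1920 * 1200    # 2,304,000 pixels
px_high = 2560 * 1600   # 4,096,000 pixels
px_ratio = px_high / px_low
print(f"pixel ratio: {px_ratio:.2f}x")  # ~1.78x more pixels to render

# fps figures quoted in the comment (1920 -> 2560).
results = {
    "GTX 580 SLI": (72, 36),
    "5870 CrossFire": (50, 37),
}
for card, (fps_low, fps_high) in results.items():
    drop = fps_low / fps_high
    print(f"{card}: framerate drops {drop:.2f}x vs {px_ratio:.2f}x more pixels")
```

The 580 SLI's framerate drops 2.00x while the pixel load only rises ~1.78x, so it scales worse than the raw pixel count would suggest; the 5870 CrossFire's 1.35x drop scales better than pixel count. That mismatch is what makes a bottleneck (memory, drivers) a plausible suspect.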

It seems that AMD is finally getting CrossFire scaling well. The new 68xx cards are better than the old, but the 5870 is scaling as well as the Nvidia cards in a lot of cases. My guess is that with CrossFire or SLI the memory bandwidth is less of an issue. You don't fully double your framerate, after all. It is likely more dependent on the GPU clock speed, which is an advantage for AMD.

I am really just taking a guess here. The other option is that it is simply an immature driver and will be fixed later.

Only when you use a dual-AMD-card configuration will you realize how much you suffer from its poor drivers. It's fast but buggy, and I've been waiting too long for AMD to finally come up with a Catalyst that at least runs as stably as the nVidia driver. So please AMD, give us a nice driver!

Hey, good review overall for an apples-to-apples comparison. I would have liked to see what it did overclocked, as some have mentioned. On the Metro 2033 page, the article says the following:

"While Metro was an outstanding game for the GTX 580 to show off its performance advantage, the situation is quite different for the GTX 470. Here it once again fulfills its role as a GTX 480 replacement, but it’s far more mortal when it comes to being compared to other cards. "

In the first sentence, shouldn't it be "...the situation is quite different for the GTX 570" and not the 470?

Much as I love this site, the color scheme for the charts is really getting old. Why can't all the colors be the same EXCEPT for the one being reviewed? We're mostly all adults and can read, so the other GPUs in the charts could be left all one color.

Some other sites do this, and it is much easier to read what is actually being reviewed, even if the review color is always the same on each chart. Extra colors just add to the clutter of the charts; the human eye/brain gets distracted easily.

The colors are still a work in progress. We had some requests for additional colors in GPU articles to highlight the products we're immediately comparing the reviewed product to, which is what I did for this article. Certainly, if you guys think this is too much, we can go back to fewer colors.

I really like this idea. All one color for 'set' of reviews (if multiple), and one color for primary.

BTW, I didn't know others were asking for more colors. I guess do what the others want. For me personally, I like one color for the primary card and one color for all the others. It is just the easiest to distinguish at first glance.

I don't know why I should go for the Nvidia GTX 580/570 series when I am getting almost the same (or better) performance with ATI Radeon cards for a lower price. The ATI HD 5970 is almost $30 cheaper than the GTX 580 but outperforms it in every single test. The 5870 is not very close, but at least somewhat close, and the performance of the GTX 570 over the 5870 does not justify the $100 gap between the two. Anyway, I think NVIDIA is just producing cards for name's sake. With the HD 6900 series coming up, I will not be surprised if they offer a huge performance leap over the GTX 580/570 for the same price. Again, it will be like when ATI released their first batch of DX11 cards and NVIDIA was struggling hard to come up with an answer to them.

Ryan Smith, from his conclusion: "As with the GTX 580 we’d pick the simplicity of a single-GPU setup over the potential performance advantages of a multi-GPU setup, but this is as always a personal decision."

Going with dual GPUs (specifically Nvidia) has its advantages. You get to experience Nvidia 3D Surround. Yeah, I know the additional costs, the additional monitors, etc. that this entails. But if GTX 460 1GB SLI can bring so much to the table, I can only imagine what GTX 560 1GB SLI will do when it comes.

I only wish the development on the display side would catch up with the development on the GPU side (now that AMD has only just jumped on the 3D bandwagon).

"The GTX 570 is fast enough to justify its position and the high-end card price premium, but at $100 over the GTX 470 and Radeon HD 5870 you’re paying a lot for that additional 20-25% in performance."