To pave the way for the GeForce GTX 680, which will arrive later this month in limited quantities, with wide availability in the months to come, NVIDIA is cutting the price of its GeForce GTX 580 graphics card. The GF110-based behemoth of 2011 will now start at 339.80 EUR, according to European price aggregators such as Geizhals.at. The new price makes the GeForce GTX 580 1.5 GB cheaper than the Radeon HD 7950, giving it a slightly better price-performance ratio. The 3 GB variants of the GeForce GTX 580 are priced similarly to the HD 7950. The GTX 570 now starts at 249 EUR.

Can't wait to see how this card performs, and whether the dynamic OC feature works well or is a flop. I'm eagerly awaiting NV's offerings because I want to give them a crack; I haven't used an NV card since the 8800 GTS 320 MB.

Well, for 340 EUR I see it as good competition even against the current HD 7000 lineup, despite it being the older series. With the nonsensical prices around the HD 7950 and HD 7970 (both well over 400 EUR), the GTX 580 is still a good option.

What Nvidia would be wise to do is release their high-end Kepler products and go for the crown, and die-shrink the GTX 580 and 570 to go after the HD 7800 series.

The G80-to-G92 transition proved you could teach an old dog new tricks: by shrinking an existing die, making minor tweaks, and upping the clock speeds, you could make a good architecture last in the market for years while your competitor struggled to scale a new architecture down from its high end and keep it competitive.

A more efficient GTX 580 would be a perfect card: knock it down to 28 nm, rebalance the clock speeds accordingly, and even without DX11 I bet they'd sell very well against the 7800 series. And a 28 nm GTX 570/560 Ti would make a lot of Nvidia's cluttered midrange obsolete.


wow

And there I was thinking the 580 was a revision of the 480 and had DX11 :|

I also don't think it's as easy as just shrinking the existing GTX 580 to 28 nm; it would take a hefty amount of reworking. Kepler is intended to replace Fermi as a similar but much more efficient (in terms of power draw and heat output) iteration. I don't see them wasting time reworking the old design when they already have the improved version in production.


It was never priced competitively. None of the high-end cards have been for years. One company, usually Nvidia, sets a price they want you to pay, and then the other company releases its cards in between those price points. There is no competition. It's price fixing without actual communication.


I'm not saying I agree with it, not in the slightest, and it is outright price fixing. At this rate Nvidia's cards will be priced to compete with AMD's, so if the GTX 680 is faster than the HD 7970, it will probably launch at nearly $600.


Flushing the channel before a new launch, what a shocker :lol:


I just confused my terms: I intended to write D3D11.1 but wrote DX11, since DirectX is what we're always used to talking about, not Direct3D. The HD 7900 series launched as the first with Direct3D 11.1 support, which is one of the major changes in Windows 8 and one of the bigger selling points of this next generation of GPUs. Obviously Kepler will be Direct3D 11.1 as well, whereas Fermi was not; Fermi was Direct3D 11. But there you have it, a mistake. Good thing I'm not a brain surgeon.

$4 cheaper, the 7950 is still faster in the most recent games (excluding Skyrim), and it consumes 80-150 W less than the 580 at peak and maximum. But if you love your card, then what can anyone say?


Maximum?! That can only be reached while stress testing with FurMark. What gamer does that?

The figures that matter are the average and peak values, with differences of 88 W and 85 W respectively. Still a big gap, but not that bad considering we're comparing an old 40 nm GPU to a 28 nm one. All in all, the 7950 looks to be the better choice, unless someone wants things like CUDA and/or PhysX.

It really sucks that we are going to have to pay $499.99 for the GK104 (now the GTX 680) that was slated to be the GTX 660 Ti at $299.99, because AMD's 7950 and 7970 were not as powerful as NV thought they would be.


To me, this price cut suggests the GTX 680 will either be significantly cheaper than $500, say $350-400, or significantly faster than most of us expect, because there would be no rush to lower the GTX 580 to 330 otherwise. If they were going to sell it for $500 and it were 25% faster than the GTX 580, there would still be a place for the GTX 580 at 400 or so; no need to go as low as 330, and 250 for the GTX 570. They would be making the new offering look very overpriced, and it forces AMD to lower prices too, BEFORE the GTX 680 launches, which is shooting themselves in the foot, because it's the GTX 680 that needs the fame, not the EOL'd card.


$499 for the GTX 680? Lol, that would be a bargain. I've heard prices are going to be a bit higher than that.


It doesn't suck; no one is making you buy it, and if you can't stomach the price, some will. So Nvidia isn't pressured to release a high-end part, and instead tries to implement some new whiz-bang feature on lower chips to stay competitive... and that's somehow AMD's fault?

So in your mind, half a year ago or more, Nvidia knew "with certainty" that the reference 7970 would bring about a 12-18% improvement over the GTX 580, and on that basis they shelved the "top dog" and set out to make GK104 competitive, hinging on whether they could successfully implement some untried feature to boost performance as needed.

Nvidia was able, even if it was already a "think tank" project, to completely evaluate, implement, and test everything needed to make the dynamic-profile feature function flawlessly in those six months? A tall order, but plausible.

You deduce AMD placed the bar too low, so Nvidia runs up to it, figures "ah, what the heck, watch this", rolls its shoulders back, and limbos under the bar, wowing the faithful. They throw their arms up in victory, saying they got to the other side, though under new rules, ok? Wait for their next attempt.

So the GK100 is such a virtuoso in performance, power, and price that Nvidia couldn't bring themselves to market it?

Don't get me wrong: if it works and they planned it this way (even as a contingency) when first setting out to develop Kepler, kudos to them, and a good game changer.

But if they got complacent six months back and changed their game plan based on speculation about performance, wow, they risked a lot on something that might, or should, have at least been shown previously in a trial product.