The Kepler reviews are now posted by all the major review sites. I noticed that Newegg even has a few PNY models in stock ($499 USD).

It seems Nvidia has a winner on their hands. Not only does the 680 trump the 7970 in most situations, it does so while being considerably more efficient. In particular, the 680's shader performance is seriously impressive, allowing it to match the dual-GPU GTX 590 in games like BF3.

New features include adaptive vsync and the ability to run games in surround with a single GTX 680.

I have to say, Nvidia has done a great job with GK104. Yeah, it's not the beast we'd expect from Nvidia at launch, but it still easily takes the single-GPU crown.

Really need to start reading through all these reviews! For those still harbouring the 'it should have been a mid-range GPU' argument, I think the point of Kepler still hasn't become totally clear. Here's what I think happened (pure conjecture, by the way)...

[list]
[*]Fermi performance was awesome, but many noted they were not happy with the size of the GPU, the amount of power required, and the amount of heat produced. Meanwhile people praised AMD for their improved efficiency.
[*]In designing Kepler, the focus is of course to increase performance across the board, but also improve upon those performance per watt figures.
[*]Things carry on happily this way.
[*]Someone leaks GPU die sizes, and people start jumping to conclusions; "It's small so it must be mid-range". This assumption spreads until someone coins the 660 Ti based on 560 Ti size comparisons.
[*]AMD release their 7900 series, and someone from NVIDIA comments that they are pleasantly surprised (it now becomes very clear why).
[*]GTX 680 branding gets leaked, and people conclude, based on the aforementioned comments and the leaked die size, that the 660 Ti has suddenly become a 680.
[*]Speculation ensues.
[/list]

Of course you could argue that it would have helped if NVIDIA had publicised what was coming, but as you all know, that is not how things are done - by either vendor. Consider that you are also implying NVIDIA would, or even could, completely change their chosen direction, branding, marketing, PCB design and so on in such a short space of time. :blink:

Kepler gives us the new fastest GPU in the world by a good margin, with a raft of new features (including Surround off a single GPU), and an amazing improvement in performance per watt - twice the performance per watt per SM, as mentioned in the linked article. The conclusion in that release sums it up well:

[quote]Gamers told us they want GPUs that are cooler, quieter, and more power efficient. So we re-designed the architecture to do just that. The GeForce GTX 680 consumes less power than any flagship GPU since the GeForce 8800 Ultra, yet it outperforms every GPU we or anyone else have ever built.[/quote]

And it comes in at the same price point (or cheaper in many cases) as the AMD equivalent on launch day... moderator title completely aside, I think it's an awesome piece of work.
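The performance-per-watt claim above is easy to sanity-check with a little arithmetic. A minimal sketch - the FPS and board-power figures here are hypothetical placeholders for the sake of the maths, not measured results:

```python
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

# Hypothetical figures purely to illustrate the arithmetic:
# a Fermi-class card at 244 W vs. a Kepler-class card at 195 W.
fermi = perf_per_watt(avg_fps=60.0, board_power_w=244.0)
kepler = perf_per_watt(avg_fps=80.0, board_power_w=195.0)

improvement = (kepler / fermi - 1) * 100
print(f"Efficiency improvement: {improvement:.0f}%")  # prints "Efficiency improvement: 67%"
```

Plug in real review numbers for your card of choice and the comparison takes thirty seconds.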

[quote name='jimbonbon' date='22 March 2012 - 04:12 PM' timestamp='1332450742' post='1386476']
Really need to start reading through all these reviews! For those still harbouring the 'it should have been a mid-range GPU' argument, I think the point of Kepler still hasn't become totally clear. Here's what I think happened (pure conjecture, by the way)...

[list]
[*]Fermi performance was awesome, but many noted they were not happy with the size of the GPU, the amount of power required, and the amount of heat produced. Meanwhile people praised AMD for their improved efficiency.
[*]In designing Kepler, the focus is of course to increase performance across the board, but also improve upon those performance per watt figures.
[*]Things carry on happily this way.
[*]Someone leaks GPU die sizes, and people start jumping to conclusions; "It's small so it must be mid-range". This assumption spreads until someone coins the 660 Ti based on 560 Ti size comparisons.
[*]AMD release their 7900 series, and someone from NVIDIA comments that they are pleasantly surprised (it now becomes very clear why).
[*]GTX 680 branding gets leaked, and people conclude, based on the aforementioned comments and the leaked die size, that the 660 Ti has suddenly become a 680.
[*]Speculation ensues.
[/list]

Of course you could argue that it would have helped if NVIDIA had publicised what was coming, but as you all know, that is not how things are done - by either vendor. Consider that you are also implying NVIDIA would, or even could, completely change their chosen direction, branding, marketing, PCB design and so on in such a short space of time. :blink:

Kepler gives us the new fastest GPU in the world by a good margin, with a raft of new features (including surround off a single GPU), but also an amazing improvement in performance per watt - twice the amount per SM, as mentioned in the linked article. The conclusion in that release sums it up well:

And it comes in at the same price point (or cheaper in many cases) as the AMD equivalent on launch day... moderator title completely aside, I think it's an awesome piece of work.
[/quote]

I agree wholeheartedly, Jimbonbon. I truly don't understand what the naysayers are complaining about. The GTX 680 trumps the dual-GPU GTX 590 in games like BF3 while consuming around 190W. On top of that, the new features the 600 series brings to the table are most welcome. This is a GPU we should all be excited about.

I'm happy with the direction Nvidia is going here. They've developed a lean GPU that's wicked fast. What's not to like? I'll admit that I would have liked to see the price a bit lower, but considering the competition I think it's fair.

Nicely said, jimbonbon. I never liked adding anything to that long-overdue speculation thread; I hardly even looked at it. Posting speculation, and worse, arguing about something none of us actually knew anything about, felt like a waste of time.
That said, I stand by my earlier view: before the GTX 680's release, the HD 7970 was without question the ideal choice over anything available at the time, and I would have recommended it to anyone without hesitation.
But now things are looking nicer. :tongue:

I hate the prices in NZ though. I assume it will launch all over the world, right? Here it will launch at NZ$999, while converting US$499 to NZD gives [url="http://www.xe.com/ucc/convert/?Amount=499&From=USD&To=NZD"]this[/url].

So even if I buy it from the US and pay for shipping to get it sent to me, I still pay far less than buying it here in NZ.
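The gap is easy to put a number on. A quick sketch - the exchange rate below is an assumption for illustration (check xe.com for the live rate), and the shipping cost is hypothetical:

```python
USD_TO_NZD = 1.22  # assumed exchange rate, early 2012; use the live rate

us_price_usd = 499.0
nz_price_nzd = 999.0
shipping_nzd = 60.0  # hypothetical cost of shipping from the US

imported_total = us_price_usd * USD_TO_NZD + shipping_nzd
saving = nz_price_nzd - imported_total
print(f"Imported total: {imported_total:.2f} NZD")
print(f"Saving vs. local price: {saving:.2f} NZD")
```

Even with a generous shipping estimate, importing comes out hundreds of dollars ahead under these assumptions.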

[quote name='chiz' date='23 March 2012 - 03:28 AM' timestamp='1332469729' post='1386630']
Nvidia reps should really consider the ramifications of treating their customers as if they're uninformed or ignorant. In the end it just makes you look dishonest - is that really the image you want to put out there? Better to just own up to it: LilK performed better than expected, Tahiti sucked, and BigK isn't quite ready yet, so we get a mid-range part selling at $500 flagship prices.

There are about a hundred bits of evidence all pointing to GK104 initially targeting mid-range performance SKUs before being bumped up to prime time as a result of Tahiti's poor showing. Here's a nice summary; many of the breadcrumbs were provided by Nvidia themselves - they just didn't do a very good job of sweeping the evidence under the rug. But I guess that's what happens when you spend 2+ years developing a product line and then decide to change gears at the eleventh hour before launch.

http://www.techpowerup.com/forums/showthread.php?t=162901

That said, the only thing not to like about lilKepler is its pricing. $400-$450 would have made a lot more sense and been a better reflection of its performance improvement over last-gen's GTX 580, but at the very least it provides a glimpse of what is to come with BigK. Should be great - it will be nice to see what Nvidia's flagship is capable of if their mid-range performs this well! :thumbup:
[/quote]

BigK (as you call it) may actually be ready for prime time, but the green team sees no need to release it at this point. If this is the case, I really can't blame them from a business standpoint (that doesn't mean I like it though). Either way, we don't know exactly what's going on behind the scenes, so this is all just speculation.

What we know for sure is that the 680 chews through modern games with ease. Call it "mid-range" all you want, but this GPU offers high-end performance.

[quote name='slamscaper' date='22 March 2012 - 10:47 PM' timestamp='1332470864' post='1386636']
BigK (as you call it) may actually be ready for prime time, but the green team sees no need to release it at this point. If this is the case, I really can't blame them from a business standpoint (that doesn't mean I like it though). Either way, we don't know exactly what's going on behind the scenes, so this is all just speculation.

What we know for sure is that the 680 chews through modern games with ease. Call it "mid-range" all you want, but this GPU offers high-end performance.
[/quote]
I'm sure BigK's release schedule has been impacted by Tahiti's poor showing. Regardless of when it could or would have been ready, it's undoubtedly going to come out later than intended, because Tahiti sucks and lilK is now a GTX 680.

As for performance, there's no doubt the 680 provides exceptional performance for a mid-range part, but held to the same standards as previous next-gen flagship SKUs on a new process node in this flagship price range, it's the worst price:performance shift since the 9800 GTX. You can map out the increases yourself: the GTX 680 is ~35-40% faster than the GTX 580. Compare the GTX 480 to the GTX 285, or the GTX 280 to the 8800 GTX, and you'll see exactly what I'm talking about.

People don't seem to be considering the long-term ramifications here though. Mid-range GK104 selling as high-end sets a terrible precedent: it allows Nvidia to keep selling mid-range ASICs at high-end prices as long as they meet the (ever-decreasing) expectations of how much of an improvement makes upgrading worthwhile. Nowadays, it seems +35-40% over 16-24 months is enough to command the same price premiums, when it has always been +50% or more in the past. Bottom line: you get less improvement and innovation for your money than you did in the past. A shame really - Moore's Law is dead in the GPU space now as well (I guess JHH will have to take that bullet point off his next slide deck).
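The generational comparison described above boils down to simple relative-uplift arithmetic. A sketch using the rough percentages quoted in this thread - these are forum figures, not benchmarks of mine:

```python
def uplift_pct(new_perf: float, old_perf: float) -> float:
    """Percentage performance improvement of a new part over its predecessor."""
    return (new_perf / old_perf - 1) * 100

# Index each predecessor at 100 and express the successor relative to it.
this_gen = uplift_pct(137.5, 100.0)   # GTX 680 vs GTX 580, ~35-40% claimed in this thread
past_norm = uplift_pct(150.0, 100.0)  # the historical ~+50% generational norm claimed above

print(f"This generation: +{this_gen:.1f}%")
print(f"Historical norm: +{past_norm:.1f}%")
```

Swap in measured averages from the reviews for your preferred pair of cards and you can judge the precedent argument on your own numbers.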

Indeed, it would be pointless to opt for GTX 580s at this point, even considering current prices. The GTX 680s are sold out - not surprising though. We'll see how the drivers hold up in the near future anyway.

[quote name='Tepescovir' date='22 March 2012 - 12:11 PM' timestamp='1332439911' post='1386390']
The only benchmarks I'm interested in are with Nvidia Surround. My 580s will already max out every game out there on a single display, but I want to play BF3 in Surround, which three 580s can't do.
[/quote]

I want to see Surround numbers on two 680s myself, specifically in comparison to 580s and 7970s, because the results I found show the 7970s beating the 580s by 45% at 5760x1080 in BF3 with Ultra settings, HBAO and no MSAA (post-process AA on High). I'd love to see how the 680s do here, as it's a BIG selling point for me: I run that resolution and would really like to run this game maxed out or nearly maxed out (since no AA doesn't count as maxed out).

So if/when anyone gets two of these things, and you have Surround and BF3, PLEASE post a run-through on a map or something.

I'm going to do the same when I get my next set of cards, and I'll gladly do it with 580s as well, as long as I have them at least.
