"The GK110 won't reach the market until the fourth quarter of 2012, and multiple folks from Nvidia forthrightly admitted to us that those chips are already sold out through the end of 2012. All of those sales are to supercomputing clusters and the like, where each chip commands a higher price than it would aboard a video card."

And that's assuming everything goes well. Based on Nvidia's recent history, it probably won't. With more than double the transistors of GK104, GK110 will likely be Nvidia's largest chip ever by die size. The last huge die that Nvidia was able to bring to market without massive delays was G80. For those keeping score, that was in 2006.

And then once Nvidia does have dies, what will they do with them? Tesla cards at $4000 each? Quadro cards at $4000 each? GeForce cards at $800 each? The last option gets nothing (or perhaps token quantities for a paper launch) until the first two are satisfied.

The good news is that GK110 will be heavily based on the established Kepler architecture, and built on what will by then be a mature process node. So two of the major factors that fouled the launches of GK10* and GF10* will be absent. But 600+ mm^2 dies are still hard to make if you want them to work properly, which is why AMD declines to even try.

Still, once Nvidia gets the cards working, GK110-based GeForce cards will be the undisputed king of graphical performance. Ever since AMD went to the small die strategy four generations ago, their plan was to cede the top end to Nvidia, while building the best $100 and $200 and $300 cards that they could. The problem for Nvidia is that they haven't been able to fill that top end properly since G80. When your die is 70% bigger than your competitor's, you really need to win in performance by more than 20% or so.
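That die-size vs. performance trade-off is easy to sanity-check with rough numbers. The figures below are illustrative assumptions (a Tahiti-class 365 mm^2 die as the baseline), not measured die sizes or benchmark results:

```python
# Back-of-the-envelope check of the die-size argument.
# All numbers are rough assumptions for illustration only.
competitor_die = 365.0              # mm^2, e.g. a Tahiti-class die
big_die = competitor_die * 1.7      # a die roughly 70% larger

perf_win = 1.20                     # big die is only 20% faster

perf_per_mm2_small = 1.0 / competitor_die
perf_per_mm2_big = perf_win / big_die

ratio = perf_per_mm2_big / perf_per_mm2_small
print(f"big die perf/mm^2 relative to small die: {ratio:.2f}x")
# 1.20 / 1.70 = ~0.71x: the bigger die delivers ~29% less performance
# per unit of silicon, which is why a 20% win isn't enough.
```

Since cost scales roughly with die area (and worse, since yields fall as dies get bigger), winning by only 20% with a 70% larger die means losing badly on performance per dollar of silicon.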

Now that Nvidia finally has a basic architecture competitive with AMD's, they have a chance at having a top end card that is by far the best. A single GK110-based card has a serious shot at being better than two of AMD's top end cards in CrossFire, at least if you ignore average frame rates and consider actual gaming experience, where CrossFire and SLI aren't nearly as good as you'd expect from the average frame rates.

But that's only if Nvidia can deliver. Can they? AMD has had their entire Southern Islands lineup available for more than two months now, while Nvidia has had nothing but paper launches of Kepler until some GeForce GTX 670s started showing up recently. GK110 will probably be up against the top end GPU from AMD's next generation, not Tahiti, but with it unlikely that AMD will either go with an enormous die or find miraculous RV770-style efficiency improvements, GK110 will probably win the top end, and probably by a lot. Delays won't change that unless GK110 barely beats AMD's 20/22 nm GPUs to market, and that's probably not coming until 2014.


Originally posted by lizardbones: Well, I'm kind of waiting for DDR4 to come out so I should be fine. :-)

That's not relevant to video cards. Also, you should probably be thinking 2014 for that.

I can't even fake my way through an involved financial discussion about why it's relevant to me. I'm just not upgrading for awhile and my money and DDR4 memory will come into existence at roughly the same time. It's simply a happy coincidence that a decent performing card could be a year old at the same time, making things a nice, neat little package when I upgrade.


Originally posted by Quizzical: By the time DDR4 memory is ready, there will probably be 20 or 22 nm GPUs ready. You'll want a Radeon HD 9000 series or GeForce 800 series card then, not GK110.

If they are "new", they'll be out of my price range. I don't know how many people purchase video cards the way I do, but I seem to be much happier with my cards when I buy them in the $150-$200 price range. If those cards are in that price range, then that's exactly what I'd be looking to buy. I could be wrong, but I think with a year or so of time on it, the GK110 would be near that price range.

It's a little silly for me to plan that far ahead though...I think the only thing that will happen for sure by then is that DDR4 will be out, it will use less power and it'll be cheaper to buy relative to DDR3 memory. I'll probably have to start all my research on everything else over from scratch. I'll probably be reading Quizzical threads on MMORPG.com. :-)

I believe that by the time DDR4 is here, so will Maxwell (2014 Q1/Q2), the newest Nvidia cards. I'm building a new rig myself soon; my other one died. I'll be getting a GTX 670 until Maxwell, if I'll even need it by then.

I don't have money for it myself, but with it out, it would push the current GK104 and its direct successors into the mid-range where they belong, making them more affordable.

Because at the moment GPUs are too expensive for me. The 7xxx and 6xx series launched at totally absurd prices from my consumer point of view.

Good thing they've recently started to get cheaper (AMD is cutting prices on the 7xxx series).

I actually might be able to afford a 660-670 or 78xx-7950 card, plus an on-par CPU, mobo, monitor, etc., in a few months if this trend continues.

I currently have a laptop and want to switch to a desktop for my gaming needs.

I don't want to buy low-end or even the lower level of mid-range, but I can't afford the current higher levels of mid-range or the lower levels of high-end at current prices, especially when the economy and my personal income are so uncertain these days.

Originally posted by Quizzical: By the time DDR4 memory is ready, there will probably be 20 or 22 nm GPUs ready. You'll want a Radeon HD 9000 series or GeForce 800 series card then, not GK110.

If they are "new", they'll be out of my price range. I don't know how many people purchase video cards the way I do, but I seem to be much happier with my cards when I buy them in the $150-$200 price range. If those cards are in that price range, then that's exactly what I'd be looking to buy. I could be wrong, but I think with a year or so of time on it, the GK110 would be near that price range.

It's a little silly for me to plan that far ahead though...I think the only thing that will happen for sure by then is that DDR4 will be out, it will use less power and it'll be cheaper to buy relative to DDR3 memory. I'll probably have to start all my research on everything else over from scratch. I'll probably be reading Quizzical threads on MMORPG.com. :-)

You have the situation very backwards. Newer GPUs that are die shrinks will soon be cheaper than older GPUs that give the same performance. GK110 will probably be around 600 mm^2. That's a huge die, and tremendously expensive to build. If Nvidia sells cards based on that die for $200, then they'll probably be losing money on every card sold.

If you do a die shrink to a 20 nm process node, then you can get the same number of transistors in about 300 mm^2. That's still a fairly big die, but it will be a lot more affordable. Do another die shrink to a 14 nm process node and now you can get the same number of transistors in about 150 mm^2. Now you have a GPU that you can readily put in $200 cards and still make a handsome profit, while giving the same performance as GK110 and only using half as much power.
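The area scaling in that paragraph follows from a simple rule of thumb: a full-node shrink roughly halves die area for the same transistor count. A quick sketch of that arithmetic (an idealized assumption; real shrinks rarely scale perfectly):

```python
# Sketch of how die area scales across full-node shrinks, assuming
# area roughly halves per node. The 600 mm^2 starting point is the
# approximate GK110-class die size discussed above; the per-node
# halving is an idealized rule of thumb, not a foundry guarantee.
area = 600.0  # mm^2 at 28 nm
for node in ("20 nm", "14 nm"):
    area /= 2
    print(f"{node}: ~{area:.0f} mm^2")
# 20 nm: ~300 mm^2
# 14 nm: ~150 mm^2
```

Two shrinks take a die that only makes sense in $500+ cards down to a size that fits comfortably in the $200 segment, which is why waiting for the shrunk part is usually the cheaper path to a given performance level.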

-----

Conversely, the transition from DDR3 to DDR4 won't be a money-saver, or at least not at first. What saves money on memory is that you can do a die shrink to make chips with double the capacity of the previous generation for the same price. But initially those die shrinks happen with both DDR3 and DDR4, so DDR4 doesn't have a cost advantage. Eventually you stop doing die shrinks with DDR3 and just keep producing it on old process nodes until you stop producing it entirely, as there isn't enough volume to justify the cost of doing a die shrink with it. That's why DDR2 hasn't gotten any cheaper in recent years.

But at first, DDR4 is going to be a low volume product, and DDR3 will still get most of the volume. So DDR4 will probably be more expensive at first. Eventually most memory will shift to DDR4 rather than DDR3, and then DDR4 will keep getting cheaper, while DDR3 doesn't. But that will take a while.

I don't have money for it myself, but with it out, it would push the current GK104 and its direct successors into the mid-range where they belong, making them more affordable.

Because at the moment GPUs are too expensive for me. The 7xxx and 6xx series launched at totally absurd prices from my consumer point of view.

Good thing they've recently started to get cheaper (AMD is cutting prices on the 7xxx series).

I actually might be able to afford a 660-670 or 78xx-7950 card, plus an on-par CPU, mobo, monitor, etc., in a few months if this trend continues.

I currently have a laptop and want to switch to a desktop for my gaming needs.

I don't want to buy low-end or even the lower level of mid-range, but I can't afford the current higher levels of mid-range or the lower levels of high-end at current prices, especially when the economy and my personal income are so uncertain these days.

I don't see what's so absurd about new generation prices. All of the Radeon HD 7700 series cards on New Egg are $150 or less before shipping and rebates. GK107 cards will probably be cheaper yet once they come to desktops. Higher end cards cost more, but higher end cards always cost more. If you were hoping to pick up a $50 card, then both AMD and Nvidia aren't making that this generation because that level of performance is handled by integrated graphics.

And don't count on GK110 to shove prices downward. It's going to be a huge, expensive die. If you want to predict how Nvidia will price it, then look at how Nvidia priced it the last time they had a huge die that was as good as it should have been. The GeForce 8800 Ultra was $800, and the next bin down GeForce 8800 GTX was $600. That's not going to put pricing pressure on $400 cards, let alone $200 cards.

Originally posted by Quizzical

I don't see what's so absurd about new generation prices. All of the Radeon HD 7700 series cards on New Egg are $150 or less before shipping and rebates. GK107 cards will probably be cheaper yet once they come to desktops. Higher end cards cost more, but higher end cards always cost more. If you were hoping to pick up a $50 card, then both AMD and Nvidia aren't making that this generation because that level of performance is handled by integrated graphics.

And don't count on GK110 to shove prices downward. It's going to be a huge, expensive die. If you want to predict how Nvidia will price it, then look at how Nvidia priced it the last time they had a huge die that was as good as it should have been. The GeForce 8800 Ultra was $800, and the next bin down GeForce 8800 GTX was $600. That's not going to put pricing pressure on $400 cards, let alone $200 cards.

While I respect your knowledge in hardware, don't put words in my mouth, please. I stated quite clearly what level of cards I am interested in.

As for the pricing of cards, $500+ for the current small-die high-end cards is too much, imho.

Except that they're not small dies. Tahiti is 365 mm^2, and not that far shy of the largest die AMD has ever done (R600 was 420 mm^2). Tahiti also has six memory channels, which is only the second time that AMD has ever gone over four. The reason why it's an expensive card to buy is that it's an expensive card to build.

Nvidia is more accustomed to going with enormous dies. But when is the last time that Nvidia went under $500 for their top end card for any reason other than price pressure from AMD, especially when AMD was a generation ahead of Nvidia?

Will the GTX 680 and Radeon HD 7970 eventually drop to $400? Eventually, they probably will. Will it happen before they're discontinued and on clearance prices? Maybe. But GK110 won't be the thing that pushes them there.

What's with all the soft launches this year? Trinity launched with no products, Ivy Bridge is just starting to circulate, and the GTX 680 is in low supply.

Launches of laptop products are always slow, as launch only means that laptop vendors have permission to start selling laptops with the products in question. It doesn't mean that they actually have laptops ready to sell.

Desktop Ivy Bridge and Southern Islands cards were hard launches. Nvidia has been doing paper launches for reasons known only to them.

Originally posted by Quizzical

Except that they're not small dies. Tahiti is 365 mm^2, and not that far shy of the largest die AMD has ever done (R600 was 420 mm^2). Tahiti also has six memory channels, which is only the second time that AMD has ever gone over four. The reason why it's an expensive card to buy is that it's an expensive card to build.

Nvidia is more accustomed to going with enormous dies. But when is the last time that Nvidia went under $500 for their top end card for any reason other than price pressure from AMD, especially when AMD was a generation ahead of Nvidia?

Will the GTX 680 and Radeon HD 7970 eventually drop to $400? Eventually, they probably will. Will it happen before they're discontinued and on clearance prices? Maybe. But GK110 won't be the thing that pushes them there.

Not sure why I was under the impression that Tahiti was smaller. Maybe I misread, or it was an error on the webpage I was reading. Anyway, it makes more sense now.

I am also more sensitive to prices because the dollar as a currency has been getting 'stronger' lately.

The currency of my country is getting weaker, partially due to the Greece and euro situation (when there is turmoil in the markets, investors always buy assets in 'safe' currencies like the dollar, pound, yen, Swiss franc, etc., at the expense of other currencies).

So that just adds to the price problem.

===============

Back strictly on topic: I am really curious about the first benchmarks of GK110 and how/if AMD will want to respond to this chip, even if there is close to zero chance I will own it.

Back strictly on topic: I am really curious about the first benchmarks of GK110 and how/if AMD will want to respond to this chip, even if there is close to zero chance I will own it.

AMD isn't going to do anything to counter GK110. For the last several generations, AMD has been content to vacate the high end and build the best $100 and $200 and $300 cards that they can, because that's where you make the money on consumer GPUs. They've won the last few generations largely because of that strategy, as while Nvidia was focused on huge dies that they couldn't build, AMD was focused on smaller dies that they could.

I don't have money for it myself, but with it out, it would push the current GK104 and its direct successors into the mid-range where they belong, making them more affordable.

Because at the moment GPUs are too expensive for me. The 7xxx and 6xx series launched at totally absurd prices from my consumer point of view.

Good thing they've recently started to get cheaper (AMD is cutting prices on the 7xxx series).

I actually might be able to afford a 660-670 or 78xx-7950 card, plus an on-par CPU, mobo, monitor, etc., in a few months if this trend continues.

I currently have a laptop and want to switch to a desktop for my gaming needs.

I don't want to buy low-end or even the lower level of mid-range, but I can't afford the current higher levels of mid-range or the lower levels of high-end at current prices, especially when the economy and my personal income are so uncertain these days.

I have two GTX 460 SE 1024 MB cards that will do plenty for you in games like Guild Wars 2, and I'll sell them for a fair price. One is used; the other is brand new and has never been put in my system. I'll sell them both to you for what used ones go for. They overclock like a champ if you're interested in that; it's easy.