NVIDIA's next high-end graphics card, the GeForce "Titan" 780, is shaping up to be a dreadnought of sorts. It reportedly ships with 6 GB of GDDR5 memory as standard. It's known from GK110 block diagrams released alongside the Tesla K20X GPU compute accelerator that the chip features a 384-bit wide memory interface. With 4 Gbit memory chips still eluding the mainstream, it's quite likely that NVIDIA will cram twenty-four 2 Gbit chips onto the card to total 6,144 MB; the chips would then be spread across both sides of the PCB, and the back-plate could make a comeback on NVIDIA's single-GPU lineup.
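
As a quick sanity check of that configuration (a sketch only; the chip density, per-chip bus width, and clamshell layout are assumptions inferred from the 384-bit interface, not confirmed specs):

```python
# Rough check of the rumored GeForce "Titan" memory layout.
# Assumptions: 2 Gbit GDDR5 chips with a 32-bit interface each (typical for GDDR5),
# and the 384-bit bus known from the GK110 / Tesla K20X block diagrams.
bus_width_bits = 384
chip_interface_bits = 32
chip_density_mbit = 2 * 1024                       # 2 Gbit per chip

channels = bus_width_bits // chip_interface_bits   # 12 memory channels
target_mbit = 6144 * 8                             # 6,144 MB expressed in Mbit
chips_needed = target_mbit // chip_density_mbit    # 24 chips
chips_per_channel = chips_needed // channels       # 2 -> clamshell mode, chips on both PCB sides

print(channels, chips_needed, chips_per_channel)   # 12 24 2
```

With two chips sharing each 32-bit channel, half of them would sit on the back of the PCB, which is why a back-plate would make sense again.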

On its Radeon HD 7900 series single-GPU graphics cards based on the "Tahiti" silicon (which features the same memory bus width), AMD used 3 GB as the standard amount, while 2 GB is standard for the GeForce GTX 680; non-reference 4 GB GTX 680 and 6 GB HD 7970 variants are, however, quite common. SweClockers also learned that NVIDIA is preparing to price the new card in the neighborhood of $899.

$899 for it, but if my calculations are right it should consume around 300 W, and I wonder how they're going to cool that. Most likely the clocks will be pulled down to 800 MHz so that the TDP drops, but then performance should only be about 39% greater than that of a GTX 680, at which point the 80% price increase is stupid. If it has unlocked voltage I would buy it, but most likely this will end the same way as the 600 series, with Boost and locked volts.
So I will stick with AMD.
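
To put numbers on that argument (a rough sketch; $499 is the GTX 680's launch price, and the 39% performance gain is just the estimate from the post above):

```python
# Price vs. performance napkin math from the post above.
# Assumptions: GTX 680 launch price of $499, rumored $899 for the new card,
# and an estimated 39% performance advantage (the poster's own figure).
gtx680_price = 499.0
titan_price = 899.0
perf_gain = 0.39

price_increase = titan_price / gtx680_price - 1.0                       # ~0.80 -> the "80% price increase"
perf_per_dollar_ratio = (1.0 + perf_gain) * gtx680_price / titan_price  # ~0.77x the GTX 680's perf/$
print(f"{price_increase:.0%} higher price, {perf_gain:.0%} more performance, "
      f"{perf_per_dollar_ratio:.2f}x the performance per dollar")
```

In other words, at those assumed numbers you'd be paying roughly 80% more for under 40% more performance, which is the complaint being made.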

I beg to differ. If this truly is a high-end card, then it needs all the memory it can get. This card is expected to be used by people with 4K and greater resolutions. When I run Skyrim at 5760x1200 with high-resolution textures I can easily use 2.5 GiB of graphics memory. I only expect that to become the norm with new-generation consoles, and therefore more detailed games, coming out.

You are right; I was thinking of 1080p. If someone has a card like this, I'd hope they're running at higher than 1080p.

On another note, I think this would be the most memory chips ever on a reference single-GPU graphics card, at 24. The GTX 295 still holds the overall reference-card crown at 28, and the ASUS ARES III would hold the overall record at 32 if it were released.

300 W of heat isn't such a problem IMO; look at all the dual-GPU cards that are rated at 375 W (and go way beyond that when overclocked).

Still more than happy with my GTX 680 at this point, even if this turns out to be a monster. I'm only gaming at 1080p anyway, so all that RAM would be useless to me.

The last card I had was a GTX 560 Ti, and before that an HD 5870 - i.e. gaming cards. The GTX 680 = gaming card. GK110 just doesn't make sense to me as a gaming card. Like the GTX 480/580, this would obviously consume much more power. For HPC, yes, these GPUs will do a job, but for gaming I think it's not the best/optimal solution.

Yes, I understand people want the best of the best, but at that alleged price point it's ridiculous. That's not to say I'm not excited about this, though. Unless AMD does miracles with GCN, the performance crown would almost certainly, without argument, go to NVIDIA. But if AMD's pricing is very competitive, combined with great gaming bundles, then who knows how the market will turn out. YAY, GPU wars.