Only way I see this card having relevance to such a name is... if it's 4K-proof in AT LEAST a handful of modern titles... since I hate multi-monitor gaming setups with a passion, and even the puny-ish 2560x1440/1600 resolutions are nowhere near as common as they should be... Nor is the hardware or programming skill of most game studios up to snuff.

I'm with you on all that, but be realistic.

It's conceivable at the very edge this gen, but the big push will be on 20nm, both because of the timing of the displays reaching a more tangible market and because of TSMC's process capabilities (not to mention the potential need for more bandwidth and/or denser buffers, without building the monstrosities that will likely come between now and then).

Figure 4k is 4x 1080p.

I figure 20nm will bring designs similar to GK110/8900, aimed at the sweet-spot market with its shiny new 4K displays, in late 2014 to 2015. That is to say, efficient and with 48 ROPs... obviously on more realistically sized/yielding silicon and in consumer-friendly power envelopes. If that were roughly 2688 units (12 Nvidia SMX = 2688 with SFUs; AMD 42 CUs = 2688) at ~1300MHz, it would be ~4x something like a 7850 (1024 units x 860MHz), the baseline to play most titles at 1080p, and that baseline likely won't change much given the rumored new console specs.
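The rough scaling in that estimate can be sanity-checked with a quick back-of-the-envelope calculation (the unit counts and clocks are the speculative figures above, not confirmed specs):

```python
# Back-of-the-envelope shader throughput comparison.
# All figures are the speculative ones from the post, not confirmed specs.
units_future, clock_future = 2688, 1300  # hypothetical 20nm part, clock in MHz
units_7850, clock_7850 = 1024, 860       # Radeon HD 7850 baseline, clock in MHz

# Naive throughput proxy: shader units x clock.
ratio = (units_future * clock_future) / (units_7850 * clock_7850)
print(f"Relative shader throughput: ~{ratio:.1f}x")  # ~4.0x, the 4x needed for 4K
```

Of course units-times-clock is only a crude proxy; ROPs, bandwidth, and architecture differences all move the real number.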

Considering the process shrink should bring roughly 2x density and ~20-30% clock hikes at similar voltage, and that GDDR6 (and/or 4Gb GRAM), if not some other memory tech, may rear its head by that time, it seems a realistic trajectory. See the clock/power skew of 28nm in my previous post, but note TSMC will lower the voltage target on 20nm... 1.264v isn't going to fly anymore, certainly to AMD's disappointment. The process will likely whimper out around where most designs hover for efficiency, 1.15-1.175v (blame a unified process with an eye focused on mobile SoCs). That means potentially ~1400-1500MHz, minus ~10% for stock SKUs... or around 1300MHz, give or take.

If you ask me... Nvidia can keep their overpriced card that performs 25% better than the HD 8970. I would get four HD 8970s for the price of two Titans and have much more fun.

All speculation until we see some real proof... this article is not enough for me yet.

I don't think quad CrossFire would be that much more fun, because of all the driver issues in games. And four HD 8970s would use more power and produce more heat.
But for pure benchmarking, quad CrossFire could give better results, I guess.

AMD haven't exactly been generous with prices either, the 7970 was massively overpriced when it launched too.

Those percentages are an average including games that have CPU bottlenecks, and (if Wizz calculates overall performance using average or total FPS) are also weighted towards the games with the highest FPS. You cannot use them to compare FPS in the way that you are doing. In addition, throughput-based metrics give cards like the 690 an artificially inflated result, which would not necessarily be replicated in latency based testing.
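That weighting effect is easy to demonstrate: compare a total-FPS average against a mean of per-game ratios (the game names and FPS numbers here are made up purely for illustration):

```python
# Two cards across two hypothetical games: one high-FPS, one low-FPS.
# Card B is 10% faster in the fast game but 50% faster in the slow one.
fps_a = {"fast_game": 200, "slow_game": 30}
fps_b = {"fast_game": 220, "slow_game": 45}

# Summing raw FPS: the high-FPS title dominates the result.
total_ratio = sum(fps_b.values()) / sum(fps_a.values())
print(f"total-FPS ratio:     {total_ratio:.2f}x")  # ~1.15x

# Averaging per-game ratios: each game weighted equally.
per_game = sum(fps_b[g] / fps_a[g] for g in fps_a) / len(fps_a)
print(f"mean per-game ratio: {per_game:.2f}x")     # 1.30x
```

The same pair of cards looks 15% or 30% apart depending purely on how the summary is computed, which is why the overall percentage can't be read as an FPS multiplier.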

Finally, the bastards milked their profit from GK104 and decided to offer a significantly delayed product at a 100% price markup. Now they have stretched out the Kepler design's life to cut GPU production costs. Of course we will also be paying for the same patents this year, and I hope that pig bursts from obesity and gluttony. We know that the Quadro-series driver works around DirectX's efficiency problems, recovering anywhere from 30% to 100% of the lost CPU efficiency, and we will have to wait for the Maxwell GPU, which will have that advantage built in. I hope they end up drooling over their licensing and contractual profits the same way we do over good hardware :P

My previous post was an explanation of the illogicality of yours. But if you can't be bothered with it, the first sentence, and the very fact that the gaps differ at different resolutions, should suffice. It is not so much that the 690 is faster by a different amount at different resolutions as that the 690 is held up by other bottlenecks to a different extent at different resolutions. In the absence of extra-GPU bottlenecks the 780 (Titan) will, in a worst-case scenario, be 50% faster than the 680. If memory bandwidth doesn't hold things up, you can call that figure 70%. Of course in reality we'll see a much, much smaller difference, because CPU bottlenecks will hold things up to at least some degree in nearly every game.
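The "held up by other bottlenecks" point can be put in numbers with a simple Amdahl-style model (the 1.5x GPU speedup and the CPU-bound fraction below are illustrative assumptions, not measurements):

```python
# Amdahl-style estimate: if a fraction of frame time is CPU-bound,
# a GPU speedup only shrinks the GPU-bound remainder of the frame.
def effective_speedup(gpu_speedup: float, cpu_fraction: float) -> float:
    """Overall frame-rate gain when cpu_fraction of frame time is CPU-bound."""
    return 1 / (cpu_fraction + (1 - cpu_fraction) / gpu_speedup)

# A GPU 50% faster (1.5x) in a game that's 30% CPU-bound:
print(f"{effective_speedup(1.5, 0.30):.2f}x")  # 1.30x, well short of 1.5x
```

With zero CPU load the full 1.5x shows through; the more CPU-bound the game, the more the on-paper advantage evaporates, which is the pattern seen across resolutions.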

I've decided to save the $900 this card would cost and get an awesome new television and a PS4 or Xbox 720 instead. How can that not be a better choice?

I hope that this price point doesn't become the standard for high end single GPU cards.

I kinda feel this year is a turning point in the GPU business and, in the broader picture, the PC business. The low end will disappear, replaced by integrated graphics; mid-range cards will be hard to sell over $200; and there will be an enthusiast niche covered by the likes of Titanium and whatever AMD comes up with. The rest of us will game on consoles, tablets, laptops and smartphones. It's the same as with the music industry: there are a handful of people who still buy vinyl records, stare at covers and own expensive hi-fi equipment, who like to clean their records and upgrade their gear, while the rest enjoy lo-fi MP3s played through portable devices which they will gladly throw away when a new gadget comes along. Mind you, those are priced in the vicinity of a few pizzas or a good night out.

There will be a market for Titanium same as there's a market for outrageously priced DACs and headphones. Crysis 3 (if one's interested) can be played on a console. No big deal, single player, couple of days of fun then forget about it. Life goes on.