ChronoReverse wrote:I thought the 290X benches equal or slightly better than the 780? Why are all the 290X results slightly lower than the 780 on the graphs?

Hardware benchmarks are a highly subjective thing, my dear friend. One site can show one result; another might show a different one. This is why everyone should buy the hardware themselves, evaluate it in their own environment, then decide whether it's worth keeping, better to return, or sell on eBay (or give away, or just keep in a closet ).

ChronoReverse wrote:Still, if true, this means greater than Titan performance across the board? That'd be mighty impressive.

That's actually the expected part: it's a fully enabled GK110, which we've not yet seen anywhere near the consumer space. It still loses the compute goodies that the Titan has, though, and thus the GTX 780 Ti is purely a gaming card, unlike the Titan.

As for comparisons with the 290X, and final pricing, we're going to have to wait and see. We still need third-party coolers for the 290(X) cards before we can really judge the price/performance/features picture, and those who prefer or need blowers will likely still be better served by Nvidia.

In the reviews I checked out, the 290X (in Uber mode) beat the GTX Titan in BF3, which is expected considering that the developer partnered with AMD. That said, the GTX 780 Ti looks mighty impressive, almost matching the GTX 690...


ChronoReverse wrote:I thought the 290X benches equal or slightly better than the 780? Why are all the 290X results slightly lower than the 780 on the graphs?

Those graphs seem to show that the 290X is faster when I look at them. The only game benchmark where the 780 is faster is with DDOF enabled in BioShock Infinite, and even then it's only faster by a small amount. In the other two games the 290X is faster at every resolution they used (with or without AA).

I think it's also worth noting GK110 has more die area than Hawaii. I wonder if the 780 Ti will run at about the same temperature (higher wattage, but a larger die and therefore more heat-dissipation area). These 28nm wars are getting hot.

The 780 Ti is a fully enabled GK110 (2880 shaders) and runs a gratuitously high base clock for such a large part: 1050MHz. I'd love to have one, but there's no way I could afford it, unless some enterprising CUDA nerd wanted to trade me one for my Titan... hehe. ┐(￣ヮ￣)┌

auxy wrote:The 780 Ti is a fully enabled GK110 (2880 shaders) and runs a gratuitously high base clock for such a large part: 1050MHz. I'd love to have one, but there's no way I could afford it, unless some enterprising CUDA nerd wanted to trade me one for my Titan... hehe. ┐(￣ヮ￣)┌

Well, if the supply of Titans were to dry up, yours could be worth more than you paid for it. Might want to go Craigslist fishing?

auxy wrote:The 780 Ti is a fully enabled GK110 (2880 shaders) and runs a gratuitously high base clock for such a large part: 1050MHz. I'd love to have one, but there's no way I could afford it, unless some enterprising CUDA nerd wanted to trade me one for my Titan... hehe. ┐(￣ヮ￣)┌

Well, if the supply of Titans were to dry up, yours could be worth more than you paid for it. Might want to go Craigslist fishing?

I didn't pay for mine, and I don't want to sell it. I don't like getting money directly involved. I'd rather trade.

sschaem wrote:The 780 Ti and 290X seem to be an exact match in BF3 at 4K... but here's the kicker: if this chart is true, the 780 Ti uses way more power.

If only Nvidia were to add G-Sync to the Seiki 39" 4K TV... $699 + $150... $849 for a 4K 120Hz G-Sync gaming monitor seems like a great deal.

You may as well make up a fantastical ~40" 4K TV brand. 120Hz at 4K is 4x the input bandwidth that the current Seikis support. I'd imagine the required ASIC upgrade alone to make that happen would add significantly to the price of the TV, never mind G-Sync.
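The 4x figure checks out with a quick back-of-the-envelope in Python (my own numbers, not from the thread): the current Seikis accept 4K at 30Hz, and the pixel rate scales linearly with refresh.

```python
# Back-of-the-envelope check of the "4x" claim, ignoring blanking intervals.
def raw_pixel_rate(width, height, refresh_hz):
    """Uncompressed pixels per second for a given mode."""
    return width * height * refresh_hz

seiki_now = raw_pixel_rate(3840, 2160, 30)   # what the TV accepts today
wanted = raw_pixel_rate(3840, 2160, 120)     # hypothetical 120Hz input

print(wanted / seiki_now)       # -> 4.0
print(wanted * 24 / 1e9)        # ~23.9 Gbit/s at 24 bits per pixel
```

At 24bpp that works out to roughly 23.9 Gbit/s of raw video data, far beyond what the Seiki's current HDMI 1.4 input can carry, which is why a new controller ASIC would be needed.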

cynan wrote:You may as well make up a fantastical ~40" 4K TV brand. 120Hz at 4K is 4x the input bandwidth that the current Seikis support. I'd imagine the required ASIC upgrade alone to make that happen would add significantly to the price of the TV, never mind G-Sync.

It's not a reasonable expectation, but it's definitely within the realm of possibility. The panel is easily capable of 120Hz operation, so you're right, it just needs a controller to handle the signalling. Further, the G-Sync module is $100, and that's for a custom FPGA. A dedicated ASIC will likely be less than a tenth that cost to produce in volume.

Lastly, there's nothing stopping anyone from using multiple DP streams to push 120Hz at 4K; and with G-Sync, it's not about pushing 120Hz per se so much as it is about keeping the time it takes to send a single frame as short as possible.
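A rough sketch of that last point (my assumed link figures, not from the thread): with G-Sync, what matters is how fast one frame crosses the link, and splitting a frame across parallel DP streams cuts that time proportionally.

```python
# Time to transmit one uncompressed 4K frame, ignoring blanking and overhead.
FRAME_BITS = 3840 * 2160 * 24     # one 4K frame at 24 bits per pixel
DP12_EFFECTIVE = 17.28e9          # DP 1.2 HBR2, 4 lanes, after 8b/10b coding

def frame_time_ms(link_bps, streams=1):
    """Milliseconds to push one frame, split across parallel streams."""
    return FRAME_BITS / (link_bps * streams) * 1e3

print(frame_time_ms(DP12_EFFECTIVE))             # ~11.5 ms on one DP 1.2 link
print(frame_time_ms(DP12_EFFECTIVE, streams=2))  # ~5.8 ms across two streams
```

The single-link figure is why DP 1.2 tops out around 4K60; two streams bring the per-frame time down to roughly what a 120Hz refresh would need.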

Chrispy_ wrote:AMD and Nvidia have neither moved to a smaller fabrication process, nor have they made any significant architectural changes; these things are faster because they're bigger, hungrier and noisier.

Now that's the truth. Though you have to wonder just how much of an impact running a full GK110 hot is going to have over the Titan or 780. It'll be hotter and louder, sure, assuming no tweaks have been made to the GPU or the cooler, but will it be louder than an R9 290X with its training wheels removed?

If you read the review, you'll see it's at least up to the standards of TR's reviews. It may not have the polished prose that is a hallmark of TR's, but from a technical point of view, it's bang-on!

Although, that being said, I still want my TR review... something about having my cake and eating it, too.