Evidence Mounts About Single-Chip GeForce GTX Titan: First Picture and Another Benchmark Result.

As the anticipated launch of the new flagship graphics card from Nvidia Corp. is getting closer, more details about the graphics card emerge. According to multiple media and non-media sources, Nvidia is indeed preparing a new single-chip graphics card based on the GK110 graphics processing unit. The novelty is believed to be extremely fast, yet precise information is unavailable. The new product will be unique and extremely expensive.

The new graphics card from Nvidia will not belong to the GeForce GTX 6-series or 7-series and will simply be called GeForce GTX Titan, which underlines its uniqueness and potentially points to a limited-edition release. A Danish web-store recently added an “Asus GeForce GTX Titan 6GB GDDR5” product to its listings, but quickly removed the unannounced item. The graphics card is indeed branded “GeForce GTX Titan”, has 6GB of onboard memory and sports two DVI, one HDMI and one DisplayPort connectors. The board carries the “GTXTITAN-6GB5” product model label. The graphics card was priced at DKK7276, which is around $1308.

The first blurry photo of what is claimed to be the GeForce GTX Titan has been published by the Wccftech web-site. The board looks about as long as the GeForce GTX 580 and features a single huge graphics processing unit covered by a heat-spreader and marked “Nvidia GK110”. The card carries 24 memory chips (12 on each side) for a total capacity of 6GB accessed across a 384-bit bus. The board has a rather tricky power sub-system: the PCB appears to support 9-phase voltage regulation for the GPU, but only 6-phase circuitry (the publisher of the photo claims 8-phase) is installed. The 2-phase voltage regulator for the memory chips is located near the MIO connectors used for SLI multi-GPU configurations. The board will naturally support 4-way SLI and will require one 8-pin and one 6-pin PCIe power connector.
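
The quoted memory configuration can be sanity-checked with simple arithmetic. A rough sketch; the 32-bit per-chip interface and the clamshell (two-chips-per-channel) layout are assumptions based on typical GDDR5 designs:

```python
# Back-of-the-envelope check of the memory layout described above:
# 24 GDDR5 chips (12 per side) feeding a 384-bit bus and 6GB total.
chips = 12 * 2                         # 24 chips in a clamshell layout
bus_width = 384                        # bits
channels = bus_width // 32             # GDDR5 chips use 32-bit interfaces -> 12 channels
chips_per_channel = chips // channels  # 2 chips share each channel
chip_density = 6 * 8 / chips           # 6GB total -> 2 Gbit per chip

print(channels, chips_per_channel, chip_density)  # 12 2 2.0
```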

A new 3DMark 11 benchmark result of what is claimed to be the GeForce GTX Titan, published by Arab PC World, shows that the graphics card is capable of hitting an X7377 extreme score, even higher than last week’s leak. The web-site also revealed a GPU-Z screenshot, which recognized the GK110 graphics processing unit (with a rather strange “10DE-1100” PCIe device ID) and detected an odd PCI-E 3.0 x16 @ x8 1.1 mode. The screenshot has been altered: the number of stream processors and the clock-speeds were hidden.

A full Kepler GK110 implementation includes 15 SMX units (with 192 stream processors per SMX) and six 64-bit memory controllers. Different products will use different configurations of GK110. For example, the Nvidia Tesla K20 deploys 13 SMX units, whereas the Tesla K20X has 14. The Nvidia GeForce GTX Titan is believed to be largely based on the Tesla K20X compute card and therefore features a GK110 chip with 2688 stream processors, 224 texture units and a 384-bit memory bus.
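
The stream-processor and texture-unit counts follow directly from the per-SMX figures. A quick sketch; the figure of 16 texture units per SMX is an assumption based on Kepler's SMX layout:

```python
# Unit counts implied by the SMX configurations quoted above:
# each Kepler SMX carries 192 stream processors (and 16 texture units).
SP_PER_SMX = 192
TEX_PER_SMX = 16

configs = {"full GK110": 15, "Tesla K20": 13, "Tesla K20X / Titan": 14}
for name, smx in configs.items():
    print(f"{name}: {smx * SP_PER_SMX} SPs, {smx * TEX_PER_SMX} TUs")
# Tesla K20X / Titan: 2688 SPs, 224 TUs -- matching the figures above
```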

Nvidia GK110 graphics processor

The Nvidia GK110 processor was designed with high-performance computing (HPC) in mind and therefore features numerous architectural enhancements designed to speed up highly-parallel computations. For example, GK110 supports a higher number of registers per thread as well as such technologies as Dynamic Parallelism, Hyper-Q, the Grid Management Unit and GPUDirect. Hyper-Q enables multiple CPU cores to launch work on a single GPU simultaneously, thereby dramatically increasing GPU utilization and significantly reducing CPU idle times. Dynamic Parallelism adds the capability for the GPU to generate new work for itself, synchronize on results, and control the scheduling of that work via dedicated, accelerated hardware paths, all without involving the CPU. GPUDirect enables GPUs within a single computer to directly exchange data without needing to go through CPU/system memory. Theoretically, these technologies can speed up at least some graphics applications and should be able to boost the performance of physics-effects computing.

Been holding out for a worthy successor to my 480 SLI setup. Last gen from both AMD and Nvidia was pretty average, midrange rebadged I would say. A true high-end card with a large die size (a true GTX) is about due! I just hope it's all real. I also hope AMD has a counter card or brings forward its next-gen release date to reduce prices, so we all win.
To me the fools are those who jumped on last gen thinking it was the high end, when it's always been 500mm2 die sizes for the high end, none of this 294mm2 midrange crappola! But each to their own. If a 650 card or AMD equivalent is all you need, then buy it. For those who want top-end stuff, whether it's for future-proofing, plain bragging rights or benchmarks, then buy it! It's not a waste of money to the person who is buying it, only to YOU!

How is a $900+ card a successor to a $499 one? Your argument is circular/contradictory in nature. You say the 294mm2 $499 GTX680 was a mid-range GPU and only fools bought it, yet you completely missed the part where NV used to sell 484-576mm2 large-die flagship GPUs for $499-649. You are making fun of those "fools" who bought GTX680/7970 cards, but then think spending $900+ on what was supposed to be a $499-649 real GTX680 last year is somehow a great deal?

Where did I mention the money part? I don't give a rats bum about the price of the Titan card, as long as it falls between the 680 and 690 based on its performance. I remember paying $850 for an 8800GTS many years ago.
All I want is single-card performance that can outdo my 480 SLI rig by a fair margin; the 680 doesn't cut it, as impressive as it is!

I don't know about his specific case (ozegamer - pronounced "Aussie-gamer"), but in Australia we can often pay up to 100% more for things, just because companies can get away with it. It has improved in the last 5 years because Aussie shoppers have headed online to buy things direct from the USA (Australians are the largest group of overseas online shoppers in the USA). I often buy hardware and have it shipped to Oz. There is an official Australian Government inquiry under way into the discrepancy in prices, and they've already summoned the likes of Apple, Adobe and Microsoft to explain their positions. For example, an iTunes track costs AUD$1.99 (US$2.05) here, while in the USA it is US$0.99.

I hope things improve for you guys. I heard the prices of PS3 when it launched were outrageous in Australia. I can't even imagine what modern GPUs and PS4 would cost at launch. The Titan might then be over $1,500 AU for you?

For some things that may be true, but not for computer hardware. In fact, if anything it's the opposite. Australian prices for hardware are much on par with anywhere else. If anything, places like MSY are cheaper than anywhere else I've seen online, and they're local (Australian). Buying hardware overseas is going to end up considerably more expensive than what you could buy locally.

If it's real, then it's a monster performer. If AMD releases the Radeon HD 8970 on time, the performance difference would be just like the old HD4890 vs GTX285: value performance vs sheer performance. Both have their own markets.

The type of people who generally buy AMD cards care about spending their $ wisely (i.e., price/performance, voltage control). Not many of them would waste $900-1000 on a GPU, especially not an NV one right now, because bitcoins are north of $22 at the moment. That essentially means HD7970s are paid off for many AMD owners and continue crunching $ towards 20nm parts. NV used to charge $499-649 (GTX280/480/580/680) for flagship single GPUs, and now they want to raise that price to $900+? I suppose NV sheep will be all over this part. NV will milk them nicely. The Titan's performance should make NV and AMD try that much harder with 20nm parts to make sure they beat it at $500-550 next year.

80% faster with a 235W TDP on the same manufacturing node, when the GTX680 draws 180-185W of power in games? Only if NV broke the laws of physics. Also, you can't just compare 15 vs. 8 SMX: that automatically assumes GPU clocks are identical. Considering the K20X runs a 732MHz GPU clock with 5.2GHz GDDR5 and already has a TDP of 235W, my guess is 1058MHz on GK110 is not possible. Perhaps 900MHz is a reasonable clock for this part. It'll probably end up 50-60% faster than the GTX680, not 80%.
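
The scaling argument in this comment can be sketched numerically. A rough model that scales throughput with SMX count times clock; the 1058MHz GTX680 boost clock is an assumption:

```python
# Relative throughput vs. GTX680 (8 SMX) assuming speed ~ SMX count x clock.
def relative_speed(smx, mhz, base_smx=8, base_mhz=1058):
    return smx * mhz / (base_smx * base_mhz)

print(round(relative_speed(14, 1058), 2))  # 1.75 -> 75% faster at identical clocks
print(round(relative_speed(14, 900), 2))   # 1.49 -> ~50% faster at a 900MHz guess
print(round(relative_speed(14, 732), 2))   # 1.21 -> ~21% faster at K20X clocks
```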

Early on I thought this was a hoax, in part due to everyone's faith in a single source which wasn't independently confirmed, but also because it matches the specs of the K20X: the higher core count at a lower base clock might not compensate as much as hyped against a similarly priced GTX690. If Titan isn't the GTX780, it gives my original thinking some credence, as Nvidia wouldn't dare go beyond a limited edition; it isn't cost-effective by any means. But being limited means expensive, and not worth the hype by definition, except to those who have saved up considerably.

I hope the faithful were careful what they wished for; Nvidia followed perceived demand, and they aren't promising whatever dreams enthusiasts built up wanting GK110 just because they make it available. Meaning, if people expected more, it isn't Nvidia's fault. But at least EK is ready with waterblocks for the Tesla K20 template that should also work with Titan. http://www.ekwb.com/news/...uadro-Tesla-water-blocks/

All that said, xbitlabs should be suspicious when fuzzy photos have the video ports concealed like that, as "Titan" and the K20X would have the same PCB and layout; the photo could still be a K20X presented as a Titan. Also, the performance leaks have already been proven fake (either an overclocked 690 or a pair of 680s), and even ArabPCWorld ran that article as well. http://www.arabpcworld.com/?p=25899

But ArabPCWorld also claimed to have a picture of a GK110 card with no markings way back in November 2012, when it was thought to be the GTX780; of course it wasn't a popular rumor, so people tended not to believe it. I wish there were more skepticism, but the nature of rumors does seem spoon-fed. http://www.arabpcworld.com/?p=22733

The real question is: will it run at the same clock as other GK110 parts, or at a clock similar to the 680 or higher? If it runs at the same clock as the Tesla parts, then its performance is only about 10% faster than the 680, as while it has a massive increase in cores, they're clocked about 30% slower. If it has the 2688 cores and runs at 1000MHz+, then it's a real monster.

Those specs do not make any sense. The entire full-blown 15 SMX GK110 chip only has six memory controllers, each 64-bit wide, for a total of 384-bit. The 512-bit figure is a dead giveaway those specs are 99% BS. They also happen to exactly match the GPU clocks and memory bus of the GTX690 (2x256-bit = 512-bit).

The Tesla K20X has 14 SMX clusters, a 732MHz GPU clock and already a TDP of 235W. How can they raise the GPU clock from 732MHz to 1019MHz while the TDP only goes up to 250W? It's not possible, unless the Titan is a higher-clocked 12-13 SMX part or NV will be selling a card with GTX480-level power consumption.
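
The power argument here can be sanity-checked: dynamic power scales at least linearly with clock at fixed voltage (and worse if voltage has to rise with frequency). A minimal sketch using the K20X figures quoted in the comment:

```python
# Best-case (fixed-voltage) TDP estimate for raising K20X clocks to 1019MHz.
k20x_tdp, k20x_mhz, claimed_mhz = 235, 732, 1019

min_tdp = k20x_tdp * claimed_mhz / k20x_mhz  # P ~ f at constant voltage
print(round(min_tdp))  # 327 -> well above the claimed 250W even in the best case
```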