A Different Sort of Launch

Fermi will support DirectX 11 and NVIDIA believes it'll be faster than the Radeon HD 5870 in 3D games. With 3 billion transistors, it had better be. But that's the extent of what NVIDIA is willing to say about Fermi as a gaming GPU. Sorry folks, today's launch is targeted entirely at Tesla.

A GeForce GTX 280 with 4GB of memory is the foundation for the Tesla C1060 cards

Tesla is NVIDIA's High Performance Computing (HPC) business. NVIDIA takes its consumer GPUs, equips them with far more memory, and sells them under the Tesla brand in personal supercomputers or datacenter computing clusters. If you have an application that runs well on a GPU, the upside is tremendous.

Four of those C1060 cards in a 1U chassis make the Tesla S1070. PCIe connects the S1070 to the host server.

NVIDIA loves to cite examples where algorithms ported to GPUs run far faster than on CPUs. One such example is a seismic processing application that Hess found ran very well on NVIDIA GPUs. Hess migrated a cluster of 2,000 servers to 32 Tesla S1070s, cutting total cost from roughly $8M to $400K and total power from 1,200kW down to 45kW.

Hess Seismic Processing Example

                    Tesla                CPU
Performance         1                    1
# of Machines       32 Tesla S1070s      2000 x86 servers
Total Cost          ~$400K               ~$8M
Total Power         45kW                 1200kW

Obviously this doesn't include the servers needed to drive the Teslas, but presumably that's not a significant cost. Either way the potential is there, it's just a matter of how many similar applications exist in the world.
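The savings in the Hess example are easy to quantify. A quick back-of-the-envelope check on the figures from the table above:

```python
# Cost and power figures from the Hess seismic processing example.
cpu_cost, tesla_cost = 8_000_000, 400_000   # ~$8M cluster vs. ~$400K of S1070s
cpu_power_kw, tesla_power_kw = 1200, 45

cost_reduction = cpu_cost / tesla_cost            # 20x cheaper
power_reduction = cpu_power_kw / tesla_power_kw   # ~26.7x less power

print(cost_reduction)                # 20.0
print(round(power_reduction, 1))     # 26.7
```

In other words, even before factoring in the host servers, the GPU cluster is roughly a 20x cost and 27x power advantage for this one workload.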

According to NVIDIA, there are many more cases like this in the market. The table below shows what NVIDIA believes is the total available market in the next 18 months for these various HPC segments:

Segment      Seismic    Supercomputing    Universities    Defence    Finance
GPU TAM      $300M      $200M             $150M           $250M      $230M

These figures were calculated by looking at the algorithms used in each segment, the number of Hess-like Tesla installations that could be sold, and the current budget for non-GPU computing in those markets. If NVIDIA hit these targets, the Tesla business could be bigger than GeForce. There's just one problem:
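Summing the segments in the table above gives a sense of the total opportunity NVIDIA is describing:

```python
# GPU TAM estimates per segment, in millions of dollars (from NVIDIA's table).
tam_musd = {
    "Seismic": 300,
    "Supercomputing": 200,
    "Universities": 150,
    "Defence": 250,
    "Finance": 230,
}

total_musd = sum(tam_musd.values())
print(total_musd)  # 1130 -> roughly $1.13B over the next 18 months
```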

As you'll soon see, many of the architectural features of Fermi are targeted specifically for Tesla markets. The same could be said about GT200, albeit to a lesser degree. Yet Tesla accounted for less than 1.3% of NVIDIA's total revenue last quarter.

Given these numbers it looks like NVIDIA is building GPUs for a world that doesn't exist. NVIDIA doesn't agree.

The Evolution of GPU Computing

When matched with the right algorithms and programming efforts, GPU computing can provide some real speedups. Much of Fermi's architecture is designed to improve performance in these HPC and other GPU compute applications.

Ever since G80, NVIDIA has been on this path to bring GPU computing to reality. I rarely get the opportunity to get a non-marketing answer out of NVIDIA, but in talking to Jonah Alben (VP of GPU Engineering) I had an unusually frank discussion.

From the outside, G80 looks to be a GPU architected for compute. Internally, NVIDIA viewed it as an opportunistic way to enable more general purpose computing on its GPUs. The transition to a unified shader architecture gave NVIDIA the chance to, relatively easily, turn G80 into more than just a GPU. NVIDIA viewed GPU computing as a future strength for the company, so G80 led a dual life. Awesome graphics chip by day, the foundation for CUDA by night.

Remember that G80 was hashed out back in 2002 - 2003. NVIDIA had some ideas of where it wanted to take GPU computing, but it wasn't until G80 hit that customers started providing feedback that ultimately shaped the way GT200 and Fermi turned out.

One key example was support for double precision floating point. The feature wasn't added until GT200 and even then, it was only added based on computing customer feedback from G80. Fermi kicks double precision performance up another notch as it now executes FP64 ops at half of its FP32 rate (more on this later).
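The half-rate FP64 claim is easy to put in concrete terms. A rough peak-throughput sketch, assuming a hypothetical 512-core part at 1.5GHz (illustrative numbers, not confirmed NVIDIA specs):

```python
# Back-of-the-envelope peak throughput for a hypothetical Fermi-class part.
# Core count and clock are assumptions for illustration only.
cores = 512
clock_ghz = 1.5
ops_per_core_per_clock = 2  # a fused multiply-add counts as two FP operations

fp32_peak_gflops = cores * clock_ghz * ops_per_core_per_clock
fp64_peak_gflops = fp32_peak_gflops / 2  # Fermi runs FP64 at half the FP32 rate

print(fp32_peak_gflops)  # 1536.0
print(fp64_peak_gflops)  # 768.0
```

Compare that to GT200, where FP64 ran at roughly one-eighth the FP32 rate; halving instead of eighthing is what makes Fermi credible for HPC workloads that demand double precision.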

While G80 and GT200 were still primarily graphics chips, NVIDIA views Fermi as a processor that treats compute as seriously as graphics. NVIDIA believes it's on a different course, at least for the short term, than AMD. And you'll see this in many of Fermi's architectural features.
