Other than the silicon increase, the only thing that affects power consumption is clock speed, which I speculated is going to be more or less the same, and that it won't change things much: as can be seen on the chart I provided, at 820 MHz (quite an OC) the power is 176 W.
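A quick sketch of the scaling argument, assuming the simple first-order model that dynamic power scales linearly with clock at a fixed voltage (it ignores leakage, so treat it as a rough bound). The 176 W at 820 MHz figure is from the chart cited above; the 700 MHz stock clock is the GTX 480's.

```python
def scale_power(p_watts, f_old_mhz, f_new_mhz):
    """Naive linear frequency scaling at constant voltage (ignores leakage)."""
    return p_watts * f_new_mhz / f_old_mhz

# Back-estimate stock-clock power from the 820 MHz overclocked figure.
stock_estimate = scale_power(176, 820, 700)
print(round(stock_estimate))  # ~150 W at 700 MHz under this naive model
```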


Your whole math based on the unitary method is not supposed to be used on electrical components. The TDP of the full GF100 is around 204 W, more than the cut-down GF100's. Based on your math, can you explain why the TDP of the full GF100 is quite a bit more than expected?
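A toy model of why simple per-unit ("unitary method") scaling can miss on silicon: enabling the extra SM often ships with a voltage bump to keep yields, and dynamic power goes with V², so the real jump can exceed a linear per-unit estimate. All numbers below (the constant k, the voltages) are illustrative, not real GF100 figures.

```python
def dyn_power(units, volts, k=0.55):
    """Toy dynamic-power model: P ~ k * units * V^2 (arbitrary units)."""
    return k * units * volts ** 2

p_cut = dyn_power(15, 1.00)       # 15 SMs at a hypothetical 1.000 V
linear_guess = p_cut * 16 / 15    # unitary-method prediction for 16 SMs
p_full = dyn_power(16, 1.05)      # 16 SMs needing a 5% voltage bump
print(linear_guess < p_full)      # True: the real jump exceeds the linear guess
```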


Do you really believe that to be real? It was not coming from Nvidia. How naive can you be? First of all, the revision code (A1, A2, A3) was not shown, so that card, if it was real, was probably an old prototype using A1 or A2 silicon. And why did Nvidia go to A3? Ah yes, because A1 and A2 were broken. The other possibility is that they re-enabled a normal GTX 480, and oh yes, there was a reason that SM was disabled...

If you want to know more exactly how enabling SPs/ROPs truly affects power consumption on real cards, then look at the GTX 470 vs. the GTX 465, because they both have the same clocks. And how much is that difference, then?

Wow, so it looks like the actual power consumption of the actual GTX 470 is lower than my math predicted. I knew that from the beginning; my assumptions above are for the worst-case scenario.
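Filling in the GTX 470 vs. GTX 465 comparison with the official TDP figures as I recall them (GTX 470: 448 SPs, 215 W; GTX 465: 352 SPs, 200 W; both at 607 MHz core):

```python
sp_470, tdp_470 = 448, 215  # GTX 470 shader count and official TDP (W)
sp_465, tdp_465 = 352, 200  # GTX 465 shader count and official TDP (W)

delta_w = tdp_470 - tdp_465                    # cost of +96 SPs (plus extra ROPs/memory)
per_sp_mw = delta_w / (sp_470 - sp_465) * 1000
print(delta_w, round(per_sp_mw))               # 15 W total, ~156 mW per extra SP
```

So at identical clocks, the whole extra block of units costs about 15 W on paper.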

GF100 consumed a lot, everybody knows that. Everybody should know by now, too, that it was a problem with the fabric (the interconnection layer between the different units within a chip) and that the problem has already been fixed. It was mostly fixed for GF104, as can be seen from its power consumption, and is probably even better for future releases.

They must be upping the clock speed a lot. We already saw that going to 512 shaders alone adds only 5%, and I don't think the memory bandwidth is going to add the other 15%. Managing that within the same power consumption as the 480 might actually have taken a lot of work... assuming their claims aren't total BS.
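The argument above, in numbers: speedups multiply rather than add, so if the extra shaders give +5% and the target is +20% overall, clocks and bandwidth together have to supply the rest.

```python
target = 1.20        # rumoured speedup over the GTX 480
from_shaders = 1.05  # observed gain from going 480 -> 512 shaders alone

needed_from_clocks_mem = target / from_shaders
print(round(needed_from_clocks_mem, 3))  # ~1.143, i.e. another ~14% from clocks/memory
```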

Wait, 20% faster than a 480 is like an OC'd 5870 (well, something more), but it is definitely slower than the promised 6870... They are not even breaking even on paper now?


What are you on? Can I get some of that? The HD 5870 cannot even hold its ground against a non-overclocked GTX 480. A 20% faster GTX 480 will annihilate the HD 5870 and HD 6870. Remember, the 6870 isn't an HD 5870; it's basically a 5850. AMD's naming is really messed up this generation.

And for people who are obsessing over the potential high heat and consumption: just wait and see, it's not worth getting all worked up about it now. Maybe Nvidia has new strategies to overcome this. Better reference coolers, perhaps.

It's really sad, tbh, sadder every day. Every day that passes, it gets more difficult to talk about tech.

If only the GTX 460 had never been released, then maybe, maybe, the comments wouldn't be so baseless. But the GTX 460 was released, and it's faster than the GTX 465 while consuming 50 W less... Is it possible for Nvidia to repeat that achievement with GF110, or maybe even... take a seat, AMD fanboys... improve the efficiency a little bit further?

- No, because I'm so biased I cannot even see what's in front of my eyes.

- No, because AMD has apparently improved power efficiency further with NI, but there's no way on Earth or otherwise for Nvidia to catch up. Impossible! I mean, Nvidia never had better power efficiency than AMD/ATI. Never, never, never... hmm, well... okay, "only" before Evergreen/Fermi.
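The GTX 460 point above, as a rough perf-per-watt calculation. The 50 W gap is the figure stated above (the GTX 465's 200 W TDP is official); the relative performance number (GTX 460 ~5% faster) is an assumption for illustration.

```python
tdp_465 = 200           # GTX 465 official TDP, W
tdp_460 = tdp_465 - 50  # per the "50 W less" claim above
perf_ratio = 1.05       # assumed: GTX 460 ~5% faster (illustrative)

# Efficiency gain = (performance ratio) / (power ratio)
eff_gain = perf_ratio * tdp_465 / tdp_460
print(round(eff_gain, 2))  # ~1.4x perf/W over the GTX 465
```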


You just need to laugh at the stupidity and remember that's all it is. If the AMD fanboys are so interested in power draw, why are they not complaining about the over-300 W TDP of the 5870 X2 cards? (No, not the 5970; I mean the ones with the clocks of the 5870, the things with two 8-pin and one 6-pin connectors.)

I'm no fanboy for either side, as they both suck in their own special ways and are both awesome in their own ways as well. And although stupidity annoys me, I can ignore the stupidity in the fanboy comments for both AMD and Nvidia... how about you two join me in laughing at it?

Back on topic: I'm really excited about seeing what Nvidia will be bringing out. I just hope it's very close to the 6970's release date, as I'm sick of waiting for my GPU upgrade, and I'm starting to think I want something more powerful than 1 GB GTX 460 SLI. So hopefully either AMD or Nvidia can give me something that fits the bill, and before the end of the year.

I believe this is just Nvidia's PR trying to dampen AMD's launch. I sure hope Nvidia can put something out there to compete with Cayman, because it would definitely help bring prices down. But somehow it's hard to believe Nvidia has anything to counter Cayman and Antilles... Well, we'll see.


Love it. Just because a majority of people (pretty much across the internet) feel one way, it's bias. Of course, you're just a "fanboy" for pointing that out. It's all tongue in cheek, but it's either bias or the unwillingness to accept the truth from one "side" or the other. The fact of the matter is, Nvidia has failed to impress many.

Again, a thread full of smack talk about a GPU we know next to nothing about, bar a few who actually have something to contribute. Oh well, can't say I'm not disappointed.

I have a feeling this will just be a GF104 plus 50% of it again, making it 576 SPs, back to 48 ROPs, and still 384-bit GDDR5. They just need to improve the memory controller and make sure they hit a good power consumption target.
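Checking the arithmetic behind that speculation: a full GF104 die has 384 SPs (the shipping GTX 460 enables 336 of them), so "GF104 plus 50% of it" works out as:

```python
full_gf104_sp = 384                      # physical SPs on a full GF104 die
speculated_sp = int(full_gf104_sp * 1.5) # "GF104 + 50% of it"
print(speculated_sp)                     # 576, matching the figure above
```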


Offering constructive criticism is no issue at all; everyone should be able to point out mistakes and criticize them in hopes of future improvements. Look around: nothing but nonsensical, counterproductive comments yet again on a site that I frequently visit. The majority can say whatever BS they want, but I honestly expect more out of sites like this.

No doubt. Considering all the problems I've been complaining about on here, you'd think nV would be "my best friend" at this point. However, the Fermi launch put a big damper on their hype, and nothing since then has really done much to improve consumer confidence, as evident in the shift in GPU market share.

They need a killer product. 20% over the GTX 480 doesn't cut it... they need 33% or so... then they'd have a real chance.

Mind you, the current rumour about price drops is definitely gonna work in their favor.


What BS? A lot of it is not false, though a lot of it is quite short-sighted or just blunt. I can say that I'm not particularly impressed with certain things on both sides of the coin.

20% sounds nice... I just wonder how much room is left for overclocking? I mean, 20% over a 480 should be achievable on a 480 itself, but how much more can you push it? I guess time will tell. A 580 that is 20% faster than a 480, plus being able to push it another 20%, would be kickass! I don't know... I hope so, for everyone's sake. We need both to do well to keep prices right, you know?
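For what it's worth, the "20% faster, then push it another 20%" scenario compounds multiplicatively:

```python
stock_gain = 1.20   # hoped-for GTX 580 gain over a stock GTX 480
oc_headroom = 1.20  # hoped-for overclock on top of that

total = stock_gain * oc_headroom
print(round(total, 2))  # 1.44x a stock GTX 480 in that best case
```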


Well that's just it, isn't it?

Both sides kinda have issues, and so really, I blame TSMC.

Bear Jesus, while I can afford to just get rid of stuff and start over again, to me that'd make all the time I've spent trying to get things working a waste of time. As far as I'm concerned, either AMD fixes the issues I have, or they fail. I can't truly say that unless I see this out to the end.

But because my usage (3 monitors) dictates I need a certain level of performance, I'm just plain out of options at this point. The 69xx series is my last hope, or maybe this GTX 580 can pull up nV's socks and I'll switch over.

I am not "sticking it out" because I'm a fanboy... I need 60+ FPS, and at 5760x1080. A little bit of AA would be nice too. The first company that can do this gets my cash.
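A quick sanity check on why that triple-monitor requirement is so demanding, assuming three 1920x1080 panels side by side (i.e. 5760x1080):

```python
single = 1920 * 1080  # pixels on one 1080p screen
triple = 5760 * 1080  # three 1080p screens side by side

print(triple // single)  # 3: the GPU pushes triple the pixels per frame
```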

For anyone else... they don't need a GTX 580. Seriously, a 480 is more than enough power for anyone with a single monitor.


At least someone thinks like me.
Bottom line: 20% over the GTX 480 is not enough, considering Cayman XT will be at the very least 30-40% faster than Cypress XT. That would make it neck and neck (equal performance) with the "GTX 580", and as everyone knows, since AMD's chips are cheaper to produce, AMD can lower prices more and still make some profit.
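The "neck and neck" claim above in numbers, normalising Cypress XT (HD 5870) to 1.0. The GTX 480's ~15% lead over the 5870 is an assumption for illustration; the 30-40% Cayman range is the figure stated above.

```python
cypress = 1.00                    # HD 5870 (Cypress XT) as the baseline
gtx480 = 1.15                     # assumed ~15% ahead of the 5870
gtx580 = gtx480 * 1.20            # rumoured +20% over the GTX 480
cayman_low, cayman_high = 1.30, 1.40  # "at least 30-40% faster than Cypress"

print(round(gtx580, 2), cayman_low <= gtx580 <= cayman_high)
# 1.38 True -> a +20% GTX 580 lands inside the claimed Cayman range
```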

I think the GTX 580 will be a winner if it is slightly cooler than the GTX 480 and keeps the same OC potential and performance scaling. I own a GTX 480, and I will happily upgrade to a GTX 580. Many of you speak without knowing the truth: my GTX 480, OC'd to 830/1100, doesn't get past 81 degrees Celsius and performs almost as well as an ATI 5970, and that without facing the retarded ATI drivers. You also get PhysX, CUDA, and the best minimum framerate, which is most important when you're playing games.