More details have surfaced about NVIDIA's upcoming GPU at the low end of the GeForce GTX 400 series, the GTX 465. Chinese website eNet.com.cn published some lesser-known specifications about the GPU, which, pieced together with known details, more or less complete the picture. The GTX 465 has not four, but five streaming multiprocessors (SMs) disabled from the GF100 core, yielding 352 CUDA cores (against the earlier reported number of 384). With a 256-bit wide GDDR5 memory interface, the GPU has 32 of its 48 ROPs enabled and makes use of 1 GB of memory.
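For reference, those numbers fall out of GF100's layout: the full die has 16 SMs of 32 CUDA cores each (512 total), so disabling five SMs leaves 11 × 32 = 352, and ROPs come in groups of eight per 64-bit memory partition. A quick sketch (the per-SM core count and per-partition ROP count are the standard GF100 figures, not from the article):

```python
# GF100 layout: 16 SMs x 32 CUDA cores; ROPs come in groups of 8,
# one group per 64-bit memory controller partition.
TOTAL_SMS = 16
CORES_PER_SM = 32
ROPS_PER_PARTITION = 8

def gf100_config(disabled_sms, bus_width_bits):
    """Return (CUDA cores, enabled ROPs) for a cut-down GF100 part."""
    cores = (TOTAL_SMS - disabled_sms) * CORES_PER_SM
    rops = (bus_width_bits // 64) * ROPS_PER_PARTITION
    return cores, rops

print(gf100_config(5, 256))  # GTX 465 as reported: (352, 32)
print(gf100_config(4, 320))  # earlier rumor with 4 SMs disabled: (384, 40)
print(gf100_config(0, 384))  # full GF100: (512, 48)
```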

Out of the few benchmarks the GTX 465 was put through, it scored 5488 points in the eXtreme preset of 3DMark Vantage, roughly 20% less than what a GTX 470 would manage. It is said to have outperformed the ATI Radeon HD 5870 in Far Cry 2, while it was slower than the Radeon HD 5830 in Crysis Warhead. The GeForce GTX 465 will be launched on the 3rd of June, at the Computex 2010 event.

Swinging the old Far Cry 2 bat again; that game is so NVIDIA-optimized up the ass it's not worth shit. A real game like Crysis, one that demands absolute GPU power, separates the men from the boys.

If it is priced correctly, like at £140 here in the UK, it may do well. But as NVIDIA, AIBs and their resellers in the UK are currently ripping off consumers with their extortionate pricing, I very much doubt it and can see it coming in at £200 or above, in which case they can take a flying f**k.

Why the hell do they even go with 256-bit GDDR5 if they're going to clock it that horribly slow? I really don't understand this; it isn't really hard or expensive to put 5 Gbps GDDR5 on these cards like the 5850s have..... sheez
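The complaint above comes down to simple arithmetic: bandwidth in GB/s is (bus width in bits / 8) × effective data rate in Gbps, so a wide bus is wasted on a slow clock. A quick sketch with illustrative data rates (the GTX 465's actual memory clock isn't given in the article, so these figures are assumptions for comparison only):

```python
def bandwidth_gbps(bus_width_bits, data_rate_gbps):
    """Memory bandwidth in GB/s from bus width and effective GDDR5 data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Illustrative only: the same 256-bit bus at a slow vs a fast effective data rate.
print(bandwidth_gbps(256, 3.2))  # 102.4 GB/s
print(bandwidth_gbps(256, 5.0))  # 160.0 GB/s
```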

This thing won't end up much faster (if at all) than a GTX 285, in fact maybe even slower at first. I guess we'll see, but at this stage it looks like a useless card, unless drivers work serious magic.

I just facepalmed. What is the point of this? Okay, it's for low-end power consumption... I'm done with this card.

The point is to use all the dud Fermis. Wait for the GTX 460, it'll be "better". Anyhow, I'm almost done with ATI; you simply can't buy a 5850, 6 months after release they aren't available (for a decent price). So I might just get one of these or a GTX 460.

If it is priced correctly, like at £140 here in the UK, it may do well. But as NVIDIA, AIBs and their resellers in the UK are currently ripping off consumers with their extortionate pricing, I very much doubt it and can see it coming in at £200 or above, in which case they can take a flying f**k.

The cheapest 5830 I can find is £180, so that's the price point it will try to compete at.
I'll pass.

The problem with all these "disabled" GPUs is that they get less and less efficient. Less performance per watt. Less green. Less cool. Less interesting.

It's like taking a top-of-the-line V12 engine, then making cheaper cars by disabling cylinders. It just doesn't work. Yes, you get a weaker performer at a lower price point, but at the same time, less efficient to boot.

Fermi is failing. While the top end might have had its "top player" credits, these weaker siblings are embarrassing. Really, a 200 W card with only mid-range performance. Nasty.

The problem with all these "disabled" GPUs is that they get less and less efficient. Less performance per watt. Less green. Less cool. Less interesting.

It's like taking a top-of-the-line V12 engine, then making cheaper cars by disabling cylinders. It just doesn't work. Yes, you get a weaker performer at a lower price point, but at the same time, less efficient to boot.

Fermi is failing. While the top end might have had its "top player" credits, these weaker siblings are embarrassing. Really, a 200 W card with only mid-range performance. Nasty.

You sure about that? Because the GTX 470 is one of those "disabled" GPUs, and it has better performance per watt than the GTX 480, is more green (uses less power), and is cooler (when fan speed and noise are the same).

And hey, while we are on the subject, the HD 5850 is one of these "disabled" GPUs too. Let's look at it. Yep, more performance per watt than the HD 5870, more green (uses less power), and cooler (again at the same fan speed/noise level).

Hmmm...

And really, the GTX 470 is probably the least embarrassing of the Fermi cards right now, with very reasonable performance-per-watt numbers actually. If you remove the simply amazing HD 5000 series, which goes way beyond what has been considered normal for performance per watt up until now, and look at all the other cards in recent history, the GTX 470 is pretty good for a top-tier card. It beats out the previous generation's top-tier cards, actually: pretty much the entire HD 4800 series in performance per watt, and the GTX 200 series. That probably would have been considered pretty damn good if it wasn't for the HD 5000 series simply rocking in power consumption.
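The comparison being argued here boils down to one ratio: performance per watt is just score (or fps) divided by board power. A sketch with made-up numbers (all figures below are hypothetical, purely to show why a cut-down part can still come out ahead on efficiency):

```python
def perf_per_watt(score, watts):
    """Efficiency as benchmark score (or fps) per watt of board power."""
    return score / watts

# Hypothetical numbers for illustration only: a salvage part can win on
# efficiency if lower clocks/voltage cut power faster than performance.
full_chip = perf_per_watt(100.0, 250.0)  # flagship: 0.40 pts/W
cut_down = perf_per_watt(82.0, 190.0)    # cut-down part: ~0.43 pts/W
print(cut_down > full_chip)  # True
```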