VGA Testing Methodology

The Microsoft DirectX 11 graphics API is native to Windows 7, which serves as the primary operating system for our test platform. DX11 is also available as a Microsoft update for Windows Vista, so our test results apply to both versions of Windows. The majority of the benchmark tests used in this article compare DX11 performance; however, some high-demand DX10 tests have also been included.

According to the Steam Hardware Survey published for the month ending May 2010, the most popular gaming resolution is 1280x1024 (17-19" standard LCD monitors). However, because this 1.31MP resolution is considered 'low' by most standards, our benchmark performance tests concentrate on higher-demand resolutions: 1.76MP 1680x1050 (22-24" widescreen LCD) and 2.30MP 1920x1200 (24-28" widescreen LCD monitors). These resolutions are more likely to be used by high-end graphics solutions, such as those tested in this article.

Each benchmark test begins with one 'cache run', followed by five recorded test runs. At each setting the highest and lowest results are discarded; the remaining three results are averaged and displayed in the performance charts on the following pages.
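The averaging method above can be sketched in a few lines (a hypothetical illustration; the function name and sample FPS values are our own, not figures from our charts):

```python
def average_fps(recorded_runs):
    """Discard the highest and lowest results, average the remaining three."""
    if len(recorded_runs) != 5:
        raise ValueError("expected five recorded test runs")
    trimmed = sorted(recorded_runs)[1:-1]  # drop the lowest and highest
    return sum(trimmed) / len(trimmed)

# Example: five recorded FPS results for one setting
runs = [58.2, 61.0, 60.4, 59.8, 64.1]
print(round(average_fps(runs), 1))  # → 60.4
```

This trimmed-mean approach reduces the influence of a single anomalous run on the reported result.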

A combination of synthetic and video game benchmark tests has been used in this article to illustrate relative performance among graphics solutions. Our benchmark frame rate results are not intended to represent real-world graphics performance, as that experience changes with supporting hardware and the perception of the individual playing the game.

DX11 Cost to Performance Ratio

For this article, Benchmark Reviews has included cost per FPS alongside the graphics performance results. Only the least expensive product price is used, and the calculation does not factor tax, freight, promotional offers, or rebates into the cost. All prices reflect product series components and do not represent any specific manufacturer, model, or brand. The retail prices for each product were obtained from online retailers on 01-September-2010:
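As an illustration, the cost-per-FPS figure is simply price divided by average frame rate (the price and FPS values below are hypothetical examples, not figures from our charts):

```python
def cost_per_fps(price_usd, avg_fps):
    """Dollars spent per frame-per-second of performance; lower is better."""
    return price_usd / avg_fps

# Hypothetical example: a $199.99 card averaging 50 FPS
print(round(cost_per_fps(199.99, 50.0), 2))  # → 4.0
```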

The PNY GeForce GTX 460 OC XLR8 is a great card, and performs roughly 5% behind the ASUS GTX 460 DirectCU TOP. They both cost about the same depending on where you shop, but the PNY version exhausts air outside the case and offers a lifetime warranty. Considering you could always overclock the PNY card further, it seems like the better deal.

In terms of noise, there's almost none... as in no audible sound. Even with the fan turned to 100% power, the PNY cooler is extremely quiet.

Two of these cards in SLI will be sufficient for me. I'm no world-class gamer; I just like to do a little shooting on occasion and want good performance for a decent price. These cards seem to match the low power consumption and heat output of the Radeon solutions, but they also support PhysX, which Radeon doesn't yet. At this point, it's a no-brainer decision.

Also, PNY has an RMA graphics card promotion specifically for BFG customers. Send in your BFG card, whether it works or not, and you'll receive a 25 percent discount on select PNY video cards. They'll also throw in "Just Cause 2" from Square Enix by way of an online download redemption code. So those of you left holding the bag with malfunctioning BFG cards can get a better deal than the rest of us with PNY, plus a free game. Visit PNY.com or call PNY directly at 1-888-316-1193. The promotion runs until October 31, 2010.

I've been hoping for a review of a reference GTX 460 1GB. Everyone seems to concentrate on third-party cooling models, and I was genuinely interested in seeing how the reference cooler fares in a decently overclocked situation from a noise and temperature perspective. Now I see MSI is introducing the GTX 460 Hawk with its "Twin Frozr" type cooler. Any thoughts on whether it's more effective than the reference design (both noise and cooling)?

We've already reviewed two versions of the reference design, and they're linked in each of our articles. The 1GB GTX 460 review is located here: benchmarkreviews.com/index.php?option=com_content&task=view&id=558&Itemid=72

Just for your information, Twin Frozr performs better than the reference heatsink, but it seems that the Cyclone cooler performs similarly. Overall, the GTX 460 doesn't have temperature problems even with the reference cooler, unless you're running SLI or have poor airflow in your case.

This will be very difficult, since the last few video cards (GTX 465, GTX 460, GTS 450) have all been practically silent. Most sound level meters only pick up noise above 40 dBA, which is louder than these products operate. For reference, my quiet Noctua 120mm fan is more audible than these video cards.

You're right though, I haven't been making mention of the sound levels. I'll update my articles to make note of this. Thanks.

On page 13 of the review (the Metro 2033 page), there are two graphs shown side-by-side in a single image. What does each graph represent? Obviously, they're showing frames-per-second, and I'm assuming that each graph represents a different group of settings, but what are the settings for each graph? Or is it a min/max type of thing? Also, under the graph (also on page 13), we find this statement: "When their flagship GeForce GTX 480 struggles to produce 27 FPS" but the GTX 480 is not represented in any of the graphs on page 13.

I guess I can't edit my comment above - I wanted to add that in the chart at the bottom of page 13 (and several other pages), numerous video cards are listed with their individual specs. The GTX 460 is listed as having 1024MB GDDR5 (1GB) and a 192-bit bus. Don't all 1GB GTX 460s have a 256-bit bus, while the 768MB models have the 192-bit bus?

It seems that my specifications chart got buggered when one row was deleted, pushing the values over for all cards. This chart was then copied to each test result page, unfortunately repeating the error. They've since been fixed.

In Metro 2033, the two results are 1920x1200 and 1680x1050. I've since replaced the chart with an updated image.

I have this very same card from PNY (1GB version) - two of them, actually. And for the life of me I cannot get the fan speed to go past 70%. I have checked everywhere for a PNY BIOS update, and even called PNY, and they had no idea what I was talking about. I even tried the EVGA update that fixes this for EVGA cards, but it doesn't detect my cards. Does anyone with these same cards know a way to get the fan speed to 100%?

People mention noise a lot with these cards; I don't mind noise, I want the fan speed. Any help would be appreciated.

Like most NVIDIA cards since the 400-series, you cannot always reach 100% fan output. Some of the higher-end products that need it will reach this level (GTX 480), but most will not. This isn't a problem with the video card, but rather a programmed function of the BIOS.

Same for me as well (just checked with MSI Afterburner). I haven't had mine on anything but auto since I bought it (I also have the regular 675 MHz version, not the OC). Even clocked at 825 MHz it stays quite cool - no hotter than 68°C.

Yeah. EVGA and Gigabyte have released BIOS updates for their respective cards that allow the fan to reach maximum speed (theirs was also stuck at 70% before the update), while PNY hasn't. Initially my card would stay at about 68°C, even while overclocked. But for a couple of days now, even with the same overclock settings, the card will reach 80°C and crash the game. I purchased the cards the Friday after Thanksgiving, so I got them recently.

Presuming the clock speeds were similar (but preferably the same), you could flash the BIOS from one of the cards with an unlocked fan onto the PNY card. We even have a guide showing you how: benchmarkreviews.com/index.php?option=com_content&task=view&id=205

Yeah, flashing with NVFlash is pretty easy, and I would think a BIOS from a reference GTX 460 of any brand should work. I'm leaving mine alone, though. It runs cool and is incredibly quiet at 40% (which it never goes above on auto). I bought mine in late September and it's been doing great.

I just upgraded from two 9800 GTs in SLI to two GTX 460s. Problem is, I only have two PCI-E 6-pin power connectors. That was fine with the 9800 GTs because they only required one each, but the GTX 460s require two PCI-E 6-pin connectors each. I know I have more than enough power available in my 700-watt PSU. Are there splitters available that can turn a single 6-pin into two 6-pins? Any suggestions?

In a pinch, you can use 4-pin Molex (think old HDD power connector) to PCI-E adapters; most video cards include a couple of them in the accessories. Try to use different branches from the PSU to feed the two cards, and don't daisy-chain everything together if you don't have to. DON'T try to double up on the existing PCI-E connectors.

I only have two available 4-pin Molex connectors, and the adapter that came with the card uses both of them to power one PCI-E 6-pin connector. That leaves me one short - still no good.

So the moral of the story is: don't purchase these cards to replace your 9800 GTs in SLI unless you are sure you have enough connectors on your existing PSU, or unless you are replacing the PSU as well.
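For context, a rough power-budget sketch shows why each GTX 460 needs both 6-pin leads. The numbers below come from the PCI Express specification (75 W from the x16 slot, 75 W per 6-pin auxiliary connector) and NVIDIA's published ~160 W TDP for the GTX 460:

```python
# Rough power budget for one GTX 460 under worst-case (TDP) load.
PCIE_SLOT_W = 75      # PCI-E x16 slot delivers up to 75 W
SIX_PIN_W = 75        # each 6-pin PCI-E connector delivers up to 75 W
GTX460_TDP_W = 160    # NVIDIA's rated TDP for the GTX 460

# Slot plus only one 6-pin lead: 150 W available, card undersupplied.
print(PCIE_SLOT_W + 1 * SIX_PIN_W >= GTX460_TDP_W)  # → False
# Slot plus both 6-pin leads: 225 W available, ample headroom.
print(PCIE_SLOT_W + 2 * SIX_PIN_W >= GTX460_TDP_W)  # → True
```

This is also why splitting one 6-pin into two is a bad idea: the split leads still share a single 75 W budget.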

Noted the question about splitting a voltage source. When one rail is spread over added loops, the available current (amperage) is divided amongst them. This can lead to underpowered units if the current ratings required by the card(s) assume each connector has the full available supply behind it. A switching PSU may accommodate or adjust for this configuration, but to ensure that all units get the required amperage, you should check the connections on the PSU. A modular PSU seems applicable here, as you can add lines as needed, with each (hopefully) getting its full share of the power. I've also learned that the motherboard and BIOS can impose their own power limitations: for example, some Core 2 Quads may require a 105-watt motherboard, and my Intel DG965RY is only rated at 95 watts, so the quad may not be an option for that board - which leads to more questions. Watch the power dividing, as it can lead to underpowered units.