Battlefield 3

Battlefield 3™ leaps ahead of its time with the power of Frostbite 2, DICE's new cutting-edge game engine. This state-of-the-art technology is the foundation on which Battlefield 3 is built, delivering enhanced visual quality, a grand sense of scale, massive destruction, dynamic audio and character animation utilizing ANT technology as seen in the latest EA SPORTS™ games.

Frostbite 2 now enables deferred shading, dynamic global illumination and new streaming architecture. Sounds like tech talk? Play the game and experience the difference!

Our Settings for Battlefield 3

Here is our testing run through the game, for your reference.

While all three cards are able to keep Battlefield 3 running well at 1920x1080, in the FRAPS-based data the HD 7970s in CrossFire (emulating the HD 7990) are clearly the performance leader, followed by the GTX 690, with the GTX Titan rounding things out.

Well, things change quickly around these parts: you can see that for the HD 7990, the observed frame rate after removing any runts or drops has come down considerably.

Our plot of frame times from the Frame Rating capture technology shows two interesting items. First, the HD 7970s in CrossFire produce alternating fast/slow frame times, usually indicative of runts: frames that take up so few of the screen's scanlines that they aren't positively affecting apparent performance. Also, even though the GTX 690 has better frame rates than the GTX Titan, it definitely has more frame time variance, as evidenced by the wider blue band of color on the image above.

Minimum frame rate percentiles after taking out the runts show an observed average of about 102 FPS for the HD 7970s, 120 FPS for the GTX Titan and 140 FPS for the GTX 690.
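To illustrate how filtering runts changes an FPS number, here is a minimal sketch; the 20% scanline threshold and the per-frame height data are our own illustrative assumptions, not the actual Frame Rating toolchain:

```python
# Hypothetical per-frame scanline heights from a capture overlay.
# A frame is treated as a "runt" if it occupies too little of the screen.
def observed_fps(frame_heights, total_scanlines=1080, runt_fraction=0.2,
                 capture_seconds=1.0):
    """Drop frames shorter than runt_fraction of the screen's scanlines,
    then report the frame rate a viewer actually perceives."""
    threshold = total_scanlines * runt_fraction
    full_frames = [h for h in frame_heights if h >= threshold]
    return len(full_frames) / capture_seconds

# Alternating full/runt frames: FRAPS counts all 8, but the viewer sees 4.
heights = [1080, 30, 1080, 25, 1080, 40, 1080, 15]
print(observed_fps(heights))  # 4 full frames over 1 second -> 4.0
```

This is why an HD 7970 CrossFire result can be "fast" in FRAPS yet much slower once runts are removed.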

But once we look at the variance picture again we find that the GTX 690 and the GTX Titan have swapped places, with the single GPU performance of the Titan resulting in a smoother overall experience than even the GTX 690. Both NVIDIA solutions are drastically better than the HD 7990 / HD 7970s in CrossFire.

At 2560x1440 the FRAPS results look pretty similar to those above...

But once we take away the runts and drops we find the HD 7970s in CrossFire fall behind the performance of both of the NVIDIA GeForce cards.

Ouch, another blanket of color from the Radeon solution that indicates unsmooth and inconsistent frame rates! If we look just at the NVIDIA side of the equation, we again see a thinner band of color on the GTX Titan results, indicating tighter and more consistent frame times throughout the benchmark run.

Here is an individual run graph for the HD 7970s in CrossFire to help demonstrate how the runts cause the observed frame rates to be lower.

And the two runs for the GTX 690 and the GTX Titan do not indicate any runts at all...

These minimum FPS percentile charts show some pretty dramatic differences, even between the competing NVIDIA options. The HD 7970s in CrossFire average around 60 FPS, the GTX Titan at 75 FPS and the GTX 690 at 95 FPS.

But the frame variance results, our ISUs (International Stutter Units), once again prove that the single-GPU solution has a more consistent and fluid frame time result, with the HD 7970s in CrossFire really separating themselves (not in a good way) starting at the 80th percentile.

Even though we only have NVIDIA results for 5760x1080, due to the extreme number of dropped frames on the HD 7990 / HD 7970s, comparing these two options is interesting. In both FRAPS and observed average frame rates, the GTX 690 shows as the faster of the two options, running faster than the Titan the entire time.

But the plot tells an interesting story: the frame times on the GTX 690 are not as consistent or as smooth as on the GTX Titan. They average much lower, based on where the bulk of the blue band resides in comparison to the green band, but the spikes that show up on the GTX 690 are gone completely with the GTX Titan.

If we look at only the minimum FPS marks we find the GTX 690 to be 33% faster in average frame rate over the entire run, but based on the graph above (and the one below) that isn't the whole story.

Here we see the result of all of those "spikes" in frame times - a pretty sizeable difference in frame variance going from the GTX 690 to the GTX Titan. While the Titan never has more than 2.5 ms of variance from one frame against the running average of the past 20, the GTX 690 has 8-9 ms jumps at times, which will likely cause some noticeable stutter.
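As a rough sketch of how such a variance number could be computed, comparing each frame against the running average of the preceding 20 (our own simplification with made-up frame times, not necessarily the exact Frame Rating math):

```python
from collections import deque

def frame_variance(frame_times_ms, window=20):
    """For each frame, report how far its frame time deviates from the
    running average of the preceding `window` frames."""
    recent = deque(maxlen=window)  # ring buffer of the last `window` times
    variances = []
    for t in frame_times_ms:
        if recent:  # the very first frame has no history to compare against
            variances.append(abs(t - sum(recent) / len(recent)))
        recent.append(t)
    return variances

# A steady 10 ms stream with one 18 ms hitch shows up as an 8 ms jump.
times = [10.0] * 20 + [18.0] + [10.0] * 5
print(max(frame_variance(times)))  # 8.0
```

A single 8 ms jump like this is exactly the kind of spike described above for the GTX 690, while the Titan's curve stays within about 2.5 ms.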

There are two takeaways from this first page of results. First, the AMD Radeon HD 7990 or HD 7970s in CrossFire are not going to compare well to the GTX Titan or GTX 690 in many cases because of the runt and dropped frame issues we have detailed. Second, while the GTX 690 may be "faster" than the GTX Titan in Battlefield 3, at higher resolutions and especially for multi-monitor situations, the GTX Titan looks to provide the better overall experience.

As you may also have noticed, the NV drivers were older (314.07 & 314.09 for TITAN). This is because he started these time-consuming tests a while ago; at that moment the AMD driver wasn't old. :)

Thanks Ryan for the interesting Frame Rating tests so far (and upcoming). Would it be possible to add 3-way & 4-way multi-GPU setups for an extra review in the future?

I doubt it eliminates the stuttering problem; it just increases FPS to the point you can't see it anymore. NVIDIA doesn't have stuttering because of what they do in hardware to prevent it, which AMD doesn't, but supposedly AMD is going to release a fix in July, which would have to be a software fix.

Oh, Ryan you rascal! Are you not familiar with the MO of the AMD diehards? These guys are more deluded than the Westboro Baptist Church.

Let me bring you up to speed.

The latest beta driver, released at 3am the night before, not available on the official AMD website, and available only through obscure links passed around by the AMD diehards, is the JESUS DRIVER THAT WILL FINALLY CHANGE EVERYTHING! It will unlock all the untold power that AMD diehards just KNEW was in their cards all along. This situation repeats EVERY SINGLE TIME a new beta driver is released, publicly or not.

To be perfectly honest you're throwing the diehards way too much of a bone by using beta drivers to begin with. You wouldn't review beta (non-production) hardware, now would you? But that's another discussion for another time.

I really am sorry Ryan for all the venom some people will spit at you now and in the future because you're the first tech journalist in quite a while to actually do some investigative journalism instead of being a press release parrot. There are a lot of people that appreciate all your hard work, please don't forget that. The real value in what you're doing is not just in evaluating current hardware and making better buying decisions, but in shaping the future of the industry. Because of your hard work, future hardware WILL run games more smoothly than they would otherwise, and for that we thank you. Thanks, Ryan.

Hope and beta driver change. Praise be the new frame rate!
The future is here and this is the driver we have been waiting for! Honor be upon Catalyst Maker, the oceans are receding and all crossfire rigs now rise with this tiding of good joy!
Banished are the runt frames, in Abu Dhabi's name.
AlluAMDahkbar!

People are getting too hung up depending on their preference. This is genuinely interesting stuff that hasn't been applied by end-user review sites before. Both IHVs have resources that are beyond anyone else's. I don't buy for a minute that any of the major IHVs doesn't analyze the render pipeline in great detail.

Hey guys, these articles really confirm what I saw with a CrossFired HD 6950 setup: I noticed big stuttering (3DMark 11 springs to mind) on the dual-card setup even though the FPS were double those of the single-card setup.
With one card it was much smoother even though the FPS were half.
I ended up putting one card in a drawer, so that was a big waste of money.
Now running a single GTX TITAN.
I will only be running single GPU from now on, so I bought the fastest single GPU...... Sorry AMD, I've been running you for years but you don't have the fastest GPU. Good luck anyway :)

One exception I would like to mention: I remember running Crysis Warhead with CrossFire at 1920x1200, enabling VSYNC,
and being able to run at 60 FPS constantly. This setup didn't seem to be stuttering at all, so it would be interesting to test CrossFire with VSYNC when the game permits at least a constant 60 FPS.

Going to reply to my own comment: it seems people get really excited about this and start trolling and calling people names. Quite disappointing. Now, I'm 41 years old, maybe that's the problem :-) For all those people who are able to give their opinion without being rude to others, kudos, your comments are well appreciated.

Hear, hear, sir. This guy is 54 years old and has nothing better to do than troll this awesome tech site. If you are interested in a laugh, I have started a little thing called the John D. Show. You should check it out!

You are right. I was wrong: third parties used a Lucid chip in the past for dual GPU instead of the AMD reference bridge. But the latency remained the same except for the case where the dual-GPU board was paired with at least a third card:

It also CANNOT be reproduced as the boards that are currently 7970 x 2 are NOT 7990!!!!

The 7990 is going to be based off of a GCN respin similar to the 7790 (a respin). The conclusions cannot even accurately refer to the 7990 without pointing out this is an NVIDIA-sponsored site, as the 7990 is not even released.

What a joke to discuss an unreleased part and not only that, reflect so negatively on it.

Good read, I like the new charts. They are easy to follow. I think these are the best looking graphs I've seen on the subject, so keep up the good work.

I happen to be a 3D Vision user. I play it with just about any game I have, and when it doesn't work, I use a Helix mod in most cases.

Anyways, I'd be very interested in seeing how Crossfire/SLI works in 3D, be it HD3D or 3D Vision. Theoretically, I believe their results should be similar to a single card setup in variance, but much faster, though some confirmation would be nice. This is because two images are made for each frame, so each card will start a frame at the same time for delivery at the same time, while something like the Titan would have to create two separate images for each frame.

Maybe just one article on it at some point would be nice just to paint a picture. We wouldn't need a lot of them if my theory is correct.

Running TriDef 3D in passive mode instead of with shutter glasses, the strangest thing I've discovered is that the same FPS feels faster.

3D Vision shutter glasses in surround with the 400/500 series (it had to be SLI for surround to work with reference boards) had stutter, according to simracers.
But that was fixed using .ini configs that PC racing simulators had documented for quite some time.
Those .ini configs, I discovered years later, lead to the same results as nVidia's smoothing techniques.

I second the 3d testing please. I'm running 3d vision surround with Titan SLI and would love to see frame time effects with 3d enabled. To me 3d with Lightboost is much smoother than non-Lightboost so I believe the display used will affect your perceived stutter also.

I think most of us understand that 2x 7970s aren't the same as the upcoming 7990, but we appreciate that Ryan and his team used what is available currently from AMD as a comparison to Nvidia's high end solutions.

Yes, but the article is deviously worded to portray the unreleased 7990 in a bad light. It's true the 7970 x2 may be pretty bad, but it should be discussed as what it is, not used to portray an unreleased card in a bad fashion!

I'm curious, have you considered using a dual-socket LGA 2011 system with 128GB of RAM as a RAMDisk for temporary storage on the capture system?

Obviously such write speeds aren't needed right now, but should a capture card capable of 4K@60Hz or 1600p@120Hz ever be made, it might be cheaper to use a RAMDisk than to buy the enterprise SSDs that would be needed to capture at such speeds.

PCPER still shilling hard for nvidia. 7990 isn't even released and he is 'reviewing' it here by using two 7970s.. what a joke this site is.

Without knowing what the 7990 actually has on board or what the drivers will be like for it, he is 'reviewing' it now and using the 7990 name. Never mind that the 7990 is using two of the new Malta cores, not the Tahiti cores in the 7970.

You're a joke, dude. Shill harder for nvidia why don't you. When is the next nvidia card release where you'll once again have their PR man and yourself shilling away in your live stream (free nvidia advert)

Same guy who was releasing information using nvidia's toolset without disclosing that everything was being done with nvidia's help, while decent review sites didn't do anything until they first came out and said the tools are from nvidia.

Yeah, the 7990 is just two 7970 GPUs connected via a PCI-E 3.0 bridge chip that then communicates with the system at x16 PCI-E 3.0 speeds. It might show a small improvement in communicating with each other, but it is going to be minimal. The problems that AMD have will require a pretty hefty driver revision to smooth things out.

I was questioning that, but as I discovered it wouldn't make any difference.

And PCPER never backed nVidia in the articles I've read.
I had an AMD 6900 and an nVidia 570, and a frame limiter plus adaptive vsync or x-buffering eliminates stutter in the racesims I play, giving the same experience as with the nVidia 500.

But as PCPER says: although an available solution (via RadeonPro), this is NOT an AMD control panel option and is not supported by AMD.

The maker of MSI Afterburner and EVGA Precision has written the FCAT-type color bars into the new releases.

The problem for you is amd is fail, and the new evidence is unassailable.

The last cry of the dying, lying cf breed, total annihilation is moments away.

None of us will be awaiting your certainly never-forthcoming apology; I, however, will be enjoying your delicious tears as the full implications of total epic CF AMD fail sink home into the thick, biased skull of the AMD fanboy water carrier.

BWAHAHAHAAAAAAAAAAAAAAAAAAAAAA !

You may buy the newly released book at online shops everywhere:

: "Death of the amd fanboy"
The lies, the fantasies, the obstinate denial, years in the stuttering darkness, and the gruesome ending when full exposure and half the frame rate ripped their living guts out. Delicious amd fanboy tears ending. Don't miss it !

Isn't that where GPUView from Microsoft comes into play? That seems to be a better tool than this is.

Nvidia's FCAT grabs the information overlay at the point where Fraps takes it, at the beginning of the pipeline, and merges it into the frame output at the end of the pipeline.
It's supplying two pieces of information from different points of the pipeline and presenting them as one.

That's an odd way of measuring things, to say the least, especially if you're discounting what goes on in the middle.

Ryan... if your theory is correct that this problem only occurs with CrossFire when the GPU is the primary bottleneck, then this should be very easy to test just by dropping the graphics settings/resolution and seeing if it goes away? I'm sure you've already thought of doing this.

Great job Ryan! Kudos to you and PCPer for going through all this testing.

This confirms what I suspected. I have been using ATI/AMD cards in single and xfire configurations for many years and always felt there was something amiss with dual gpus, but could not pinpoint it before now.

The Frame Variance Graph: "...What this does NOT really show are the large hitches in game play seen as the spikes in frame times. Another stutter metric is going to be needed to catch and quantify them directly..."

As an example: If Average FPS (over 60 seconds) is 100, then Total Frames observed over 60 seconds is 6000.

If ONE SINGLE FRAME is above 100ms, then for the y-axis value '100' (milliseconds), the x-axis value will be '99.9833' (percentile), i.e. one minus (1/6000).

If FOUR FRAMES are above 30ms, then for the y-axis value '30' (milliseconds), the x-axis value will be '99.9333' (percentile), i.e. one minus (4/6000).

If TEN FRAMES are above 20ms, then for the y-axis value '20' (milliseconds), the x-axis value will be '99.8333' (percentile), i.e. one minus (10/6000).

Therefore, instead of PERCENTILE on the X-AXIS, you should put NUMBER OF FRAMES on the X-AXIS.

Following our example, for the y-axis value of '100' (ms) the x-axis value will be '1' (frame), for y-axis '30' (ms) the x-axis will be '4' (frames), for y-axis '20' (ms) the x-axis will be '10' (frames), and so on.
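To make the mapping concrete, the same arithmetic in Python (made-up frame times matching the example above):

```python
def frames_above(frame_times_ms, threshold_ms):
    """Count frames slower than threshold_ms, and the percentile that
    count corresponds to on a cumulative frame-time chart."""
    slow = sum(1 for t in frame_times_ms if t > threshold_ms)
    total = len(frame_times_ms)
    percentile = 100.0 * (1 - slow / total)
    return slow, percentile

# 6000 frames (100 FPS average over 60 s), four of them above 30 ms.
times = [10.0] * 5996 + [35.0] * 4
print(frames_above(times, 30.0))  # 4 slow frames -> the 99.93rd percentile
```

With only a handful of outlier frames, the percentile axis compresses them into the last fraction of a percent, which is the commenter's point about plotting a raw frame count instead.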

Fantastic article Ryan. I think this will be the reference for future benchmarks and keep all GPU manufacturers honest! In the podcast you mentioned having uploaded a non-YouTube video without the compression artifacts of Flash and YouTube. Is there anywhere I can download the video for the Skyrim comparison?

Honestly, running RadeonPro with DFC and vsync fixes most of these games. I have played Sleeping Dogs at 60 fps and it was glass smooth with RP. Most of the CFX problems occur without vsync on, which in my opinion is not an option. So I've never had these so-called runt frames affecting my gaming by default. I'm playing Crysis 3 atm with RadeonPro DFC and vsync and it's pure smoothness. Sure, the current drivers are a problem for people who game without vsync, but I don't see why anybody would game without vsync.

It's interesting that they recommend people not buy AMD cards just because they can't be bothered to run a 3rd-party app. Suggesting things to fix the issue instead would help gamers more.

Noticing the lying liars and their rager reactionary scunge brains borging out about the calamity of their full facepalm after YEARS of hearing them scream two amd crossfire For The Win !!!!!! does not a fanboy maketh.

As I said at Techreport, when it comes to single GPU, framerates are all that matter on most occasions, so don't be put off buying a 7870/7950/7970 because of XF results.
I have a single 7950 and it's killer

That's fine, but all those years of squawking "future proof" and "buy one AMD card now because in the near future you can go with CF and have a huge boost when you need it"....

BLAH BLAH BLAH BLAH BLAH is what those endless YEARS of amd justifications now mean.

I tell you one thing, I noticed a year ago how some of these websites started shying away from dual-card recommendations, which of course means they all had the info and an inkling of this problem with AMD CrossFire, and no doubt nVidia was hammering their little eardrums and deep down inside they knew - not to mention when one of them did talk, they posted how CF failed in no less than 50% of the popular games they were using in their testing suites... it became IMPOSSIBLE to recommend CF with a clear conscience, but the fanboy in them made certain they included SLI in the "do not recommend" category as well.

This is what I saw develop, and there is no doubt why it went that way. When the crap underdog card is failing miserably but the cover is still not blown, make certain to cut down the competition, too; that way, you never really fully recommended the failing dirty amd dog double fps lying CF setups...

It's so sad. It's so facepalm. It has made every single site that does reviews look like clueless fools, and in fact, they WERE for YEARS.
Credibility ?
Not so much, so very little. So very, very little.

Ryan, thank you so much for bringing this issue to light!
I have an Xfire-7970 setup and because of your research, AMD will have no choice but to remedy this problem (eventually).
Thank you, your hard work is greatly appreciated.

I for one would also like to see "Frame Rating" reviews done for all single GPUs, both discrete and integrated. That includes AMD APUs, Intel's HD series, Nvidia's Optimus, and AMD's Enduro. Laptop & desktop, both single GPU and/or CF/SLI.

Since we can now see that Fraps numbers can be suspect and that drivers/implementation can result in runts/missing frames I want to see if any of the above mentioned systems result in lower actual FPS than the Fraps number shows.

The amd fanboys are in TOTAL DENIAL, and there are only 2 posters who actually read the articles or gave themselves clue one by studying just a bit.

The ENTIRE 33 pages of comments there has not ONE, I repeat NOT ONE comment that points out in CF in the game they are moaning about that EVERY OTHER FRAME WAS ENTIRELY DROPPED FROM THE END USER SCREEN.

So there you have it - the fanboy brain wins out over all other facts, including the often tried and true massive ignorance, total lack of reading, completely ignorant BLISS accompanied by the raging radeon fanboy screed "It just can't be!"

So their tinfoil DUNCE caps are lofted upon their heads in FULL SPLENDOR. It is absolutely amazing.

It also appears that just one commenter on the entire 33 pages had any inkling that the frames presented to the gamer were ALL CAPTURED in real time, and could be gone through manually frame by frame by frame SO THAT NO OVERLAY FCAT CREATED BY NVIDIA COULD BE A BIAS ISSUE !

So expect, I'd say, about 5 or 10 YEARS before the amd fanboys finally admit runt and dropped frames ever even happened, and 5 or 10 years from now they will repeat the current refrain: "AMD driver problems are a thing of the past and the drivers are equal with nVidia, who've had problems too!" > ...link....(from 5 years ago).

Just remember people, the average human is a C grade, and half the people are DUMBER THAN THAT !

So you have to spell it out EXPLICITLY to them, directly, in simple retard friendly terms... then explain how their conspiracy theory about it's all one big lie is not actually possible, because there are things called FACTS.

AMD was DROPPING EVERY OTHER FRAME IN CROSSFIRE IN SOME GAMES. THAT MEANS THAT FRAPS FPS WAS 100% TOO HIGH! CUT IT IN HALF, YOU HAVE THE EQUIVALENT OF "JUST ONE CARD".
ADD IN THE STUTTER FROM EVERY OTHER FRAME BEING DROPPED OR TOTALLY RUNTED, AND GUESS WHAT AMD FANBOYS ?

If you are a low C, or below that, or a B, or claim to be an A student, who cares, the amd fanboy in you will win out over all the collective intelligent consciousness in the entire multiverse.

I just love how many fine young CrossFire combo owners come here and in such a polite way tell us stories about their many-years-long unsolved issues with their expensive, totally useless gfx configs..... I don't know if I should laugh when such nicely mannered "AMD owners" don't react in a more believable way to all this bashing from the green side. Amusing masquerade. Many red masks with green smiles underneath.

Ryan,
Don't worry about the negative and biased comments.
Thank you for this great review; it has opened my eyes to the cause of these problems, and hopefully to a new way to review all graphics cards in future, instead of just looking at the highest FPS numbers.
I have always thought a smooth experience is better than fast (high FPS) but choppy visual gameplay.
Hopefully AMD and Nvidia will consider these issues in their next GPU and/or driver releases now that it has been exposed, rather than targeting figures. This means a better gameplay experience for the consumer.
Thank you, and keep up the good work.

I think that instead of the percentile curve you could reach a more meaningful result using a derived curve (the derivative of the frame-time curve).
Let's say that the average is 60 fps.
Now let's say that 20 percent of the frames are 25 ms (40 fps).
The difference is how these 25 ms values are spread through the curve: whether they are all together, or alternated with 17 ms ones, forming a saw-like shape in the curve.
You will not have the same feeling stutter-wise (and here I am not saying anything new).
What I want to say is that the percentile graph is not appropriate for the kind of analysis you are doing. You should use a derived curve, since the derivative of a function measures how quickly the curve grows (negatively or positively), and this is not measured by the percentile curve. After this you could measure the area under that curve and arrive at a single number for the amount of stutter. In fact, in this way you would take out of the equation the part of the frame-time curve that is below the average but runs steadily (something the percentile curve can't do).
Calculating the area of the derivative of a very saw-like frame-time curve, you would obtain a high number, whereas calculating the area of the derivative of a smooth (even if varying) frame-time curve, you would get a very low number. This would tell you how smooth the transitions are, not whether the GPU is powerful enough to make the game playable; for that you should check the average fps.
So in the end, if you get decent fps and a very low value for the area of this function, you get a great experience;
if you get decent fps but a high derived-function area value, you get a stuttery experience.
If you get low fps and a low value, you have an underpowered GPU but good smoothness.
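Sketching that proposed metric in Python (illustrative numbers only; the "area" here is simply the sum of absolute frame-to-frame changes):

```python
def stutter_area(frame_times_ms):
    """Sum of |frame-to-frame changes| in frame time: the 'area' of the
    derived curve. A sawtooth pattern scores high even when a
    slow-but-steady run scores near zero."""
    return sum(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))

steady = [25.0] * 8            # consistent 40 fps: smooth, just slow
sawtooth = [17.0, 25.0] * 4    # alternating 17/25 ms: stuttery
print(stutter_area(steady), stutter_area(sawtooth))  # 0.0 56.0
```

Both runs have similar average frame rates, but only the sawtooth one accumulates "area", which is exactly the separation between playability and smoothness the post argues for.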

Quick, Google "geforce frame metering" and you will find out why nVidia cards rarely have runt frames. In fact, nVidia cards DO produce them; they just delay those frames a bit to match the pace of the other good frames, so the frame time chart miraculously looks good.
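A toy model of that pacing idea in Python; this sketches only the concept of holding frames back for even spacing, not NVIDIA's actual metering algorithm:

```python
def meter_frames(arrivals_ms):
    """Toy frame metering: each frame is released no earlier than the
    previous release plus the average interval, so bursty arrivals come
    out evenly paced at the cost of a little added latency.
    Expects at least two arrival timestamps."""
    pace = (arrivals_ms[-1] - arrivals_ms[0]) / (len(arrivals_ms) - 1)
    releases = [arrivals_ms[0]]
    for t in arrivals_ms[1:]:
        releases.append(max(t, releases[-1] + pace))
    return releases

# AFR-style bursts: pairs of frames arriving 2 ms apart every 20 ms...
raw = [0.0, 2.0, 20.0, 22.0, 40.0, 42.0]
# ...are spread out to roughly even ~8-12 ms presentation intervals.
print(meter_frames(raw))
```

The delayed frames explain the trade-off in the comment: the frame-time chart smooths out because some frames are intentionally presented later than they were rendered.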