MSI R7970 Lightning Review: AMD's HD 7970 Gets the Treatment

Test Setup and Results: 3DMark 2011

Test Setup

I am again going against the grain here by testing this card with an AMD processor. While at ultra-high resolutions this will not make much of a difference, we do see some less-than-stellar performance at 1680x1050. Above that resolution it all scales pretty much the same as when using a high-end Intel processor.

With the GPU Reactor uninstalled we see the usual setup. The card will run perfectly fine without the GPU Reactor in place.

There are pros and cons to this type of testing, but at the very least it will help those who are undecided about what card to purchase when they are not running the world's fastest processor and playing games at ultra-high resolutions. If they are playing at 1680x1050 with an AMD Phenom X4, and they can get very similar performance from a (now relatively cheap) GTX 580 as compared to the HD 7970, then they have their answer.

For comparative testing I used the previous two Lightning parts: the R6970 Lightning and the N580GTX Xtreme Edition Lightning. I also threw in one of the last HD 6950 cards to be released, MSI's 1GB HD 6950 Twin Frozr III. The Catalyst 12.2 WHQL drivers were used for the AMD cards, including the HD 7970, and the NVIDIA 295.73 WHQL drivers were used for the comparative NVIDIA part.

Three cards and two generations of Lightning products. Top to bottom: R7970, N580GTX, and the R6970.

The Silverstone Raven RV02 case was used in testing. This is a well-ventilated case which should provide a real-world cooling experience for the cards used. I believe this gives more accurate results than an open-air testbed, where airflow is decidedly non-standard (and sometimes non-existent).

The previous two cards did not hurt for outputs, but the R7970 puts them to shame. Sort of.

Results

3DMark 11

The latest 3DMark features some heavy DX11 content with plenty of tessellation to make most video cards cry. The Performance preset was used.

There are a couple of things to remember here. All of the Lightning cards are overclocked, usually quite significantly above reference speeds. The other thing is that the N580GTX is a 3 GB card, just like the R7970 being reviewed here today. If this card were compared against the stock versions of the other cards, the differences in performance would be much more dramatic. Even so, the HD 7970 outpaces the competition to a great degree in these tests. Add in the fact that if the LN2 BIOS setting were enabled instead of the Normal BIOS mode used here, it would add a couple percent more performance throughout.

Gotta work with what I have. The performance of a standard HD 7970 is not exactly a secret, so I decided to test it against the two previous Lightning cards to really detail what a user gets when upgrading to this particular overclocked monster. In hindsight it would probably have behooved me to lower the clocks on this card to standard settings and go from there. I will certainly keep that in mind next time I test an overclocked product like this. Also, Ryan is in Kentucky with the standard HD 7970s, and I live in Wyoming. Swapping parts between the two areas is a bit troublesome.

For the $50 increase in price over a stock HD 7970? Yes, absolutely. But you must remember that this is a brand new product, and the GTX 680 is still not out in force. Once that happens, I am sure the pricing dynamics of these cards will change drastically. I am judging this card by what is available today. So yes, at $599 it is a good card. Two months from now, when there are many different examples of not just HD 7970 cards, but also GTX 680 cards... it might not look like such a nice product at that price. I am pretty sure, though, that prices will drop dramatically during that time to keep it competitive with other offerings.

It most certainly is not a reference PCB. A reference board has a 5+1 power phase setup (iirc), while this one has 17 total phases. If you look at the pictures of the boards from behind, you can see that the PCB is smaller at the front of the card (display outputs), then gets taller after the CrossFire connectors. It is also longer than the reference design. This is a much larger PCB to accommodate the extra power phases, as well as to give the necessary room to optimize trace pathways to the different components.

Nope, the new GTX 680 can output to 4 monitors in total with one card. It will only do 3 monitors in NVIDIA Surround, with the 4th acting as an "accessory monitor" when using 3D applications. So, users no longer require 2 NVIDIA video cards in SLI to drive more than 2 monitors.

Obviously I haven't been given a timeline for the eventual GTX 680 Lightning card, but I would expect it to be around 3 months away due to the shortage of chips and the design time for the product as a whole.

GTX 680 chips are scarce. TSMC shut down the 28 nm line in mid-February, and I am unsure if they have started it back up again. NVIDIA got a couple of complete chip shipments from them, but I think that until manufacturing starts up again, supply is going to be super tight. So tight that guys like Asus, MSI, and others will not have the amount of product on hand to create a second line of non-reference cards.

Well, the long and short of it is... NVIDIA set a date for release assuming that TSMC would be able to continue to process wafers at a certain rate until that date. TSMC dropped all production after NVIDIA had set the release date. NVIDIA had enough product out to release the card and have some decent numbers in retail, but after that it would be touch and go. I have heard that the beginning of April will have more cards available than at launch, but the big question is availability after that. I guess time will tell, but from what I am hearing availability might be scarce for a while.

Remember those special Phenom IIs that were aimed at the LN2 crowd, of which only around 1000 were made? This seems to be along the same line of thought: leakier, hotter-running chips that take extreme cooling really well. On air cooling, not an impressive overclock... on LN2, the sky is the limit. So yeah, I would imagine my sample might do well under LN2.

It's weird. I was looking at it this morning. Chips at around 65% quality, with 1175 mV and 1250 mV voltages, got up to around 1250-1300 MHz, some on air, some on water cooling, but everything was drastically varied.

It just seems odd that these things with the same "quality" don't have the same characteristics. But then again, when you have 40+% leakage, that is a lot of heat in such a small area. I know that, for me, I have to re-wire my case and move the HDD up to the top to free up the bottom two 1200 intakes for the GPU. Working on that this weekend, but yeah.

Thanks for the insight, man. I'll throw something up and try to OC for sure.

If I could add voltage to the memory then I would be able to push that up as well, but right now it's locked? I read a post that said to set the PowerTune limit to +20%, but the memory voltage didn't change. Perhaps I need to switch to the LN2 BIOS selection?

EDIT: Had to back things off a bit, way too much voltage I'm guessing, but I ended up at 1175/1435 (core/memory).

Surprised the memory doesn't go any higher than that for you. Also, set PowerTune to +20%. That essentially increases the amount of power available to the GPU. PowerTune was put in place so that in corner cases like FurMark, the board/chip would not exceed the rated TDP (and shut down).
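
To put rough numbers to that, here is a minimal sketch of the PowerTune arithmetic, assuming a 250 W board power figure purely for illustration (not a measured value for this card):

```python
# Minimal sketch of how a PowerTune offset scales the power budget.
# The 250 W figure is an assumption for illustration only, not a
# measured value for the R7970 Lightning.

def powertune_budget(board_power_watts: float, offset_percent: float) -> float:
    """Return the power ceiling the card will throttle to stay under."""
    return board_power_watts * (1.0 + offset_percent / 100.0)

if __name__ == "__main__":
    default = powertune_budget(250.0, 0.0)   # 250 W at the default setting
    raised = powertune_budget(250.0, 20.0)   # 300 W with the slider at +20%
    print(f"Default budget: {default:.0f} W, +20% budget: {raised:.0f} W")
```

In other words, the slider does not add voltage anywhere; it just raises the power ceiling the chip is allowed to run up against before clocks get pulled back, which is consistent with the memory voltage not changing when you moved it.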

I got it up to 1220/1520. The memory is about done; even 5 MHz more and I get corruption. I don't know whether to add a ton more voltage (I have slider space for another 75 mV). I tried 1240 on the core, but got some severe image corruption.

I think I will try some stability testing: just let it run for an hour or so instead of 5-10 loops of the Metro benchmark, and see if anything ends up happening in terms of corruption. As far as voltage goes, I'm not sure if there is a "less is more" type of approach, or if you simply add more when you reach corruption.
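
For what it's worth, a time-boxed run like that could look something like the sketch below; the benchmark command is a hypothetical placeholder (the Metro benchmark runs through its own launcher), so treat this as an outline rather than a drop-in script.

```python
# Sketch of a time-boxed stability loop. BENCH_CMD is a hypothetical
# placeholder; swap in whatever benchmark you can actually drive from
# the command line.
import subprocess
import time

BENCH_CMD = ["metro_benchmark.exe", "-preset", "high"]  # placeholder, not a real tool
RUN_FOR_SECONDS = 60 * 60  # roughly an hour instead of a handful of loops

start = time.time()
loops = 0
while time.time() - start < RUN_FOR_SECONDS:
    loops += 1
    result = subprocess.run(BENCH_CMD)
    if result.returncode != 0:
        print(f"Benchmark exited abnormally on loop {loops}; the clocks are suspect.")
        break
else:
    print(f"Completed {loops} loops without a crash; image corruption still needs eyeballing.")
```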

From my Computer Engineering background I know a bit about how ripple and the like affects everything, and I'm not sure just how much ripple is being introduced and so forth, but, needless to say...

Wow, not a bad overclock at all. I had overclocked the one I have to 1100 MHz... but I was playing Oblivion, so I don't know if the video card was causing problems or if that rather unstable game was...