eVGA e-GeForce4 MX 440 Review


By Matt Hanyok

If you've asked about upgrading your video card on our message boards, or in any of the other places where those in the know answer questions, you've probably heard something like this: "Don't buy a GeForce 4 MX card. It's not a real GeForce 4."

The GeForce 4 MX series was introduced alongside the GeForce 4 Ti series back in February, and was covered in the same preview article. I stand among the many who are confused by NVIDIA's decision to call this chip a GeForce "4" when it is in reality closer to a GeForce 2 than a GeForce 3. I'll get into that later.

The MX 440 sits in the middle of the GF4 MX series; the GF4 MX 460 has yet to see the light of day on retail shelves. eVGA's MX440 card comes equipped with 64MB of 200MHz DDR memory, a 270MHz core clock, and VGA and S-Video outputs. It also includes eVGA's "ACS" -- Asymmetric Cooling System -- which looks kind of cool.

As you'll see in a second, the heatsink is very odd looking. But consider the design: the fan is not located directly over the chip, which means the air it takes in is somewhat cooler than if it were sitting right on top. The fan can also use the entire inner surface of the enclosed heatsink for heat dissipation, and the exhaust air is directed away from the video card, out past the memory chips, keeping a nice airflow over them. It's hard to tell from the picture, but the heatsink isn't very tall, so the card won't block your first PCI slot. eVGA has more information on the ACS available here.

This is another of the many video cards in the sub-$200 price range competing for your attention. Competition in the 'budget' video card arena is pretty stiff right now; let's see how the GF4 MX stacks up.

The Test System

AMD Athlon XP 1800+ (1.53GHz)

NVIDIA nForce 420-D reference motherboard, nForce system drivers 1.03

512MB Corsair XMS2400 DDR memory

ATi Radeon 8500 64MB

Elsa Gladiac 920 (GeForce 3)

NVIDIA GeForce 4 Ti4200 reference board

eVGA e-GeForce4 MX 440

Windows XP Professional, DirectX 8.1

The Radeon card was tested using beta drivers version 6052; the NVIDIA cards were tested using the latest official Detonator release, version 28.32 (available at NVIDIA's website). Anti-Aliasing testing used 2X Quality Smoothvision for the Radeon and Quincunx for the GeForce cards. The GeForce 4 MX series has the same Accuview AA engine present in the GeForce 4 Ti chips, so its Quincunx mode delivers excellent image quality compared to the GeForce 3's.

Though these tests will make the major fault of the GeForce 4 MX series plain, it's worth spelling out here so everyone understands. As I said above, the GF4 MX cards are close to the GeForce 2 in design. Take all the functionality of the GeForce 2 MX, crank up the clock speeds, throw in Accuview and LMA-II, and you've got (basically) a GeForce 4 MX. Not only do these cards lack any shader capabilities, they can't even complete the environment bump mapping tests in 3DMark. Yes, the name is misleading.

The 3DMark 2000 and 2001 scores are kind of surprising. The Anti-Aliasing test in 3DMark 2000 shows a very large performance drop for the MX440, on the scale of about 40% (compared to the 10% or so of the Ti 4200). The extra 100MHz of memory speed on the Ti4200, combined with its newer architecture, could make a big difference there. The 3DMark 2001 scores are no surprise: GeForce 4 MX chips don't have the basic DirectX 8 functionality offered by the GeForce 3 series, and without being able to run the 'Nature' test the MX440 gets a much lower score.

GL Excess scores are a bit better; since the test doesn't depend on DX8 hardware, the high clock speeds of the GeForce 4 MX show through here. Not bad for what is essentially a GeForce 2 MX, but then again, NVIDIA's OpenGL support has traditionally been excellent.

NVIDIA's Chameleon Mark requires support for DirectX 8 shaders. The test did run on the MX440; however, no texturing was applied to the lizard model, so the scores weren't comparable.

Ah, Wolfenstein. I've been playing a lot of the multiplayer mode recently and enjoying it, and the benchmark demo I'm using for these tests runs in multiplayer mode. (The demo was created by and is available from 3D Center.) At 800x600 all the video cards are within a few frames of each other, showing that the video card is not the limiting factor there. Once you start turning up the resolution, though, the differences between the cards kick in: the MX440's lower memory clock speed hurts it pretty badly; the GeForce 3 performs better, probably thanks to higher memory speed and a newer chip architecture; and the Radeon, despite its very high memory clock speed, isn't as fast as the GeForce 4 Ti4200, which, through a combination of newer chip architecture, LMA-II, and high-speed memory, scores twice what the MX card does.

The numbers from Sam's second outing are pretty self-explanatory. With all the details maxed out, the MX440 has a hard time maintaining high framerates. With Anti-Aliasing enabled, the effects of slower memory and the GeForce 2 architecture kick in, and the framerate drops by over 30% (compared to the Ti4200, with its higher memory speed and newer architecture, which only drops about 15% with QQAA enabled).
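For clarity, here's how those Anti-Aliasing drop percentages are calculated. The framerates below are hypothetical placeholders for illustration, not the review's actual benchmark data:

```python
def aa_drop(fps_no_aa: float, fps_aa: float) -> float:
    """Percentage of framerate lost when Anti-Aliasing is enabled."""
    return (fps_no_aa - fps_aa) / fps_no_aa * 100

# Hypothetical example: a card falling from 60 fps to 40 fps with AA on
# loses a third of its performance -- a drop of "over 30%".
print(round(aa_drop(60, 40), 1))

# A card falling from 60 fps to 51 fps loses only 15%.
print(round(aa_drop(60, 51), 1))
```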

The Comanche 4 benchmark demo is an excellent test of video card capabilities. One nice touch: if a video card has hardware support for shaders, the demo will make use of them; if the hardware doesn't support those features, it falls back on multitexturing and other tricks to achieve similar effects in the game. The MX440 performs pretty nicely here, falling in between the GeForce 3 and the Radeon 8500. Interestingly, it outscores the Radeon 8500 at 1600x1200; perhaps that big framerate drop is a driver issue on the Radeon's side.

I will admit to my surprise. The GeForce 3-enhanced version of Giants: Citizen Kabuto, as I mentioned in the Ti4200 preview, uses DirectX 8 shader functions for some effects and has much higher polygon counts than the retail version. I fully expected the test not to run, but instead of displaying an error message and quitting, the benchmark played through fine (and the game runs nicely as well). None of the effects that use shaders worked, however: water, for example, was a flat-looking texture instead of the wavy effect you see on a DX8 card.

Anyway, as you can see, the MX440 scored slightly lower than the Radeon 8500 on all of these tests, with the exception of the highest resolution (the Radeon 8500 wouldn't complete the test at that setting; the MX440 did). Enabling Anti-Aliasing dropped the MX440 to the definite bottom, however.

eVGA has included PowerDVD XP playback software and a driver CD with a bunch of demos on it. There's also an actual manual (!), but otherwise the package is pretty devoid of extras. Speaking of DVD playback, the GF4 MX series includes NVIDIA's "VPE," or Video Processing Engine, for hardware-accelerated DVD playback. With hardware acceleration enabled in PowerDVD, however, DVDs wouldn't play, instead displaying an error message. NVIDIA announced their own DVD player program a while back, which presumably would work with this feature, but I guess it isn't available yet.

The only really neat included program is something eVGA now bundles with all of their video cards: "ADM," or Automated Driver Management. The program determines which motherboard chipset you have and, if necessary, installs the appropriate AGP drivers for it. Then, to quote eVGA:

ADM™ searches your computer for previously installed video drivers, removes them, and sets Windows back to standard VGA mode. This eliminates problems that can be caused by having older video drivers present when you upgrade. Our ADM™ then copies over the correct display drivers and configures Windows to accept the new display device - virtually eliminating any confusion about driver installation.

It's very useful if you don't like being bothered with driver installs and uninstalls, and it works pretty well, although you need to reboot a few times during the process.

A good question to ask is "how long will something like this stay useful?" Games available now already make use of shaders to some extent -- Comanche 4 obviously, and I understand both Wolfenstein and Medal of Honor use them in some manner if they're available -- so what happens when games start acting like the Codecreatures benchmark and require these functions? Simply put, the card won't work with titles that require those features. But at the same time, I'm not so sure the lifespan of a video card like this matters. Why? Because of the Radeon 8500 and the GeForce 4 Ti4200.

You might say, "But it's an unfair comparison, putting a GeForce 4 MX card up against a bunch of DirectX 8 cards," and as far as the technology is concerned you'd be right. But factor in the price and you'll see that for what you'd pay, the GeForce 4 MX 440 is not exactly a good deal. It sits right around the same price point as 64MB cards based on the Radeon 8500 or 8500LE chipset and, once they become available shortly, GeForce 4 Ti4200 cards. To put it briefly: for $20 more than you'd pay at retail for an MX440, you could get a Ti4200 card (in another week or two) which, according to my tests, not only supports the DirectX 8 hardware features but is more than twice as fast on some occasions, especially with Anti-Aliasing. You could get a Hercules FDX 8500LE, with the Radeon 8500 chipset, for only $10 more, and it will still run circles around an MX440.

With that in mind, it's really hard for me to recommend a video card based on the GeForce 4 MX chip. It doesn't matter how long it would last as a viable option, because you can buy better video cards -- which, by supporting more features, will have longer life spans -- for about the same price. Maybe if the prices on these cards were lower (much lower, like, say, half of what they currently are) it'd be a different story. Sadly, the prices are not lower, so the story does not change.