MSI GeForce GTS 250 Review

We’ve seen a revolution in graphics cards over the past couple of years. While ATI’s Radeon HD4000 cards seem to be the current darlings of the enthusiast market, it can’t be forgotten that NVIDIA started the current war with the GeForce 8800. The MSI GTS 250, at its core, is little more than a 9800 in a new dress. With an attractive price point, however, it’s still very much a valid option for gamers on a budget. Read on for our full review.

Specifications:

GPU: NVIDIA G92

Core clock: 738MHz

Memory: 1GB GDDR3 RAM @ 2200MHz

Digital output: 2 x DVI (supports HDMI with included adapter)

Analog output: VGA and composite video with included adapters

Power: 1 x 6-pin PCI Express

Warranty: 3 years limited parts and labor

The MSI N250GTS-2D1G has a suggested retail price of $159.99.

Build and Design

MSI’s take on NVIDIA’s GeForce GTS 250 results in a dual-slot card built on a bright-red circuit board. The card itself is sturdy and well-built, and the heatsink is firmly attached. You can pick the card up from just about any point of contact without worrying about snapping a part off. This may not sound surprising, but it’s nice to see after a negative experience involving the heatsink on an ATI HD4350.

The final product is an attractive video card, with its red board, gold-and-copper heatsink, and black fan. Granted, many people couldn’t care less how their video card looks, but if you’ve got an open system or a large window in the side of your case, consider it a bonus. Speaking of the heatsink, it’s quite large for what is, at its core, a midrange card. The combination of the large heatsink, the dual-slot construction, and the fan means the card runs fairly cool at low fan speeds. Gamers have long put up with loud fans and hot cards as the price of playing the newest games. Manufacturers are finally coming around to the fact that just because consumers want acceptable video performance doesn’t mean they want to listen to computers that sound like fighter jets.

The rear (front?) of the card shows off the MSI branding as well as two DVI-I ports. While the card only offers DVI on its own, MSI includes adapters to let you use whatever display you want, whether it takes DVI, HDMI, VGA, or even composite video. The latter might come in handy if you want to hook up an older standard-definition television.

One other thing worth mentioning is that you may see a big physical difference between the 512MB and 1GB versions of the GTS 250. For the 1GB variants, NVIDIA mandated board changes that are only optional for the 512MB cards. As a result of the new design, the 1GB GTS 250 is significantly shorter (9 inches vs. 10.5 inches) than the older 9800GTX+ or the 512MB GTS 250s that stick with the old layout. Moreover, many 512MB GTS 250s are simply rebadged 9800 cards, with zero changes aside from labels and firmware.

Performance

[Image: The Maw. Great indie fun, and less than ten bucks on Steam.]

Before we get into the nitty-gritty of numbers and charts, let’s take a closer look at the technology underlying the new GTS 250. As mentioned earlier, NVIDIA really started this revolution in graphics cards a couple of years ago with the 8800, which offered a very high performance-to-price ratio. There’s a reason the GeForce 8800 accounts for over 11% of the video cards reported by Steam users.

The original card came in 320MB and 640MB versions built on the G80 GPU, but was quickly updated with the G92 core. The G92 went on to star in several powerful 8800 variants, followed by the GeForce 9800 and now the GTS 250. NVIDIA is definitely getting its money’s worth out of this GPU. You might say, therefore, that the GTS 250 is really just a jazzed-up 9800GTX+, and you’d largely be right, especially for the 512MB versions. We covered the 1GB version’s differences in the design section, but they bear mentioning here, too. While the 9800 and the 512MB GTS 250 will likely perform almost identically, the 1GB GTS 250 should pull ahead at higher resolutions (we’re talking 2560x1600 and maybe 1920x1200 here) thanks to its extra 512MB of GDDR3.

Our test system for the following benchmarks is an AMD-based machine utilizing a Phenom II 940 running at 3GHz, 4GB of DDR2-1066 Kingston RAM, an ASUS M3A79-T Deluxe 790FX motherboard, and the MSI N250GTS-2D1G GeForce GTS 250 1GB video card. None of the components were overclocked for these tests.

3DMark Vantage results:

Category      | 3DMark Vantage score
Overall score | P8151
GPU subscore  | 6523
CPU subscore  | 32480

The GTS 250 also managed a score of 14,042 3DMarks in 3DMark06. While we didn’t have any of ATI’s Radeon HD4850 cards in house to test alongside the GTS 250, we can look at results from previous systems to get an idea of how the two cards might compare; the comparison is worth making since the HD4850 is the GTS 250’s obvious competitor in the marketplace. An Intel Core i7 920-based system with an HD4850 managed 13,085 3DMarks in 3DMark06 and a GPU subscore of 6407. The HD4850, however, came with only 512MB of GDDR3 RAM.
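For the curious, the 3DMark06 gap works out like this. The scores below come straight from the review, but keep in mind the two systems differ (Phenom II 940 vs. Core i7 920), so this is only a ballpark comparison, not an apples-to-apples GPU delta:

```python
# Rough comparison of the 3DMark06 results quoted above.
# Note: different test systems, so treat this as a ballpark only.
def percent_lead(score_a: float, score_b: float) -> float:
    """Return how far score_a leads score_b, in percent."""
    return (score_a / score_b - 1.0) * 100.0

gts250_3dmark06 = 14042   # MSI GTS 250 1GB (this review's test rig)
hd4850_3dmark06 = 13085   # HD4850 512MB in a separate, i7-based system

lead = percent_lead(gts250_3dmark06, hd4850_3dmark06)
print(f"GTS 250 leads by {lead:.1f}% overall")   # about a 7% lead
```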

Bioshock results:

Settings                  | Minimum framerate | Maximum framerate | Average framerate
1680x1050, settings maxed | 66 fps            | 192 fps           | 103.8 fps
1920x1200, settings maxed | 61 fps            | 145 fps           | 84.9 fps

Bioshock is definitely starting to show its age at this point, but given that it’s one of the most popular PC games of recent years, it still bears a mention. Thanks to that age, it runs like butter on the GTS 250, even at high resolutions with settings maxed. This card might not run every new game with every setting on high, but you can definitely expect to run your favorite older games at whatever resolution you want.
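A quick note on how numbers like the ones in these tables are typically produced: benchmark tools record per-frame render times and convert them to framerates. The frame times below are made up for illustration, but they show why we quote the minimum alongside the average; a single hitch barely dents the average while tanking the minimum:

```python
# Sketch of how min/max/average fps are derived from per-frame render
# times. The frame times here are hypothetical, not from our test runs.
def fps_stats(frame_times_ms):
    fps = [1000.0 / t for t in frame_times_ms]
    return min(fps), max(fps), sum(fps) / len(fps)

# A smooth run with one hitch: the average looks fine, the minimum doesn't.
times = [10.0, 10.0, 10.0, 10.0, 40.0]   # milliseconds per frame
lo, hi, avg = fps_stats(times)
print(f"min {lo:.0f} fps, max {hi:.0f} fps, avg {avg:.0f} fps")
# -> min 25 fps, max 100 fps, avg 85 fps
```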

Call of Duty: World at War results (4x AA/AF):

Settings                    | Minimum framerate | Maximum framerate | Average framerate
1680x1050, settings on auto | 40 fps            | 82 fps            | 62.6 fps
1920x1200, settings on auto | 38 fps            | 68 fps            | 53.6 fps

The fifth game in the Call of Duty series, Call of Duty: World at War has sold millions of copies in the four months since its release. The first-person WWII shooter offers players a grim but highly detailed view of a soldier’s war. The GTS 250 doesn’t have any problem rendering the game even at 1920x1200 (meaning it wouldn’t have any problem at the rapidly approaching 16:9 standard of 1920x1080, either). The minimum framerate dropped to 38 fps at 1920x1200, which is close to what we consider the minimum playable rate of 30 frames per second. Granted, this was with the detail settings on auto rather than maxed, so you may need to decide whether maximum detail, AA/AF, or a high resolution is most important to you when playing. (Update: originally this table listed settings with 0xAA and settings maxed. It has been corrected to reflect the settings under which the system was tested.)
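The 1920x1080 claim is easy to sanity-check with a little pixel arithmetic; the 16:9 resolution actually pushes 10% fewer pixels than the 1920x1200 we tested at:

```python
# Pixel-count comparison between our test resolution and 16:9 HDTV.
def pixels(width: int, height: int) -> int:
    return width * height

tested = pixels(1920, 1200)   # 2,304,000 pixels
hdtv   = pixels(1920, 1080)   # 2,073,600 pixels
savings = 100 * (1 - hdtv / tested)
print(f"1920x1080 is a {savings:.0f}% lighter pixel load")   # -> 10%
```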

Left 4 Dead results (8x AA/no AF):

Settings                  | Minimum framerate | Maximum framerate | Average framerate
1680x1050, settings maxed | 48 fps            | 135 fps           | 87.1 fps
1920x1200, settings maxed | 34 fps            | 103 fps           | 68.7 fps

Similar to Call of Duty, Left 4 Dead isn't much of an issue for the GTS 250, though at 1920x1200, the minimum framerate does get dangerously close to what we think is minimally playable. With the average framerates close to 70 and 90 frames per second for 1920x1200 and 1680x1050, respectively, the game plays very smoothly, so you probably won't have any issues unless you have a 30-inch display and want to game at 2560x1600.

Crysis v1.2 results (no AA/AF):

Settings                       | Minimum framerate | Maximum framerate | Average framerate
1920x1200, all settings medium | 18 fps            | 60 fps            | 39.5 fps
1680x1050, all settings high   | 17 fps            | 46 fps            | 29.1 fps
1920x1200, all settings high   | 14 fps            | 39 fps            | 26.4 fps

If you're planning on playing Crysis at max settings, look elsewhere; the GTS 250 simply isn't going to cut it. Even when you tone the details back to high, the average framerate drops below 30 frames per second, which really isn't satisfactory, especially since frame rates fall to 14 fps at 1920x1200 when things get hectic. 1680x1050 isn't much of an improvement, either, with an average still below 30 fps. It isn't until you drop the settings to medium that you can game effectively at higher resolutions, with an average framerate of 39.5 fps. Even then, you can expect some serious dips down to 18 frames per second in sections of the game, but overall the gameplay is smooth.
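Applying our rough 30 fps playability floor to the Crysis table above makes the verdict explicit; only the medium preset clears the bar:

```python
# The Crysis results from the table above, checked against the review's
# rule of thumb of roughly 30 fps average as the floor for playability.
results = [
    # (settings, minimum fps, average fps)
    ("1920x1200, medium", 18, 39.5),
    ("1680x1050, high",   17, 29.1),
    ("1920x1200, high",   14, 26.4),
]

PLAYABLE_AVG = 30.0
for settings, lo, avg in results:
    verdict = "playable" if avg >= PLAYABLE_AVG else "too slow"
    print(f"{settings}: avg {avg} fps, dips to {lo} fps -> {verdict}")
```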

Crysis Warhead results (no AA/AF):

Settings                           | Minimum framerate | Maximum framerate | Average framerate
1920x1200, all settings mainstream | 20 fps            | 55 fps            | 40.7 fps
1680x1050, all settings gamer      | 14 fps            | 44 fps            | 29.8 fps
1920x1200, all settings gamer      | 12 fps            | 35 fps            | 24.8 fps

Crysis Warhead fares almost exactly as Crysis did, with some results slightly lower and some slightly higher. The settings used were equivalent to those in Crysis; the developers saw fit to rename the medium settings "mainstream" in the follow-up, while the high settings became "gamer". The same advice applies: if you want to play this game on this card, you're going to have to sacrifice detail, and sometimes resolution, to do it.

Power, Heat and Noise

The TDP for this card is only 150 watts, which allows it to get away with a single 6-pin PCI Express power connector. Our system used around 135 watts at idle, and maxing out the graphics card on its own pushed power use up to 246 watts, which is completely respectable, and in fact laudable, considering the performance. Since it doesn't use excessive amounts of power, it doesn't generate much heat, either: the card idled at 32°C, and after 20 minutes under heavy load it only crept up to 61°C. And because it doesn't run hot, the fans don't have to spin up as loudly as on some recent cards. It's almost a relief to be able to play a few games without listening to the fans on a hot card drone loudly enough to be a bother.
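A quick back-of-the-envelope check on those power figures: subtracting idle from load draw gives a rough upper bound on what the card itself pulled, and it lands comfortably inside both the 150W TDP and the power budget available to the card (keep in mind these are wall measurements, so PSU losses mean the real number is a bit lower still):

```python
# Sanity check on the review's power measurements (watts at the wall).
IDLE_W = 135   # whole system at idle
LOAD_W = 246   # whole system with the GPU maxed out
TDP_W  = 150   # NVIDIA's rating for the GTS 250

card_delta = LOAD_W - IDLE_W   # 111 W swing attributable to the GPU load

# One 6-pin PCIe connector (75 W) plus the x16 slot (75 W) covers the TDP.
budget = 75 + 75
print(f"load swing {card_delta} W, within the {TDP_W} W TDP "
      f"and {budget} W of available power")
```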

Conclusion

NVIDIA's G92 GPU could fairly be called the chip that wouldn't die; it just keeps coming back, and with results like these, you can't really fault NVIDIA for that, especially since it holds its own against the competition's Radeon HD4850. At $160 it does seem a little pricey, though stores are already offering deals: Amazon is shipping the overclocked 512MB card for $143, while Newegg offers the 1GB card at $160 with a $15 mail-in rebate as well as a full copy of Call of Duty: World at War.
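For bargain hunters, the effective prices of those two deals line up like this; just remember that mail-in rebates only pay off if you actually send them in, so the Newegg figure is a best case:

```python
# Effective prices of the deals mentioned above, in US dollars.
amazon_512mb = 143.00          # overclocked 512MB card, shipped
newegg_1gb   = 160.00 - 15.00  # 1GB card after the $15 mail-in rebate

print(f"Amazon 512MB: ${amazon_512mb:.2f}, "
      f"Newegg 1GB after rebate: ${newegg_1gb:.2f}")
```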

Overall, the GTS 250 delivered acceptable performance at high resolutions, but you're going to be making trade-offs; you won't get the highest details and settings at very high resolutions in the latest games. You will get decent results, however, without putting too much strain on your power bill or your wallet, and you'll be able to avoid the really loud fans found on some new cards. In the end, MSI's 1GB GeForce GTS 250 offers good, but not great, performance at an acceptable price point. While the HD4850 has been the star lately, NVIDIA is showing it still has teeth, and if you find a good deal, the GTS 250 might be your next card.