Posted
by
samzenpus
on Friday January 25, 2013 @12:34AM
from the making-the-best-of-it dept.

MojoKid writes "New video card launches from AMD and NVIDIA are almost always reviewed on hardware less than 12 months old. That's not an arbitrary decision — it helps reviewers make certain that GPU performance isn't held back by older CPUs, and it can be particularly important when evaluating the impact of new interfaces or bus designs. That said, an equally interesting perspective is to compare the performance impact of upgrading the graphics card in an older system that doesn't have access to the substantial performance gains of integrated memory controllers, high-speed DDR3 memory, deep multithreading, or internal serial links. As it turns out, even with a midrange graphics card like a GeForce GTX 660, substantial gains of up to 150 percent can be achieved without a complete system overhaul."

AGP bridges suck.
PCI-E DDR2 rigs aren't even that old, nor are they considered "obsolete".

You obviously didn't read the article. They tested whether there was any benefit to upgrading the graphics card, and the figures show that there is. They didn't use an AGP motherboard. And it doesn't matter whether you call the system old or not, because the topic was whether you could improve a 5 year old system.

The answer is yes. It doesn't matter what your theory says, because in practice you can extend the life of an old system with a single hardware upgrade.

The thing is, most serious gamers willing to plunk down $400 for a video card aren't going to skimp on upgrading the rest of the computer. That's why nobody reviews it: Because you, McThrifty, aren't the target market and nobody's going to send you free hardware to test since your readers are, well... cheap.

Most of those hardware reviews you see online get the newest video cards for free specifically because their reviews are tailored to the guy who has a McDuck-sized vault of cash ready to be spent getting that extra 0.8 FPS out of Crysis.

The thing is, most serious gamers willing to plunk down $400 for a video card aren't going to skimp on upgrading the rest of the computer.

And a GTX 660 is not a $400 card, it's more like $200.

The real issue is that most games are designed to run on consoles with their ultra-crappy CPUs, so they do very little on the CPU even on a PC. I've rarely seen my i7 go over 20% CPU usage in any game I've played in Windows with the CPU monitor running.

Maybe they do, maybe they don't. 1000W power supplies have been available for a very long time.

My desktop system has a 750W Cooler Master power supply at the moment, and I'm using maybe half of its capacity under load. That's for an i5 2500K overclocked to 4.8GHz, a Hyper 212+ heatsink, 16GB of RAM (4x4GB), a Radeon 6970 graphics card, a DVD burner, a 60GB Intel 520 SSD as cache, and a 3TB mechanical drive, on a Z68 motherboard. I could buy a video card with a 500W draw and still have some juice left over.
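As a sanity check on that "maybe half" figure, here is a back-of-envelope tally. Every wattage below is a ballpark assumption (roughly the published TDPs plus some overclocking headroom), not a measurement:

```python
# Rough load estimate for the build above. All figures are ballpark
# assumptions, not measured draws.
components = {
    "i5 2500K overclocked to 4.8GHz": 140,
    "Radeon 6970 under load": 250,
    "16GB RAM (4x4GB)": 12,
    "SSD + 3TB HDD + DVD burner": 25,
    "motherboard, fans, misc": 50,
}
psu_capacity = 750  # watts

total_draw = sum(components.values())
print(f"Estimated load: {total_draw}W of {psu_capacity}W "
      f"({total_draw / psu_capacity:.0%})")
```

With these guesses the system lands somewhere in the 50-65% range under load, broadly in line with the comment's estimate, with real headroom left for a bigger card.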

Hardly. Gamers rarely stress even a 400W supply, and even 5 years ago 500-550W PSUs were the most common to be had. In fact, in the last 5 years, the only reason I needed to upgrade my PSU was that mine lacked the power connectors for my HD 6950. That is, mostly bad luck.

To be fair, at 25 years old and with over 200 games bought on Steam, I think I fit the target market for PC games pretty squarely, and I just upgraded the 8800 GTS in my roughly six-year-old computer to a GTX 550 Ti. I went from needing to run at medium/low settings at 1080p to being able to run just about everything maxed out at 1920x1200, for about $120.

To be fair, at 25 years old and with over 200 games bought on Steam, I think I fit the target market for PC games pretty squarely

You're in the target market for PC games, but with a six-year-old computer, you're probably not in the target market for high-performance PC hardware. There's a great deal of overlap between the two, but they are not identical sets.

After five years, it's really not that expensive to upgrade a lot of stuff if your mobo was cutting edge at the time you bought it. Memory is now cheap and very useful - no slowdowns with lots of tabs and VMs open. A new CPU is probably 4 times faster than your old one, and also inexpensive. It won't be top of the line, but it may hold its own with a mid-range solution. And you've probably already bought an SSD for a boot drive because it was stupid not to.

Is Ric Romero posting stuff to Slashdot?
"Upgrading the largest bottleneck for game performance can substantially improve your playing experience!"
Whether or not it's worth doing is another matter, but anyone who's built their own computer or even reads websites like tom's hardware or benchmarking sites knows this.

The question was whether it improved things enough to be a viable upgrade. And the answer is yes, assuming the CPU in the system is a quad core or better. Dual core, no. Luckily, people largely stopped buying those years ago (the E8400 era).

That's because the 8800 GT was only marginally slower than the fastest card you could buy when it was released (the 8800 Ultra).

The 8800 GT falls flat at higher resolutions with complex pixel shaders, where a comparable new card like the GTX 650 (for $100) would not, but it is just as weak at actual texture lookups. The upshot is that for older games that heavily abuse multi-texturing, the two cards are almost exactly equal, but for newer games that just go ahead and compute stuff every frame, the newer card pulls well ahead.

But the fastest contemporary ATI card isn't still usable. Compare the Radeon X1950 XT, released just weeks later - roughly the same caliber of performance at launch, but the 8800 still supports almost every game, while the X1950 will flat-out refuse to run stuff that's too new. I know - I have an X1900 XT.

The main secret behind the 8800's longevity is that it was the first "modern" graphics card, which ironically enough means it doesn't, at the hardware level, do "graphics". It's all shader cores doing rendering.

That doesn't really do anything to support your point. Any bias doesn't change the fact that an i3 will keep up with a Bulldozer in many games. It's only when something can make use of more than two cores that you'll see a difference.

AMD used to own the low-mid range, but Intel came out of their slump and have been delivering great price/performance parts.

Some games hit the CPU much heavier these days than they used to. Many games really don't perform well if they aren't given multi-core CPUs with reasonable speed.

So how much difference upgrading a given component makes depends on what else you have in your computer. If your system has a CPU that was top of the line 5 years ago but an integrated GPU, then yeah, a new GPU will probably be the best use of money. However, if the CPU is underpowered, then a new GPU will do little if anything.

Back in the P3 days I recommended a discrete GPU to everyone because the integrated ones were that bad. Now with Sandy/Ivy Bridge they are quite good. You can game on them, even new games. No they don't do as well as a discrete GPU, but they really are more powerful than you might think.

Hmmm, my research from a few months ago suggested otherwise, at least to some extent. My home desktop is from 2008; it had a GeForce 8800 GTS in it which unfortunately decided to go kaput. The timing was kind of bad because

I've gone through many of the same thought processes as you, and come to many of the same conclusions.

Here's what I've gleaned:

1. A five-year-old video card (or a pair of them) should be trivially-cheap to replace with an efficient and modern equivalent, but it's not.

2. The prettiest games I want to play today bog my Q6600 CPU more than my video cards, which just loaf along on such titles.

3. I need more RAM. 4GB isn't enough and DDR2 is fucking expensive. A motherboard+CPU sidegrade is damn near free with 2x4GB DDR3, compared to 2x4GB of DDR2 by itself. And getting a significantly faster CPU at the same time isn't significantly more expensive.

4. Integrated graphics, no matter the claims by people who say they're quite good enough, suck in comparison to even quite old dedicated hardware.

Did you do this upgrade yet? It would be trivial to run off integrated graphics, and then test again with the GPU(s). Since you were running SLI, you should get better performance. But it would be interesting to compare a single 9800GT to a 4000HD.

Some games hit the CPU much heavier these days than they used to. Many games really don't perform well if they aren't given multi-core CPUs with reasonable speed.

One thing to bear in mind with gaming benchmarks - they are performed running just the game, to keep everything else equal. In real-world use it's nice to have the flexibility not to have to close down your browser and other applications, especially if you aren't the only user logged into the system. For that reason, you want more cores than you need.

EVERY component is relevant to gaming performance: HDD/SSD, RAM, CPU, and GPU are all important, especially with some of the latest games. And you need to get enough juice from the power supply (without immediately killing it), you have to be able to keep it all cool, you want a motherboard that isn't itself a bottleneck or otherwise a hindrance, and of course you don't want to watch the action on a 17" CRT. So while I wouldn't recommend relying on integrated graphics or a $50 card, you can't forget about a balanced build either.

I have a computer several years old I upgraded to SSD because the mechanical drives were failing. I saw a significant improvement in gaming after the SSD swap: with FPS games, previously I had to turn most of the visual effects off because the video was rather choppy. Now they're all left on and the game still runs just fine.

SSD's don't cure everything, nor do they speed everything up (some stuff takes just as long, because it's set to take X amount of time anyway). But for a lot of things the instant-re

Not sure about anyone else, but on my Ubuntu systems the SSD made a significant difference to more than just startup times. Web browsing, for example, is much snappier; I'm guessing this is due to the drive being more readily capable of writing image/web page data and fetching from the disk cache for rendering, etc.

Any "runtime" performance that relies heavily on disk-based caches will see a benefit here. I've used SSDs for I/O-intensive applications such as running a Solr/Lucene search engine; the improvement there is also significant.

Perhaps it depends on the game. Counter Strike gives an advantage to those that load the map the quickest. Being able to get to the bottom of the ramp in Dust2 to counter snipe the inevitable sniper is huge. Just sayin.

A newer/better GPU can indeed improve the graphics and gaming performance of an older computer, but it won't make it perform like a newer machine with other superior hardware. Duh.

It seems like perfect common sense, but obviously not everyone gets it, so I'll state it like this: if you took a shiny 2012 BMW V8 engine and plopped it into your rusty 1982 BMW 733i, your car would be faster and more fuel efficient (assuming you could even mount the new motor and get everything hooked up), but it wouldn't automatically perform like a new car.

They paired a very low-end AMD CPU with the best GPU on the market at the time. Result: the CPU does affect performance. No surprises there.

The quoted test was probably limited more by the PCIe slot running at x4 instead of x16 than by the low-spec CPU. If you can't get data to the graphics card quickly enough, then even the fastest CPU will be hampered.
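As a rough illustration of why lane count matters: PCIe bandwidth scales linearly with lanes, so an x4 link moves a quarter of what x16 does. A back-of-envelope sketch (the 250 MB/s per-lane figure is the approximate PCIe 1.x rate; later generations are faster per lane):

```python
# Approximate one-direction PCIe bandwidth, scaling linearly with lanes.
MB_PER_LANE = 250  # rough PCIe 1.x effective throughput per lane

def pcie_bandwidth(lanes, mb_per_lane=MB_PER_LANE):
    """Approximate one-direction link bandwidth in MB/s."""
    return lanes * mb_per_lane

x16 = pcie_bandwidth(16)
x4 = pcie_bandwidth(4)
print(f"x16: {x16} MB/s, x4: {x4} MB/s, ratio: {x16 // x4}x")
```

Whether that 4x bandwidth gap actually bottlenecks a given game depends on how much texture and geometry traffic crosses the bus each frame.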

If card A has a performance of x (which I'll define as 1) and card B a performance of x+2, wouldn't that mean it's two times better?

The article keeps saying three times better, but wouldn't the correct way to phrase that be "It's three times as good?"

Similar things with percentages. If something has 200% the value of something else, it's twice as valuable and not two times more valuable, right?

I notice similar things in German, which is my main language. Am I just a grammar Nazi (badum-tis) or does that bother you too?

Yeah. When you talk about how much BETTER something is than something else, you are describing the difference between the two things. So if card A has performance 1 and card B has performance 3, the difference is 2, and card B is 2 better (or, in this case, "two times the performance of A" better). On the other hand, when you say something is "x times as good", you are comparing the whole values, not just the difference, so here B would be 3 times as good.
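The distinction reduces to ratio-of-totals versus ratio-of-differences. A tiny sketch with hypothetical performance numbers:

```python
# "N times as good" compares the totals; "N times better" (read
# strictly) compares the difference to the baseline. They differ by 1.
def times_as_good(baseline, other):
    return other / baseline

def times_better(baseline, other):
    return (other - baseline) / baseline

a, b = 1, 3
print(times_as_good(a, b))  # 3.0 -> "three times as good"
print(times_better(a, b))   # 2.0 -> "two times (200%) better"
```

So a card with 200% the value of another is twice as valuable, but only 100% (one time) more valuable.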

I'm running a new-ish HD5970 card on a five-year-old Intel Core 2 Duo, 2.66GHz. Over those years I've also added an SSD (boot + apps), some extra hard drives, and an extra monitor. The machine is very reliable and quick enough that I really don't need to upgrade. Although I definitely will upgrade this year; five years is really old for a PC.

But I can emphatically say "no". Excuse my grammar. I have a P4 HT clocking around 3.4GHz. I would overclock it if the mobo weren't an HP OEM and what-have-you. I also have a 1GB Nvidia card. My point is that my computer meets the minimum specs for pretty much every modern game except for the CPU. You simply can't run a newer game on a single-core machine without serious gameplay consequences. If I had even a Core 2 Duo, the rest of my setup would beat the shit out of games like Crysis. So, once again: a new GPU won't save you if your CPU can't keep up.

I had an Intel Q6600 system (quad 2.4GHz cores), and it wasn't able to keep up with some new updates in games my son wanted to play. Bought a new GPU, and now I can play what he wanted to play (WoW) at maximum settings, no problem. Your mileage may vary, but it worked for me.

I think one of THE biggest bottlenecks for any computer is insufficient amount of RAM in the computer.

That's why I've always suggested that if you can afford it, you install the maximum RAM allowed by the motherboard. Most motherboards that support CPUs with x86-64 instructions can take 8 GB of RAM, and with 8 GB the performance improvement can be quite high, since 1) you no longer need to use the hard disk as virtual memory and 2) programs have more "breathing room" to run.

Second, the OS and apps obviously need RAM, but to base your opinion on the vast improvement between 512 Meg and 2,048 Meg on an XP box is kinda pointless. Two Gigs of RAM was the sweet spot for XP and typical desktop use. Four gigs made the machine more responsive, but the cost typically didn't justify upgrade to 4 Gigs.

Windows 7 really runs well with 4 Gigs, and when the next 4 Gigs only costs another $20, why not go to 8 Gigs - but that's on a modern desktop.

My gaming machine is actually 5 years old - built in 2008, it originally housed a GeForce 9800 GTX. The CPU is an Intel quad-core Q9300 - pretty low end at the time - plus 4GB of DDR2 RAM - very old, very out of date.

On that machine, I could play the likes of BF3 on low settings reasonably well. I swapped the graphics card for a Geforce 560 Ti and now I can play BF3 on med/high at 1920x1200. Nothing else has changed, same old CPU, same DDR2 RAM.

This varies with the CPU in use. If the machine is not CPU-bound, then a new GPU will work. If it's already CPU-bound in some games, then while I don't doubt you'll get some improvement, you're already at the limit.

A new CPU and motherboard is often cheaper than the GPU upgrade, so it's something you could factor in later. Call it your personal tick/tock in line with your gaming. :)

I just went through this very decision process, but for a desktop machine, not a gaming system. I picked up a Dell Optiplex 755 with a decent Core 2 Duo CPU at a surplus sale. I upped the RAM to 8 Gigs and was quite happy, but then I started thinking about the graphics subsystem. The box had integrated Intel graphics, and that left something to be desired under Windows 7. So one quick trip to the local computer store later, and for less than $40, I dropped an HD 6450 in it and am quite pleased. The system now su

About a year ago I stuck a GTX 550 Ti in a machine that was at the time pushing five years old.

I generally upgrade video cards at least twice after the initial build of my computers, every 2 years or so. My needs for upgrading other components are generally low, because...really...who needs a top of the line processor? I generally stick to the top of the mid tier and it does anything I might need done for the next 5-6 years. As far as RAM goes, whenever I get a new motherboard I just put as much RAM as it supports in it, and have been known to spend more on RAM than CPU when building a computer.

I just recently rebuilt my computer (new motherboard, CPU, RAM, and a second GPU) for about $550, and that got it to a point where it can play Crysis 2 with max settings. I expect it will be able to play any game the makers throw at it for another two years before performance starts to become a real issue. Maybe longer, because it seems to me that game-makers are getting better at building games that still run (albeit less prettily) on older hardware.

If it hadn't been for some recent hardware failures I'd probably STILL be rocking the last machine, which would be over 6 years old now. I just didn't feel like throwing money down the drain buying a replacement motherboard that used an old-ass socket.

I think the only reason to buy absolute top-of-the-line hardware these days is to stroke your e-peen.

I'm using an Intel i5-2400/16GB RAM with two GeForce 9800GTs for dual-head. Older tech, but only one year old to me. Why? For a bargain price, it works great for me and was a significant improvement over the AMD Athlon XP-3000/Geforce 6200 I was using.

Latest/Greatest hardware is nice, but expensive. I also tend to play older games like Quake, Unreal, COD, MOH which run great on more modern hardware.

I threw a GTS 450 into my Socket 939 board with an AMD Toledo Athlon X2 and 3GB of dual-channel, low-timing DDR1. It was faster than my GeForce 8600, but not by much. I brought the 450 over to my new i5-2400 system and it was like night and day. This thing tore my games a new ass framerate-wise. It would seem the x16 PCI-E slot was holding it back on my old board compared to the new x16 2.0 or 2.1 slot or whatever. Plus, the PCI-E controller is in the i5 itself, if I'm not mistaken. So as long as your board has a reasonably modern PCI-E slot, you should be fine.

You can play almost everything from last year with quality visuals with an old CPU teamed with a new GPU. But here are the tricks:

* You need at least 2GB of memory. If you don't have this, don't even try.
* The CPU must be at least dual-core. Single-core CPUs don't work anymore (I tried both on the same machine, and the difference is night and day; it just happened that I could get a compatible dual-core CPU for free, otherwise it would have been impractical). A single-core CPU prevents decent performance even with a top-notch GPU.
* Upgrade the HDD to an SSD. The old drive that came with your 10-year-old rig will slow everything down. This is the second most beneficial upgrade after the video card.

I upgraded my Q6600 because I couldn't buy a motherboard for it when my old one died and I was in a financial position to overhaul completely. The whole thing lasted me around 6 years, including 2 graphics card upgrades, and it was still a capable machine when it died. The difference between my old PC and the new one is staggering, but that doesn't mean the old one wasn't enough. Hopefully I'll get the same use out of this new one. FWIW, you're not missing out much with graphics a little lower than top. Not much at all.

I had a Q6600 as well before my latest upgrade. I had SLI'd 9800 GTXs that I upgraded to a GTX 480 and saw some improvements. But about a year later, going to a Sandy Bridge i7 at 4.4GHz, I saw framerates double across the board using the same single GTX 480.

A new GPU will improve an older system, but a new CPU and a DDR2->DDR3 move later gave a further performance boost.

I'm also from the Q6600/8800 GTS crowd. I built my work computers around an IB i7 and cutting edge GPUs, but my home computer still uses the venerable C2 Quad with no problem.
I think the upshot is that recent computing power is and has been cheap and plentiful. A machine from six years ago is not challenged by everyday computing, only the most demanding eye candy at optional settings in some games. The hardware capabilities have outstripped software requirements.

I'm still using a Core 2 Duo E6300 from 2006 with 4GB of RAM. Until recently I was using a 9800 GTX, which, yeah, was fine as long as you didn't turn everything up to full. I recently traded up to an Nvidia GTX 570, which was passed down from another, more recent machine. Quite a nice improvement. The article is right that if you're going to upgrade anything on an old machine, the graphics card is probably it. Midrange now (i.e. ~$200) is usually a substantial improvement. On the other hand, I thought t

This is pretty much what I was using before I did a full overhaul over the past few months. Between replacing the CPU+RAM+mobo, graphics card, and primary hard drive, by far the biggest improvement was replacing my old hard drive with an SSD. The games already ran smoothly on the old hardware at medium-high settings, so the upgraded processor and graphics card really only let me notch the settings back up to max, ultimately resulting in the same frames per second. But the quick boot/wake and fast level loading made a tremendous difference. Even the split seconds saved in regular desktop use changed the user experience dramatically.

I had a comparable processor, which I bought at Christmas 2009. However, some new games such as MechWarrior Online and Planetside 2 are heavily CPU-bound, and the machine was lacking when running them. I upgraded to an i7-3770K and the improvement was dramatic (30-40 -> 60fps for MWO and 40-50 -> 90fps for Planetside). The graphics card did not change, as it was already rather powerful (Radeon 6970) and not a bottleneck at the detail levels I was using.

This was literally the difference between unplayable and playable, so if you play those games, there absolutely is a reason to upgrade.

That's true, especially of games that are not properly threaded to take advantage of multicore and manycore configurations. But there's diminishing returns... most games that are CPU bound don't really benefit from more than about 3GHz. Games that are properly threaded will benefit more from an extra execution thread (hyperthreading or an extra core) than they will from a faster clock speed.
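The diminishing-returns point is essentially Amdahl's law: the serial portion of a frame caps what extra cores can buy you. A minimal sketch (the 60% parallel fraction is a made-up illustrative number, not a measurement of any real game):

```python
# Amdahl's law: speedup from N cores is bounded by the serial fraction
# of the workload, so core counts hit diminishing returns quickly.
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup from `cores` cores (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A hypothetical game whose frame work is 60% parallelizable:
for cores in (1, 2, 4, 8):
    print(f"{cores} cores -> {amdahl_speedup(0.6, cores):.2f}x")
# Gains flatten fast: going from 4 to 8 cores buys far less than 1 to 2.
```

A faster clock, by contrast, speeds up both the serial and parallel portions, which is why clock speed still matters for poorly threaded games.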

Case in point, my gaming machine is a Core i5 2500k, with 16GB of RAM, and a Radeon 6970 graphics card. When I overclo

I play Mechwarrior Online with a Phenom II x4 965 which is ~ first gen i5 of similar speed and it works fine with a Radeon HD 7850 factory OCed video card. Perhaps something else was throwing undue strain on your CPU... I believe the 7850 is about the same performance wise, as the 6950/6970.

I have a Core 2 Duo E6750, overclocked to 3.2GHz (it does 3.46GHz, but I got a couple blue screens and a kernel panic a few months ago, so I tuned it down), matched with a GTX 260. The machine was bought in November 2007, the GPU in early 2009 IIRC, I have not changed anything since then (just bought some external HDDs), and games work great. Those games which don't support windowed mode (maximized) I play at 1650x1080 windowed; everything else becomes windowed maximized. I've grown to dislike full-screen mode.

Pretty much the same here... I got the Dell XPS 730x at the end of 2008 or beginning of 2009 (there was a 25% off coupon code, probably because Dell was phasing out the XPS gaming line; I got the $300 Logitech 5.1 sound system too, as it also received the 25% off treatment and was THX/Dolby certified, so it works great as a home theatre system now that my computer is hooked to a 27" LED and a 46" LCD TV) with the i7-920, 6GB DDR3, and a Radeon HD 4850... I've since upgraded the RAM to 12GB (the maximum the board supports).

This doesn't surprise me one bit... the GPU does most of the heavy lifting anyway when it comes to games.

Still, an i7 will show you substantial performance enhancements.

It's a bit more nuanced than that: certain upgrades lean almost entirely on the GPU(say you get a fancy new monitor and want Game X to look good on a 1920x1080 or 2560x1440 instead of a 1280x1024); but you can run into situations where no CPU is really enough CPU(RTS pathfinding in games that permit a lot of units is a particularly hairy case. Supreme Commander, say, can merrily chug along at 60fps with a screen full of units cranking out idle animations; but a few hundred bots scrambling to navigate can bring your CPU to its knees.) It's certainly a less common issue than an inadequate GPU; but it can happen.

Or late game in Sins of a Solar Empire, where my GPU is sometimes almost idling because the one CPU core it's using is at 100%. I would have thought running hundreds of AI subroutines would be easy to multithread. Makes me want to get an i5 to replace my AMD Athlon II, which was mid-low end when I bought it 3 years ago.

You are absolutely correct. I made this call a couple years ago when I got the 6750 video card. It's a fine card, but my CPU was an older-model dual-core AMD. Lately I picked up Planetside 2 and started playing it like mad. What I found was that the game looked great but was laggy as all get out, because the developers of that game rely heavily on the CPU, probably more heavily than they should. After biting the bullet and building a new system with an i5-3570K overclocked to 4.3GHz, with a 200GB SSD main drive, a

I have an i5 with an SSD and an i7 with a traditional HDD; both have very similar GPUs, and both run about the same for demanding video games (World of Tanks at highest quality, to be specific). Prior to installing the SSD, the i5 was clearly an inferior setup and could not cope with the game without setting the quality slightly above "total crap". I've had that setup for about a year, though I also had the SSD replaced under warranty after about 6 months of use; overall I think it's been well worth the dollars and time.

I've got a Q6600 at 3GHz with 6GB of RAM. I've recently been eyeing SSDs - the only downfall is running at SATA II speeds, but the random access should work out nicely.

My biggest issue is that I can't expand the memory much more without a much higher cost. DDR2 is expensive - almost twice the price of its DDR3 counterparts. I recently rebuilt my file server from a lowly Athlon XP something-or-other to an AMD A4 - man, what a difference there. But I don't think I'll see too much improvement yet other than on

Frankly, games even to this day use so little CPU that an i7 for gaming is honestly overkill. You can take any $70 Athlon triple-core and have a great time gaming on it, as long as your GPU has a little muscle. Oh, and this was one of the nice things about Socket AM3 lasting so long on the AMD side: you could buy a dirt-cheap dual in '08 and have upgraded to a quad or hexacore by now for very little, without having to toss your board and RAM.

If I were building a gaming PC today on a budget I'd probably go for th

My old spare machine - an AMD X2 at 2.4GHz, 4 gigs of DDR2, and a GeForce 9600 GT - can run most modern games at mid-high to high quality at 1280x1024, and that's coming up on 6 years old... cost like 300 bucks in parts new.

Most newer cards with the desirable features consume a lot of electricity - at this point, seemingly as much as a refrigerator. They also generate an excessive amount of heat. Before purchasing, make sure your power supply is up to the task or you will be in for some interesting side effects.

When playing games, my i7 + GTX 660 system takes a whole 200W at the wall. My Pentium-4 with Nvidia 7800 used to take more like 350W.

I dispute that claim. I've worked in middleware, where we put insane amounts of effort into utilising multiple cores (kinda required for the PS3/360), and pretty much all 3rd-party middleware is now happy to run across multiple cores. The more middleware a game uses, the more likely it is to make *relatively* good use of the cores (certainly much more so than most software products). If you're targeting a game at iOS, then assuming it isn't some tedious Zynga-style freemium game, you've probably put a fair amount of effort into threading there too.

You're kidding right? My brother is always complaining about having to pay various fees, as he says "To even turn my Xbox on!" sometimes. That and $70 games are enough to put a person in the poor house. The article is correct in saying a new GPU will boost performance on an older system. But don't buy a super high end card for it, eventually the CPU will become too much of a bottleneck.

I think I'll stick to my PC with Xubuntu (I actually BUY games I like). Oh noesss, I mentioned Linux, will I be modded as a troll?

My problem is I use my xbox for primarily single player offline games, and probably once every two weeks fire it up for a 30 minute session. Pretty much every time I turn it on, there's an xbox update that HAS to run. And that makes me sad.

I get it, they need updates. But I would love to have a console that checked once a day or once a week for updates and just silently did them in the background. Similarly, I can "purchase" demos or full games on the Xbox Live site, but I have to turn on my console to start the download.

This is not a valid statement and has never been. Entirely different games are marketed for PC and consoles, and if you want to play all the games you enjoy, eventually you will be forced to own both a fast gaming PC and a console.

No need to use a VM. Just use compatibility mode in Windows 7 or 8. 8 does some cool stuff with retro games (it automatically sets the correct settings for many games). I'm currently playing Black & White as well as Neverwinter Nights... both Win 95/98-era games. No issues. I run lots of retro games.

Also. If you're trying to get 3d to work in a VM, you need to use drivers supplied by the VM vendor in the client and enable the relevant settings on the host.

Back in the P4 days, PC Magazine ran an article on the best bang for the buck with a $200 upgrade. (At the time, $200 was the going price for a current mainstream CPU, 4 Gigs of RAM, or a solid-performing graphics card.) Testing all the various options on a 3-year-old computer showed that the money was best spent on a graphics card upgrade.

What they found was that in Windows, for a typical desktop user, everything effectively went through the video system, and a slow graphics card could hamper the fastest CPU.