Posted by ScuttleMonkey on Monday November 07, 2005 @01:42PM from the bargain-basement-with-a-twist dept.

mikemuch writes "Today Nvidia unveiled a new low-cost, high-power graphics processor SKU. ExtremeTech's Jason Cross has done all the benchmarking, and concludes, 'This makes for an impressive bargain and a huge step up from the generic GeForce 6800. The big question: How will this fare against ATI's similarly priced X1000 series card, the Radeon X1600 XT?'"

Yeah, ExtremeTech is, after all, a big tech publisher's attempt at a tech-enthusiast site. If you are in the $250-$300 range, then you should spend an extra $33 and go with this eVGA 7800GT. [dealtime.com] It is worth the extra chunk of change. Not only will it be much faster than the cards that ExtremeTech recommends, but it also uses less power than the 6800GT, and therefore puts off less heat. That is a no-brainer in my book.

I would get a 7800GT, but the problem is the lack of a PCI Express slot on my motherboard. I am not going to upgrade my motherboard any time soon, but I do need to upgrade my video card for the latest and upcoming games. :( I am stuck with the 6800 series.

I think this shows everything that's wrong with the tech review industry. They "adver-review" cards pretty much only for kids to drool over and feel bad about their existing card that works just fine on pretty much every game they play, and for "enthusiasts" (i.e., one born every minute).

Instead of working like a Consumer Reports-type site, so that if I want to buy a good graphics card for my ~$700-1100 computer (not my 4-grand Alienware) I could find current advice, they leave me digging through archaic reviews from a few years ago with test results on old drivers.

Wow, this just in: a $700 dual-SLI card can play games at resolutions larger than my monitor can handle, at colour depths the human eye can't discern, at a framerate so fast the human eye doesn't pick it up, on a game that probably wasn't made to take advantage of the card, and with an actual visual performance increase I can barely notice. But the good news is I smoke 'em when I run a benchmark utility.

Yeah, no kidding. I am *very* used to running games at moderate detail and with all of the AA/AF turned off, as I have Jurassic-era equipment by gamer standards (P4-M 2.2 GHz running an AGP 4x Radeon 9000 64MB). It does not make that big of a difference to me anyway, as it looks nice, but in an FPS game, do you just stand there admiring the scenery? No! You run around and shoot the bad guys.

And I laugh any time I see people doing CPU framerate comparisons at 640x480 or 800x600 with everything dialed down.

TV is 50-60 fps (though it is interlaced). The other advantage TV and movies have to help avoid flicker is the motion blur that they get for free. Video cards render each frame as if everything were frozen in place and a picture were taken.
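For the curious, here is a minimal Python sketch of that difference: a film camera integrates light over the whole exposure, while a renderer samples one frozen instant, so averaging several sub-frame renders approximates film-style motion blur. The `render` function is an illustrative stand-in, not a real rasterizer.

```python
import numpy as np

def render(t):
    # Stand-in renderer: a bright one-pixel bar whose x position moves with time t.
    frame = np.zeros((4, 16))
    frame[:, int(t * 15) % 16] = 1.0
    return frame

def motion_blurred(t0, t1, subsamples=8):
    # Integrate the scene over the "shutter" interval [t0, t1],
    # the way film exposed for 1/48 s does automatically.
    return np.mean([render(t) for t in np.linspace(t0, t1, subsamples)], axis=0)

crisp = render(0.5)                 # frozen instant: hard edges, strobing motion
blurred = motion_blurred(0.5, 0.6)  # energy smeared along the motion path
```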

Wow, this just in: a $700 dual-SLI card can play games at resolutions larger than my monitor can handle, at colour depths the human eye can't discern, at a framerate so fast the human eye doesn't pick it up, on a game that probably wasn't made to take advantage of the card, and with an actual visual performance increase I can barely notice. But the good news is I smoke 'em when I run a benchmark utility.

You can notice 4xAA and 8xAF turned on, both visually and in the framerate.

I've been tracking video card reviews for years. Typically the performance of a GPU doesn't change much subsequent to its introduction. What would be the value in doing a subsequent review?

Most of the top review sites keep a generation or two of older chips in their comparisons. Some even compile regular guides on value and midstream priced parts. If you can't find information on cheaper video cards then you aren't looking hard enough.

Nice of them to cut the price. I would have liked them to keep the SKU so I didn't have to keep up with another one. Although I suppose if they hadn't rebadged it, everyone who bought the 6800 would be pissed at the price cut.

Although I suppose if they hadn't rebadged it, everyone who bought the 6800 would be pissed at the price cut.

Isn't that what happens with technology... prices go down? I got a 6800 for Christmas last year, a Black Friday CompUSA deal for 200 bucks after rebate... By this time, I'd almost expect it to be down around 100 bucks.

Also, on another topic: on some of these cards you can use RivaTuner to unlock the extra pipes and pixel shaders, too... great if it works, but of course it's not guaranteed.

It's not simply a 'rebadged' card. Not only did they bump the clock speeds from the 6800's 325MHz core and 700MHz memory to a 425MHz core and 1000MHz memory, they also switched from DDR to GDDR3 memory to achieve the new memory clocks. This is as much of a difference as there is between the 6600 and the 6600GT. It's not so much a price cut on the 6800GT as it is a clock-speed (and price) boost to the vanilla 6800 that brings its performance to the same level as the 6800GT while still keeping a lower price.
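To put rough numbers on the memory bump (assuming the 256-bit bus both boards use, and treating the quoted MHz as effective data rates), a quick sketch:

```python
# Effective memory clock (MHz) x bus width (bits) / 8 -> bytes per second.
def bandwidth_gb_per_s(effective_mem_clock_mhz, bus_width_bits=256):
    return effective_mem_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

print(bandwidth_gb_per_s(700))   # vanilla 6800: ~22.4 GB/s
print(bandwidth_gb_per_s(1000))  # 6800 GS:      ~32.0 GB/s, roughly 43% more
```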

We are often asked "Which video card should I buy?" We always answer with "well how much do you want to spend?" The inevitable reply is that everyone wants to run all the latest graphics-heavy games at high resolutions with all the features enabled, but they only want to spend $100 to $150 to do so. Sorry to say, but that's just not going to happen.
The real sweet spot for graphics is in the $250 to $300 price range.

I cannot express how frustrating this is. People, please do not spend more than $150 on a video card. This is just insane. I guess we do need people like this to keep the graphics market hot by paying $300 for a card. I just hope game manufacturers don't think that their games should require $300 cards.

What else are young gaming geeks going to do with their money? They live at home in mom and dad's basement with 100% disposable income; 300 bucks for a new GPU is nothing. It's a hot-rod culture: rather than Mustang parts, it's computer parts.

Yeah, you're partially correct. The thing is, most games I've seen don't *require* these hugely expensive video cards to play them. They only need them to run in "high detail", with all the "eye candy" options turned on. If you turn all that stuff down, the game will be quite playable on a much less expensive setup. But so many gamers can't stand the fact that a game can possibly overwhelm their computer, so they fork over the money to upgrade - and then complain about it.

I just hope game manufacturers don't think that their games should require $300 cards

Simple - OEM pressure. I can confirm this because I have a friend who works for Microsoft, and I asked him why it is that every year we are forced to upgrade. Can't you guys make do with what is already available? He told me that they can optimize the systems to run far better on existing hardware, but the OEMs don't like that. Dell apparently wants users to upgrade every 2 years or so. Bottom line: they don't care about the end user. They know that the end user will spend to use the latest and greatest software.

But Dell has basically zero bargaining power against Microsoft. What, are they going to sell all of their PCs without Windows on them? They'd go under almost instantly. For consumer PC operating systems, Windows is the only game in town right now. That means Microsoft can do whatever they want and Dell just has to take it.

It's a tradeoff. If, for example, Microsoft were to add useful features, improve stability & security, reduce the memory and disk footprint, and improve performance, then they would possibly get more money from people upgrading their old computers to the new OS. Right now, people think "upgrading means a new computer--that's too expensive!". If all they had to do to get a better-performing machine was to buy the new version of MS Windows, it would be a smaller sticker shock. And more people would want to upgrade.

Take a look at your argument from Microsoft's perspective. Every new Dell sold (nearly) comes complete with the Microsoft Tax. How many people get a new machine and think, hey, I'll just re-use my old XP license on this new machine and save money? Nobody does, and the option isn't even presented to them. The more machines Dell sells, the more money both Dell and Microsoft make. Same goes for any other OEM that's selling computers. Microsoft will NEVER improve the OS to the point where it makes old machines run well enough that nobody needs a new one.

This is especially true when the newest console is only $300. I like PC gaming more than console gaming, but in the last year I've switched to consoles because it's just so much cheaper. In about the time that a console stays around, 3 years, you'll upgrade your video card a couple of times, or upgrade it once and spend twice as much. Meaning that just the video card(s), not including all the other necessary upgrades, will cost as much as the console. I got tired of trying to keep in my head which video card is good, because there are about 75 models out there, and which one has the proper drivers to support the games I want to play. Also, what bothers me is that if I upgrade my operating system, my video card that is a few years old might not have supported drivers, or if I buy a new card, it may not work with my older operating system, forcing me to upgrade. I really gave PC gaming a chance, but there's just too much hassle. I'd rather put up with games that don't look quite as good, or maybe are a little less fun to play, for not having to deal with the frustrations of playing games on a PC.

It's especially untrue if you already need a moderately high-performance PC for other things. If you're going to have the monitor, the CPU, and the memory already, buying a $250 video card for gaming is $50 cheaper than a $300 console.

Like I'm one to talk, though... I buy the consoles and the video card. :)

Actually, that doesn't bother me at all - if people want to spend $300 on a GPU, more power to them. What bothers me is that the headline presents a $150-$200 GPU as 'affordable'. $200 for a GPU is a lot of money for some people, and this is especially true outside the US, where exchange rates and taxes come into play. I was expecting a sub-$100 GPU, a la the FX5200.
BTW, I do own an FX5200 and I'm able to play Quake 4 with special effects perfectly fine on it - yes, at 640x480, but it still looks and plays fine.

Once again, I need to upgrade my video card (ATI Radeon 9800 Pro AIW; 128 MB) just to play the newest and upcoming games even at 1152x864 resolution with all graphic options at maximum. I have had to do this upgrade every one to two years ever since 3D cards were born (a Diamond Monster 3D/Voodoo1 card was my first)! At the same time, I am stuck with the AGP slot on my motherboard since I am not upgrading it any time soon. It looks like I am aiming for a GeForce 6800 (128 MB; AGP) to buy in a few weeks.

Instead of dropping $300 on a new console (i.e. XBOX 360, PS3) every couple of years or so, I'll use the $300 toward a new GPU since I prefer gaming on my computer. So, in the end it works out to be the same thing.

Give it time. Remember, graphics co-processors entered the game quite a bit after their general processing counterparts.

Just as desktop CPUs are leaving the era of high-heat, high-power, balls-to-the-wall performance busting, GPUs are entering it. I'm sure when people start to realize their 1GHz graphics card has a cooler bigger than their old P4's solid 400g piece of aluminum and a fan louder than a trainwreck, the industry will come to its senses.

but it seems that GPUs are destined to waste all the power [watts] modern CPUs are saving

This is largely because of the completely different design methods and timelines in the two fields.

CPUs are designed pretty close to the transistor level. They optimize the crap out of them, and try to do the most work with the fewest transistors. You have a lot of flexibility in changing the die size, the power consumption, and so forth. You can also ramp up the clock speeds to insane levels - 3-4 GHz currently.

Yup, all computer equipment now costs far more to operate than to buy. Electricity is not cheap. I'm waiting for someone to make a serious (everything I've seen is a toy/junk) Mac mini-like AMD64 box. Apple is going to do that soon, and they are gonna sell a billion of those if they also run Linux and Windows, and there is no reason they wouldn't.
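As a rough sanity check on the "costs more to operate than to buy" claim, here is the standard watts-times-hours arithmetic. The 300 W draw and $0.10/kWh rate are illustrative assumptions, not measurements:

```python
# Energy cost = watts drawn x hours on x electricity rate.
def yearly_cost_usd(watts, hours_per_day, usd_per_kwh=0.10):
    return watts / 1000 * hours_per_day * 365 * usd_per_kwh

print(yearly_cost_usd(300, 24))  # an always-on 300 W box: ~$263/year
print(yearly_cost_usd(300, 4))   # four hours of gaming a day: ~$44/year
```

At ~$263/year, an always-on box really can overtake its purchase price within a few years; a machine used a few hours a day won't.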

I'm just fine with no special effects and 10 FPS. That my PC sounds like a small engine and puts out enough heat that even in the winter I have to open the window is another matter.

Granted this is a rough approximation, but it seems that GPUs are destined to waste all the power [watts] modern CPUs are saving.

You have a point, although the biggest need for low power CPUs is for laptops (i.e. running on batteries) which you typically wouldn't use to play 3-D games anyway, if they could even contain the high-powered GPUs to which you're referring.

Although this is indeed a good use for low-power CPUs, laptops are still only used a fraction of the time, so they basically only save battery power. However, I believe that lower-powered servers, which operate 24/7, would benefit everyone and the environment more, even though they are arguably fewer in number.

Yeah, they lost me after the GeForce 4. And by the way, I'm still using a GeForce 4 because of the cryptic naming scheme and the lack of comprehensive reviews for the cards that don't cost a fortune. It's the same with AMD and Intel and their new naming schemes. It's harder to tell now which components are newer than the others.

Specifically, the "Compare Cards" feature on the left. I just upgraded my ATI 9600XT to an nVidia 6600GT AGP (because I'm not yet ready to drop a grand on an all-new PCIe 64-bit system), and that site helped me decide what was "enough" of an upgrade for how much money I was willing to spend.

What's also EXTREMELY frustrating is that most review sites only benchmark all the new cards against each other. When I'm looking to upgrade, I need a comparison to older cards (i.e., one I might own; currently an ATI Radeon 9000 Pro) to judge not only how fast a card is versus the competition, but also how much faster it is going to be versus what I currently have. It does me no good to know that the GeForce FX Platinum Value series is half as fast as the normal series at 1/4 the cost if I have no idea how either compares to my current card.
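One workaround, sketched below, is to chain relative scores across overlapping reviews: if one review covers your old card against a mid-generation card, and a newer review covers that card against the latest ones, multiplying the ratios gives a rough estimate. The card names and ratios here are made-up placeholders, not benchmark data:

```python
# Pairwise speedups harvested from (hypothetical) reviews: (old, new) -> ratio.
ratios = {("9000 Pro", "9600 XT"): 1.5, ("9600 XT", "6600 GT"): 1.6}

def relative_speedup(path):
    # Multiply the pairwise ratios along a chain of cards.
    speedup = 1.0
    for old, new in zip(path, path[1:]):
        speedup *= ratios[(old, new)]
    return speedup

print(relative_speedup(["9000 Pro", "9600 XT", "6600 GT"]))  # ~2.4x the old card
```

It's crude (different games, drivers, and test rigs per review), but it beats having no baseline at all.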

This is great, but the title seems like an oxymoron at first (NVIDIA = cheap?). They have made cheap video cards in the past that were crippled and performed poorly (the GeForce 4 MX cards). A good NVIDIA card used to cost half the price of an affordable computer, around $400. The last time I checked, all the value cards were around the $100 price range. I hope they can actually make something that's cheap and decent.

You can probably get that previously $400 GeForce 4 card now for around $80. Probably would be more than enough for most people.

Now, don't believe the hype: games work fine on older cards. I have a 9800 Pro at home, and I haven't yet encountered a game that's a problem. No, you can't crank the resolution and details and such, but it plays all games, even new ones, fine.

The only game I've seen that eats a baseline card is F.E.A.R. From what I can tell, this has a *lot* to do with possibly bad programming, as the graphics, compared to many other games I've played (just fine, thank you, on an FX5200), really do suck even on the higher-end cards.

You say you don't need a new video card at all. You say yours works fine. Great. Don't get a new video card.

I don't know how much time you spend playing graphics-heavy games (e.g., the new first-person shooters; Battlefield 2, for instance). I spend quite a bit of time with them. Computer gaming is one of my hobbies. Do I _need_ a top-of-the-line graphics card? No. I don't. I bought one because it makes my hobby more enjoyable. I didn't _need_ a Les Paul guitar, either, just to bash away at it in my spare time.

Go ahead and blame the industry. These web sites, magazines, etc. are all the same. Their "reviews" are just buzzwords, hype, and an advertisement for whatever new gadget.

Sure, some of those sub-$200 cards deliver more frames per second per dollar spent, but they're generally just not fast enough to run those really graphics-intensive games at decent resolutions, unless you're willing to go into your game options menu and turn the details down to "medium."

I just hopped over to newegg.com and they were listing the first 6800GS for $209. The lowest priced 6800GT is $269. The lowest priced 256MB 6800 in PCIe is $209 (there are cheaper 128MB cards on AGP, but I wanted to keep the numbers relevant). With the performance being nearly identical between the GS and the GT, the result is a 20% drop in the price at this level of performance (or a major boost in performance at the $209 level). Either way, I think it's fair to call it low cost, as long as you qualify the term.

Is there a technological reason why multiple GPUs can't be put on one card? I freely admit I know very little about graphics cards, but it seems like it might be a cheap way to make a very powerful card. I seem to remember there was a card with two processors that failed dismally because it was basically twice the price. What about a card with 4 or 8 cheap processors? OK, the power consumption would be silly, but as long as it could be throttled so that only 1 GPU was used when not playing a game, it might work. Just thought I'd share that with you all :o)

Um, been done many times before. Not only do you have SLI, which combines two cards, but there are several "SLI on a single card" monsters with two GeForce 6600s or 6800s on a single card. The first dual GPU card was way back in the day, I think it was an ATI Rage. Also, Creative makes high end workstation graphics, and they have a non-SLI dual GPU card. Are you talking dual core? Well, it will probably be done soon enough; the problem is that the software support for multiple GPUs is really crappy (SLI is really not that practical for everyday use). Now, at least with PCIe, the hardware restrictions imposed by AGP are gone. I would expect to see something within six months, probably from SiS. It might take a little while longer for nVidia and ATI to come out with a dual core card, although I'm sure it will perform better.
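For illustration, here is a toy sketch of the split-frame idea those multi-GPU setups use: each GPU renders a band of scanlines and the bands are stitched back together. This is purely schematic; real SLI load-balances the split point per frame, and `render_rows` is a stand-in for an actual renderer:

```python
import numpy as np

def render_rows(start, stop, width=640):
    # Stand-in for one GPU rendering its horizontal band of the frame.
    return np.full((stop - start, width), start, dtype=np.uint16)

def split_frame_render(height=480, gpus=2):
    # Give each GPU a contiguous band of scanlines, then stitch the bands.
    bands = np.array_split(np.arange(height), gpus)
    return np.vstack([render_rows(int(b[0]), int(b[-1]) + 1) for b in bands])

frame = split_frame_render()  # a 480x640 frame assembled from two GPU bands
```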

Once upon a long time ago I worked for Control Data Corporation (anyone remember them?). CDC had a trick, which wasn't new to them, of re-badging essentially the same system with a new model number and a lower price. An example at the time was their popular CDC 3300 mainframe becoming the CDC 3170. The only difference between the models was that the CDC 3170 had a 1.75µs clock, compared to the CDC 3300's 1.25µs clock. Move one wire (the right wire!) inside and the CDC 3170 became the CDC 3300 in all respects except for the name badge on the equipment bays and console.

Why do this, I wondered? The problem was in government contracts. After you'd paid back the design costs, additional computers could be pumped out at a cheaper price while still making a profit and remaining competitive. The fly in this ointment was the government, which often bought quantities of the earlier models when cost was not the first concern (when has cost ever been a concern to governments spending tax money?). I was told that government contracts stipulated that if you ever lowered the price on something you'd sold them, you had to rebate the entire difference on every system delivered. Of course that would bankrupt any company, so they resorted to this rather transparent subterfuge.

While I don't doubt this is true, I heard a similar story 20 years ago about a company that sold two models of mini-computers. Apparently the only difference between the two models was a wire that had to be clipped to achieve the higher performance.

The only problem is I've heard the story told about 10 different ways. I'm wondering if it's actually apocryphal?

There is plenty of precedent to back this up. The original 487 was a full-fledged 486 with the FPU enabled, and most modern processors are actually made in the same process. The "ideal" few processors are the ones rated for the higher clock rates [that's not universally true; eventually you need a new design to get higher rates]. In that case, though, it's because lithography is not a perfect process, and errors [e.g. skew, heat, etc.] can make a chip unstable at higher rates. That's why you'll see "worst case 100C" limits in the specs.

Notice that it has fewer pixel pipes. There are 4 blocks of 4 on a 6800 series chip, and one of those is disabled. However, the chip is clocked faster. My guess is they have found that a number of chips still have one of the four blocks fail, especially at higher speeds. OK, so just make a new line of cards that only has three blocks active at a higher speed and sell it. Gamers are happy, and you get to use more of your production capacity.

This is an NV42 chip, not NV45. It doesn't have 4 quads, just 3, not to mention it's made on the 110nm process. Nvidia is selling this because a 110nm chip with 3 quads is a good deal cheaper than a 130nm chip with 4 (the 6800GT).
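The quad-disable-plus-clock-bump trade-off is easy to put numbers on: theoretical pixel fill rate is roughly active pipelines times core clock. Using the commonly quoted clocks (350 MHz for the 6800GT, 425 MHz for the GS):

```python
# Fill rate scales with (active pipelines x core clock), which is why
# disabling one quad but raising the clock roughly cancels out.
def fill_rate_gpix_per_s(pipes, core_mhz):
    return pipes * core_mhz * 1e6 / 1e9

print(fill_rate_gpix_per_s(16, 350))  # 6800 GT: 4 quads at 350 MHz -> 5.6 Gpix/s
print(fill_rate_gpix_per_s(12, 425))  # 6800 GS: 3 quads at 425 MHz -> 5.1 Gpix/s
```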

While I've been enjoying my 6800GT and 7800GT cards, I'm worried by the fact that ATI can't seem to keep up. Ever since they lost the dominance they had acquired with their 9700/9800 series, they've been behind in performance, street dates, availability AND prices. It's already been 2 generations now. Any gamer knows that, today, nVidia reigns supreme.

I hope that ATi regains the upper hand in the next round because things are looking grim for them. nVidia is a bigger company with bigger coffers and better marketing skills, so they can withstand bad times more easily than ATi. They handled the whole 5700/5800/5900 debacle very well considering ATi's offerings ate them alive back then. God forbid ATi should go bankrupt and we end up with a de facto nVidia monopoly!

Except ATI is providing chips for console manufacturers, and probably will make plenty of money.

I'm worried by the fact that ATI can't seem to keep up. Ever since they lost the dominance they had acquired with their 9700/9800 series, they've been behind in performance, street dates, availability AND prices. It's already been 2 generations now. Any gamer knows that, today, nVidia reigns supreme.

Two generations - with one generation being six months, one year isn't that much to worry about. Who knows what new DirectX/OpenGL extensions will be invented in the future: superbuffers, real-time ray-tracing, and so on.

It seems to me that console chip development must draw budget away from getting the absolute fastest card out in the shortest time, because once a company starts working with consoles, the other guy gets that 1% faster card out a month before you.

It happened to nVidia and it's happening again to ATI. It probably would have happened to Voodoo too if they hadn't self destructed before console companies realized that they couldn't develop everything in-house anymore.

They also currently win in GPU volumes (and revenue). Intel is the largest GPU manufacturer, followed by ATI, with NVIDIA a close third. Thing is, ATI and Intel currently have greater OEM/integrated market penetration than NVIDIA.
So while Intel is the biggest, does anyone actually know someone who can game on one of those integrated decelerators?

It was the cheapest "non-crap" PCI-E card from nvidia I could find. And you know what? It plays Far Cry, Thief 3, Battlefield 2 and the others JUST fine.

This bullshit article about "needing a 6800GT to enjoy the games" is just that: bullshit. Sure, the game may look shinier at 1600x1200 with 200fps and a billion texels/sec or whatever... But if that's what it takes to make the game "fun", we're obviously not playing the same games.

Point is, this article is all about selling the latest bullshit cards you don't need. A 6600 will do you just fine if you're an average gamer [e.g. you have REAL work to do the rest of the day]; it can play games at 1024 and 1280 reasonably well [very well at the former].

If you're on a budget and you think you need to spend $250 USD [keep in mind the $179 I'm talking about is Canadian, not USD] to enjoy games... you need a few moments of education :-)

This is just a press release disguised as a 30-page article [chock-full of ads, no less] to sell the latest and greatest...

The 6600 isn't a bad card, and if you're on a budget I'd totally recommend it. On the other hand, you can get significantly better performance for more money. Now, I'm not saying that these games aren't fun at 1024x768 with dynamic lighting turned off, blob shadows, and "medium" resolution textures, but it's still like the difference between watching a movie on an old television versus seeing it in a theater.

If you have the money, you can make your games look significantly better for the price of two games.

Ah, but enjoyment is relative, isn't it? I average about 30 minutes per day of Battlefield 2, and occasionally will play through another game (I'm on interval 2 of FEAR right now). My previous system was an Athlon XP 2500+ w/ a gig of RAM and a 6600GT AGP. While it ran the game at 1280 (my LCD's native resolution), when involved in firefights my minimum framerate typically dropped to below 10 frames per second. It just got too frustrating for me to be continually killed, not because I lacked the skills, but because the framerate bottomed out.

The cheaper model has 12 instead of 16 pixel shaders, and 5 instead of 6 vertex shaders. They probably use the same chip. The benchmarks are close. $17 cheaper. Big deal.

In terms of price/performance, Via is probably the leader. They've just introduced some new S3 Chrome [techspot.com] boards that are roughly comparable to the GeForce 6800 line, but are priced around $150. That technology will probably be in Via's motherboard chipsets soon, at an even lower price.

Okay, I'm not a gamer, so this pretty much eliminates me from the target market of any of the new video cards. But I am willing to pay quite a lot for a video card that does things I am interested in. Such as:

- Video acceleration: full MPEG decoding (not just iDCT+MC offload) for MPEG-2, like the Unichrome video chips do. Full H.264 decoding is even more important, given its growing popularity and huge CPU requirements.

- Open source drivers, with full functionality, and good Linux support.

Disclaimer: I make extensive use of both nvidia and ati hardware under GNU/Linux.

Nvidia is really the only way to go for 3D in Linux. If you really only need 2D, I've heard good things about the old Matrox cards, but good luck finding one.

Not true. The proprietary ATI drivers (currently version 8.18.8) work as well as the nvidia drivers on both my amd64 and x86 boxes. Nvidia works fine (except for incessant flickering at 1920x1200 on one machine), as does ATI (with no flicker on that machine). ATI works better at 1920x1200@60Hz, but nvidia draws specular highlights on a celestia-rendered hi-res Earth better than ATI. In short, it's a wash, with each manufacturer/driver having strengths and weaknesses the other does not.

The choice these days is one of personal preference. Your comment is at least a year behind the current state of the art, at least in the GNU/Linux world.

Too bad you make no mention of the lackluster performance of the ATI drivers for Linux; it seriously sucks compared to the Windows drivers. Sure, you get hardware-accelerated 3D with the drivers, but it's laughable how they perform.

I really wanted to keep my 9800 Pro, but this GeForce 6600GT just performs worlds better in 3D under Linux, and it performs just about equally in Windows. Plus the drivers are a bit of a PITA under Linux, imho, but that's just me.

I'm running a blank GNOME desktop (just the default install; haven't customized anything) on an Athlon T-bird 750 with half a gig of memory. It's not grinding disk or anything. It really looks to me like a lack of 2D acceleration. Also, while being impressed by the Xscreensaver demos, I noticed that some of them displayed artifacts (triangles with one vertex stuck to the left side of the screen). I figured this was due to bad OpenGL support on the card, which also led me to blame the card.

Are you sure that your driver is actually functioning and that you're not using VESA drivers? I've used Linux on all sorts of cards (ATI, S3, Trident, Nvidia, 3dfx, and probably a few more), and I've never had one result in laggy X11 in 2D (3D is another story entirely, though).

The r128 driver is listed in the X config file that was generated on install. Also, I'm pretty sure it's using hardware acceleration for the 3D effects, at least, since some of the OpenGL-based xscreensaver hacks suffer from severe artifacting - software rendering would be slow, but correct.
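If you want to double-check what actually loaded, rather than what the config file asks for, the X server log records every module load. A small sketch (the log path is the X.org default; older XFree86 installs log to /var/log/XFree86.0.log instead):

```python
# Scan the X server log for driver load lines, to confirm whether r128
# or the fallback vesa driver was actually loaded.
for line in open("/var/log/Xorg.0.log"):
    low = line.lower()
    if "loadmodule" in low and ("r128" in low or "vesa" in low):
        print(line.rstrip())
```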

I clearly should have discovered x11perf before writing this post, so I could at least have some numbers to complain about. I can't even tell if the problem is GNOME or X in general. I suppose I can run some x11perf benchmarks and compare them to... something.

I would highly suggest getting a card from the NVIDIA FX series. The Linux desktop of tomorrow will require more and more 3D acceleration. IMO, you are simply wasting money if you buy anything cheaper. I would recommend the FX-5200 (which is what I have, btw). You should be able to pick one of these up for around $50 (or less) + shipping.

I'm still real happy with my FX 5900, and I'm not planning on upgrading it anytime soon. If you can find one of these they are much better than the other 5xxx series. I've always used Nvidia cards myself, and they work nicely in Linux.

If it's just for X Windows, I'd suggest an nVidia FX5200 card, because right now they are very cheap/budget-friendly. I use one at home and it runs UT2004 and other OpenGL apps/games/screensavers beautifully. I wouldn't recommend it for Windows games anymore, though, but for Linux it's been a beauty of a card.

My laptop has a mobile FX5600 chipset in it and runs great. My desktop ran very happily on an FX5200... good enough to run games like Half Life 2 and Quake 4 with ease (main limitation being system RAM).

The FX5200 is a sub-$100 card now... starting to get a bit dated for new games but still good for current offerings as well as desktop.

I'll make sure to try that out when I get home. So I just apt-get install xdm, and change... some config file to point to xdm instead of gdm? I'm sure it's something in /etc/X11 that I don't know off the top of my head. (I'm not looking to install the whole KDE megillah.)
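For what it's worth, on Debian-style systems the active display manager is just the path stored in /etc/X11/default-display-manager, so the switch is a one-line config change. A sketch (to be run as root; `dpkg-reconfigure xdm` offers the same choice interactively):

```python
# Point a Debian-style system at xdm instead of gdm by rewriting the
# display-manager choice file. Run as root; keeps a backup first.
from pathlib import Path
import shutil

dm_conf = Path("/etc/X11/default-display-manager")
shutil.copy(dm_conf, dm_conf.with_suffix(".bak"))  # back up the old setting
dm_conf.write_text("/usr/bin/xdm\n")               # the path to the xdm binary
```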

I have a GeForce 4 at home and a GeForce 6800 GT at work. Both work very well under Linux. No, it's not open source, but the installation program compiles a custom interface if it can't find a standard one that will just work.

It took me 17 keystrokes and 2 mouse clicks to Google "acronym finder" and look up SKU to determine what it meant (I'm not going to do your homework for you). You know, Google's not that hard to use - try it some time!

The AC is correct. The fastest, last AGP card from ATI was the X850 XT PE. If you want anything faster or newer, it's only offered in PCI-E. To be frank, this pisses me off. There is a whole market of people running fast CPUs and DDR 3200 memory who do NOT want to swap out their motherboard. I cannot imagine why in the hell the current crop of video chipsets cannot handle the bandwidth provided by AGP 8x. I mean, clearly there is a market for AGP cards.

I'm sorry, but I will not swap out my CPU and motherboard just so I can install faster cards only available in PCI-E.
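For reference, the peak-bandwidth arithmetic behind the two buses, from their published specs (AGP: 66 MHz base clock on a 32-bit bus with an 8x data strobe; PCIe 1.x: 2.5 GT/s per lane with 8b/10b encoding):

```python
# AGP 8x: 66 MHz base clock x 4-byte bus x 8x strobe, one direction.
agp_8x_gb_s = 66e6 * 4 * 8 / 1e9                  # ~2.1 GB/s
# PCIe x16: 16 lanes x 2.5 GT/s x 8/10 encoding overhead, per direction.
pcie_x16_gb_s = 16 * 2.5e9 * (8 / 10) / 8 / 1e9   # ~4.0 GB/s each way
print(agp_8x_gb_s, pcie_x16_gb_s)
# Since textures mostly stay resident in video memory after upload,
# few games of this era saturate even AGP 8x.
```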

Same here, except I got a Shuttle XPC, the last generation of AGP. I also got a 6600GT as a stop-gap; on the other hand, I might not upgrade for a good while anyway, but I wish I could be a little more future-proof.

Dude, that chart is terrible. You forget that the GPU is only one piece of hardware, and there are other components using the power supply. Using 200 of 300 watts is decent, but it isn't going to cut it. What about the CPU, case fans, and the CD-ROM that ALL games now require? How about they rework the way the GPU functions - something similar to what Intel did to make the Pentium M.

Look at this [anandtech.com] website to get another look at power consumption.
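In the spirit of that comment, a back-of-the-envelope power-budget check. Every wattage below is an illustrative guess for a 2005-era gaming box, not a measured figure:

```python
# Sum estimated component draws and compare against the PSU rating.
components_w = {
    "CPU": 90,
    "GPU": 110,
    "motherboard + RAM": 40,
    "drives + fans": 40,
}
psu_rating_w = 300
total = sum(components_w.values())
print(f"{total} W of a {psu_rating_w} W budget: "
      f"{'OK' if total <= psu_rating_w * 0.8 else 'cutting it close'}")
# Staying under ~80% of the rating leaves headroom for load spikes.
```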