ATI’s Radeon 4890 kicks NVIDIA where it hurts

Another month, another round of GPUs, and ATI's HD 4890 may preview more than …

Ever since it launched the HD 4000 series last summer, ATI has been sitting more or less on top of the GPU performance market. The GT200 series of cards haven't been bad products, per se, but the HD 4600 and 4800 lines have made it difficult for NVIDIA to create and maintain any serious momentum in the OEM space. The Green Team's fortunes seemed to be changing a bit recently when the GTX 295 arrived and took overall top honors as the best single-card/dual-GPU solution on the market, but these cards typically account for only a fraction of either company's earnings. ATI's new HD 4890, which arrives today, is meant to solidify the AMD subsidiary's position around the $250 price point—and, of course, hopefully entice a few of you to upgrade.

There are two ways to view the HD 4890. NVIDIA, of course, would point out that the 4890 is nothing more than a faster 4870 with some additional RAM bolted on the side to justify a higher price. The fact that this actually is a fair description of the HD 4890 doesn't help ATI's launch seem any more noteworthy—but we've not yet explored all the facts of the situation. Shall we have a look-see?

Quick, can you tell them apart? I can—but only because I've got access to secret information and UV glasses. For a real look at what's going on underneath each hood, have a look at the spec sheets for each:

Eyeball the above, and the Radeon 4890 does look an awful lot like an overclocked HD 4870, but ATI insists it has done quite a bit more under the hood than these specs show. According to the company, it has redesigned and retuned the entire GPU for higher clock speeds, lower power consumption, and reduced power leakage, all of which leads to the board's lower idle power. The fact that the 4890 cuts the 4870's idle power by one-third is no small achievement. One small error on this sheet that ATI notified us about, however, is that the transistor count on the new card is actually slightly higher: 959M rather than the previous 956M.

Performance

Anandtech has a great review of the 4890 out where the card is put through its paces, specifically against the GeForce GTX 275 (NVIDIA's answer to the 4890). The good news, for those of you who don't have a strong brand preference one way or the other, is that the two cards compete quite well against each other, with NVIDIA's new drivers adding a performance boost in some areas and a new in-driver ambient occlusion feature across the board. Part supply is a perennial question, and here AMD seems to have an edge—the HD 4890 is already widely available, while NVIDIA's GTX 275 is going to trickle out and hit the market more gradually.

Anand also takes a look at CUDA, its promise, and the reality of shipping applications—the article, in my opinion, is worth a read for that information alone. NVIDIA has put a huge push behind CUDA (not without some reason) and we're starting to see applications ship that take advantage of its capabilities—but that doesn't mean the feature is ready for prime time or a major factor behind card purchases.

By the pricking of my thumbs...

Here's where I'm going to go out on a limb a bit. At its heart, the HD 4890 is a respun HD 4870. This type of respin is something a company normally does when moving from process to process—it's actually relatively rare for a processor to shrink from, say, 90nm to 65nm without some modifications to the core layout or design. Such changes may not impact performance in any way, but they can increase power efficiency, reduce thermal hotspots, or make future modifications to the architecture (adding cache, for example) much more practical.

We know ATI has already debuted processors on 40nm technology in the form of the Mobility Radeon HD 4860 and 4830. Now that we see the company respinning the 4870 into the 4890, it's not hard to play connect-the-dots. My bet is that ATI simply chose to perform the task of rearchitecting the GPU and die-shrinking it in two separate steps, with the rearchitected version (HD 4890) hitting the market first. Given how well the HD 4890 has worked out on 55nm, Team Red should be able to take its time bringing its highest-end parts to 40nm. I don't know about y'all, but the prospect of an integrated GPU based on 40nm process technology and ATI's next-gen desktop part is downright sexy-sounding.

39 Reader Comments

It's really good to see ATI kicking ass and taking names. I am unfortunately stuck with an SLI motherboard, so my only upgrade path right now is to go with the 285s paired up. I am not sure if I should even look at multi-GPU solutions... but if I had a vendor-agnostic motherboard I would go with a couple of 4890s in Crossfire, no doubt!

I'd really like to see ATI do more with its stream computing. It seems Nvidia has more mature offerings.

Yeah, I did some research, and ATI's chips are even better suited to DP FP operations (the kind you'd see in scientific GPGPU applications). However, based on some synthetic benchmarks I dug up, the GTX 280 still beats the 4870. I would presume it's a driver issue at this point.

I try hard not to be a fanboy, but the 3-4 nvidia cards I've owned over the years never gave me any trouble. The one ATI card I had overheated all the time and had craptacular drivers (remember the Catalyst Control Center thing that required .Net but never worked regardless of whether .Net was installed?). My computer was very unstable with the ATI card. Oddly, when I replaced it with an nVidia card it was stable again.

So regardless of which company holds the "performance crown" at any given moment in time, one has disappointed me 100% of the time I tried it, the other has disappointed me 0% of the time. I know what brand I'm going to next time I upgrade...

I only have my existing ATI card because the day I visited the shop they didn't have the nVidia card I wanted, but I'm really glad now that I've been able to take a look and see what good both companies can provide.

Bring on the competition, because it can only be good for us as consumers. Luckily I have no particular affinity for either company and will go with whichever provides the best experience for my money, so unless the product landscape changes before the end of the month, I'll be getting the 4890.

People go through that with hard drives: If a WD drive fails and you were too lazy to back up it must mean WD is a shoddy, data-stealing weasel company. But the guy next to you had it happen with a Samsung drive, and the woman down the hall lost everything on her Seagate. Every brand fails for somebody.

I've had no trouble with either nvidia or ATI cards, so I'll conclude both vendors are 100% reliable -- it makes exactly as much sense.

Originally posted by LlamaDragon: The one ATI card I had overheated all the time and had craptacular drivers (remember the Catalyst Control Center thing that required .Net but never worked regardless of whether .Net was installed?).

You can draw literally ANY trend line you want through a single data point.

Your Highness, that really depends on just what it is that's integrated. If ATI or NVidia could integrate a high-end, current-gen GPU directly with the CPU within the next 6 months, I'd probably jump on it (whichever is cheaper), especially if you could add in a second card and choose to either disable the integrated GPU or run SLI/Crossfire with it.

I don't know about ya'll, but the prospect of an integrated GPU based on 40nm process technology and ATI's next-gen desktop part is downright sexy-sounding.

Sorry, but integrated GPUs are not sexy, they are pathetic.

Well, that depends. If you're a (hardcore) gamer, then yes they are. If you're just a casual gamer with a HTPC (on a 720p TV for instance), then consider this:

- AMD's 780G is based on the "old" RV620, with 40 SPUs.
- When AMD moved from RV670 to RV770, they shrank the SPUs by around 40%, and through an increase in die size, moved to 800 SPUs (from 320).
- The newer RV710 has only 80 SPUs, though.
- Theoretically, shrinking a die from 55 nm to 40 nm divides its area by 2.
- On the RV770 (don't know about RV620), SPUs make up about 40% of the die.

I think you see where I'm going with this... by my rough guesstimate, a 40 nm IGP would have at least 160 SPUs, perhaps 240 or even as much as 320 without being significantly bigger than the good old 780G.

In other words, in terms of raw processing power, we could see IGPs roughly equivalent to a Radeon HD 3600 (120 SPUs), or even 3800/4600 (320 SPUs). That's not so pathetic now, is it?
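The arithmetic behind that guesstimate can be sketched in a few lines of Python. To be clear, every input here is a forum assumption pulled from the bullets above (node sizes, the 40-SPU baseline, the ~40% SPU shrink, the 40% die fraction), not an official AMD figure:

```python
# Back-of-envelope sketch of the IGP guesstimate above.
# All inputs are forum assumptions, not official AMD figures.

def shrink_factor(old_nm: float, new_nm: float) -> float:
    """Ideal density gain when moving between process nodes
    (area scales with the square of the feature size)."""
    return (old_nm / new_nm) ** 2

scale = shrink_factor(55, 40)   # 55 nm -> 40 nm: ~1.89x density

base_spus = 40                  # 780G-class IGP baseline
spu_area_saving = 0.40          # RV770-style SPUs assumed ~40% smaller

# Same silicon budget for SPUs, denser node, smaller SPUs:
est_spus = base_spus * scale / (1 - spu_area_saving)
print(round(est_spus))          # ~126 SPUs at the same die area
```

Under these assumptions the SPU budget roughly triples without growing the die at all; allow for a modestly larger IGP and the 160-320 SPU range the commenter lands on looks plausible.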

Of course, compared to a real HD 4670 it will remain bandwidth-limited (the 4670 already is), but the possibility of adding "sideport memory" will partially compensate for that.

I built 2 780G machines in January, one for my wife and one for my mother. My wife's is built off of an Asus mini-ATX board and my mother's off of a Foxconn board. Both machines run Win 7 64-bit with full Aero, both built in Antec Sonata III cases with 80 Plus-certified 500W PSUs. They consume 40-50 watts in 'sleep' mode, 10-15 watts in 'hibernate', and 80-90 watts total in full use (measured with a Kill-a-watt, good enough for a rough estimate of power-bill impact). Both have 4GB RAM, run Win7/Aero with aplomb, and are even capable of light gaming. Tack on the inexpensive LCDs I bought for both machines (entry-level ViewSonic 19") and total power consumption for the entire system is about 120-140 watts under use (my mother has already turned the brightness up to about 85% so she can 'read' better, which basically puts the backlight near maximum).

My wife's PC is probably going to get a 4850, since she only plays a few games (including Left 4 Dead), but the machine is actually perfectly capable of playing all of her games off of the 780G IGP as-is, just with features turned down (Left 4 Dead is set all the way down except for paged memory settings). I'm really looking forward to seeing how the 4850 and IGP do 'hybrid' Crossfire. I.e., I'd like to see what the sleep/hibernate/idle power consumption is with this card in place.

It's nice to know that I *could* give her a 4890 if I so chose, and it's also nice to know the 780G is capable enough on its own for casual use.

I bought a 4850 due to its sheer price/performance ratio. It overheated and died on me within a month, leading to a month of arguing with the manufacturer about an RMA (I won eventually). When I got it back, they had installed a dual-slot heatsink on it WITHOUT ME ASKING, which is quite indicative that they KNEW they had an issue with temperatures / an inadequate stock heatsink.

Also, the holdup over the RMA was due to a missing sticker that they insisted was essential, and guess what, the sticker had fallen off the card (geez, a sticker falls off something that goes up to 70 degrees plus? you'd never have thunk it) and got lost in the case behind the mobo. I eventually dug it out and then they relented on the RMA (and gave me that nice dual-slot heatsink to boot).

Every single Catalyst driver upgrade has been nothing but trouble, including several where, to avoid a BSOD upon login, you have to boot into safe mode and disable an ATI service that the installer activates (which you have no option of excluding); then it works. And yes, turning off that service has zero impact on anything (so WTF was it for?). Read the ATI forums or any graphics card enthusiast forum (overclock.net and the like) if you don't believe me.

Finally, Crossfire is a complete dog compared to SLI, at least in the getting-it-to-work department, and also in driver support (and upgrading to the latest Catalyst drivers to get Crossfire to work with game X... you see the chicken-and-egg situation). Why do I have Crossfire? Because I had to wait a month for my RMA and I didn't even know whether I'd succeed, so I bought another 4850 and custom-modded an Accelero heatsink onto it. Guess what: both the 4850s I have now, with custom heatsinks on them, have zero issues aside from drivers. THEY KNEW THE 4850s HAD DODGY TEMPS FROM THE START AND DIDN'T DO A THING ABOUT IT.

Based on those experiences I am not touching ATI with a 20-foot pole for my next upgrade, dodgy Nvidia review shenanigans or not. At least when I install an Nvidia card and its drivers, it works.

Every single catalyst driver upgrade has been nothing but trouble, including several where to work without a BSOD upon login ... Read the ATI forums or any graphics card enthusiast forum (overclock.net and the like) if you don't believe me

Were you overclocking the card or CPU, or using aggressive RAM timing?

I've had no problems with Catalyst drivers since I got my 4870 last June, but I run all of my parts at stock speed.

...and next week Nvidia will have something faster. Then the following week ATI makes something better. Rinse and repeat, ad nauseam. Who gives a crap anymore? I'd be more impressed if a different company made a card faster than (or AS FAST as) Nvidia's or ATI's. Otherwise, use whatever brand you're a fanboy of.

No, for home theatre you need a card that has modern codec support in-gpu/driver and runs cool, preferably fanless.

As for the 4850 problems above, you should note that cooling is (1) dependent upon your specific installation's airflow and (2) often just as problematic on Nvidia cards. In Nvidia's case, the only time you'll notice GPU issues with the heat is when the card has marginal RAM on it (which does happen), but the stock fan speeds are set so low that the cards generate a LOT of slowly dissipated heat when gaming.

I'm with you on the driver issues though, I won't run ATI in my primary machines for exactly that reason. My wife & mother don't have a clue *what* driver they're running, they only care that it's stable. Incremental support for the latest titles & gaming features can be ignored in favor of a known-stable driver.

Dave simmons: my CPU is OCed, but I never OCed the GPU or RAM, software- or hardware-wise.

RAM timings and all voltages are at stock/auto; the CPU OC is a simple FSB change, and I left everything else on defaults.

Oh, and I forgot: that's my current PC. The PC the 4850 blew up in was stock in every single way.

Valis, the case is a Coolermaster CM690 (brilliant case) with three fans, including one blowing directly onto the GPU (side panel), so I'm confident it's not cooling; SpeedFan reported all other temps just fine. And in case you were wondering, Thermaltake 750W PSU, so not that either. Anyhow, with the custom coolers I'm getting ~40 degrees idle and ~60 degrees load on both cards, which is a good 20 degrees cooler than out of the box.

I'm very happy with the card's performance, just not happy with the issues caused by overheating with my stock card (which they may have ironed out with subsequent revisions or BIOS tweaks). Being an ATI user yourself, you will probably have read about the infamous fan speed tweak whereby, with the v1.0 BIOS and original drivers, the GPU fans would not spin up until a ridiculous threshold was reached; users were reporting 70-degree idle temps as not uncommon! I didn't do that tweak on my original card because, like the fool I was, I assumed that if I left everything on defaults and clicked next-next-next on driver installs, all would be well.

Llama's experience mirrors my own very closely. I've owned a GF2GTS, GF6800GT, GF7950GT, and most recently a GF8800GT without any trouble over the years. In between the GF2 and the 6800 I owned a Radeon 9800XT, which caused me some grief and ran extremely hot. Nothing excessive (I had to replace the fan after I burned myself on the card trying to figure out the cause of some BSODs), but more than all the other cards I've owned combined. I also very much disliked their drivers (required .NET, utterly craptastic UI, etc.), but whatever. I've had several people mention they've improved things over the years (and their drivers for my TV tuner do seem to be pretty good), so I'm feeling pretty optimistic. It's just that there's a significant population of us who've been burned (in some cases literally) by ATI cards in the past.

In my experience with both brands, neither uses adequate cooling on their cards. The first thing I do when I get a video card anymore is take off the fan/heatsink, put some decent thermal compound on the chip, and make any other mods that might be necessary. I have a 4850 (Sapphire) in my recent system, and the temps were barely acceptable at 59C idle / 72C load, so I bought a third-party cooler for it and it stays at 41C idle and 52C load.

Of course, I'm not expecting everyone to do this, as it voids the warranty, and I think it's stupid that these companies don't put a good cooler on to begin with.

My advice is to look for brands like Asus or MSI that have good coolers on them out of the box, they may be a bit more expensive than the reference models but it's worth it.

And make sure you have a decent power supply, or you will have problems.

As far as drivers go I've never had any major problems with either company.

The proprietary general programming environments are not out yet. I have looked at programming in CUDA, Stream, and OpenCL; OpenCL requires more work to program, as you are required to do more management as a programmer. So a project has to have a serious payoff from cross-GPU compatibility to justify the extra cost/time.

Another thing is that you will get faster results if you make two versions, one for Nvidia and one for ATI, since a large part of getting speed out of a GPU is tailoring to the device's architecture. When you aim for compatibility with OpenCL, you guarantee a lower theoretical speed versus the native programming environments (unless they somehow decide to redesign their architectures; currently only the interfaces to the GPUs through DirectX and OpenGL are similar, and how each processes beyond that is not).

On the comment by Ravedave that OpenCL will replace CUDA: that's kind of silly to suggest, because OpenCL is basically identical to the driver API version of CUDA. Also, as mentioned a few posts before, OpenCL, given its more general nature, will end up being more work for most programming situations. Like them or not, NVIDIA has marketed CUDA extremely aggressively and much more effectively than ATI, and their product is far more polished.

Originally posted by EdipisReks: the anandtech review is interesting in that it is the only major review of the 10 or so i looked at that doesn't show the 275 outperforming the 4890 in almost every metric.

Considering most of my daily usage is 2D, I'm not seeing the motive for an upgrade from an HD3650 -- it already does as much hardware video acceleration as any other card, and that was the primary reason it replaced a 7600GS.

What would appeal to me is using the process improvements to deliver better thermal performance. I want a card that stays under 50C load with a single-slot, fanless cooler. But it seems everyone just wants to make a faster high-end card, and even when they make a new low-end card of similar performance, it's crippled in some way instead.

I don't get the GPGPU appeal at all. Yet. The applications are too narrow, at least presently, and the hardware performance is too variable, at least presently. (I understand, if I want the most Folding@Home points, it isn't necessarily the same card to buy if I want the most efficient hardware video processing, or the fastest game frame rates).

quote:

Valis, the case is a Coolermaster CM690 (brilliant case) with three fans including one blowing directly onto the GPU (side panel) so I'm confident its not cooling, speedfan reported all other temps just fine. ... Being an ATI user yourself you will probably have read about the infamous fan speed tweak whereby with v1.0 bios and original drivers the GPU fans would not spin up until a ridiculous threshold was reached, users were reporting 70 degrees idle temps as not uncommon!!!

No, I didn't know about that particular tweak; I was merely considering an entry-level AMD/ATi card for my wife's machine. I will keep an eye out for thermal issues when I do go to purchase something for her. My main gripe with ATI has always been drivers, and I think that's one area where there's plenty of evidence, but for my wife the cost is right.

All 4 of my last Nvidia cards actually ran close to 80C in my machine (I have FB-DIMMs, which are also hot) even with more than adequate cooling; this is just how the cards are set to cool via the driver, for 'noise' reasons. I prefer to see temps closer to 50-60C when gaming and under 40C at idle, so I take matters into my own hands. I use EVGA's Precision to up the fan speed on my GTX 260/216 55nm (EVGA), and I upgraded to this from a 9800GTX largely for the thermal savings from the more aggressive 2D downclocking the GT200 Nvidia cards have. I'm less concerned about the effects on the card itself than I am about my overall system's longevity...

As for cards shipping with bad RAM and overheating, I've actually had 2 failures in the last 8 years (both within 10 days, so obviously manufacturing defects), and both were actually Nvidia, but it was the RAM that failed, and I don't consider that unusual given how RAM is lot-tested and binned.

Originally posted by wintermute000: I bought a 4850 due to its sheer price/performance ratio. It overheated and died on me within a month, leading to a month of arguing with the manufacturer about an RMA (i won eventually). When I got it back, they had installed a dual slot heatsink on it WITHOUT ME ASKING which is quite indicative that they KNEW they had an issue with temperatures / inadequate stock heatsink.

I bought two VisionTek 4850's last year in the first week they hit my local retailer's store shelves. They are still going strong--no hardware problems at all. The two I bought were built by ATi and simply re-badged with the VT brand logo.

ATi made it clear when the 4850's shipped that the gpus were rated to run at their stock shipping temperatures indefinitely without a problem. Most OEMs, when doing RMAs, grab whatever they happen to have in stock, either new or refurbished, to use as the replacement card. AFAIK, most 4850's being shipped today still utilize single-slot cooling, which would seem to indicate to me that you are guessing that temperature was your culprit. Both of mine have but single-slot cooling and they are still going strong.

It's at least possible that your card OEM sent you a newer so-called "overclocked" 4850, with a dual-slot heat sink, simply because he didn't have any more vanilla 4850's at hand. "Overclocked" in this case doesn't refer to you having done anything or changed any settings: it refers to the OEM shipping the product at a higher clock than stock, which means that it *always* boots at a clock some 25MHz to 50MHz faster than stock, usually.

quote:

Also the holdup over the RMA was due to a missing sticker that they insisted was essential, and guess what, the sticker had fallen off the card (geeze a sticker falls off something that goes up to 70 degrees plus? you'd never thunk) and got lost in the case behind the mobo. I eventually dug it out and then they relented on the RMA (and gave me that nice dual slot heatsink to boot).

Chances are the sticker wasn't even close to anything on the card running at 70C... Stickers in general have been known to fall off of anything after a while. That happens when the glue holding the sticker on breaks down. It would have been different if you'd related a story of your sticker igniting and burning up...

Your experience illustrates why I always buy my stuff locally whenever possible: if I have a problem there's no worry over RMA's because I go back to the purchase source locally and let them handle it.

quote:

Every single catalyst driver upgrade has been nothing but trouble, including several where to work without a BSOD upon login, you have to boot into safe mode, disable an ATI service that the installer activates (which you have no option of excluding), then it works. And yes turning off that service has zero impact on anything (so WTF was it for). Read the ATI forums or any graphics card enthusiast forum (overclock.net and the like) if you don't believe me.

I don't believe you, because I know better: it isn't true at all to state that "every single catalyst driver upgrade has been nothing but trouble," although I did have installation problems with CAT 8.9-9.2, which represents the last 4-5 months of driver updates. Under Vista x64, CAT 9.3 installed perfectly, right over CAT 9.2, and did not require a reboot. Also, I'd like to add that even though I had installation problems with the above-mentioned CAT versions, in each and every case I was able to resolve the problem and install the drivers correctly. As I say, all of the CAT versions prior to 8.9 installed without difficulty, and with 9.3 the problem has been resolved--at least for me.

BTW, the service you mention has an obvious purpose. It is what allows you to turn Crossfire on and off without rebooting in between. Beginning with CAT 8.9 or so, ATi changed something in the CATs which conflicted with the operation of the ATi driver service, and that's what caused the problem. In every case ATi was quick to issue a hotfix that resolved the problem, and as of CAT 9.3 it appears the problem has been laid to rest.

quote:

Finally Crossfire is a complete dog compare to SLI, at least in the getting it to work department, also in driver support (and upgrading to latest Catalyst drivers to get crossfire to work with game X.... you see the chicken and egg situation). Why do I have crossfire? because I had to wait a month for my RMA and I didn't even know whether I'd succeed, so I bought another 4850 and custom modded an accelero heatsink onto it, guess what, both the 4850s I have now with custom heatsinks on both of them now have zero issues aside from drivers. THEY KNEW THE 4850s HAD DODGY TEMPS FROM THE START AND DIDN'T DO A THING ABOUT IT.

Based on those experiences I am not touching ATI with a 20 foot pole for my next upgrade, dodgy Nvidia review shenanigans or not. At least when I install a nvidia card and its drivers, it works.

You sure you aren't employed by nVidia? Pardon me for pointing out that your statement above makes no sense... The only way for you to know anything of worth about SLI is to own and run SLI yourself, and since you profess to know all about SLI, and profess to know how much better nVidia SLI is than ATi's Crossfire, the following must be true:

(1) You own sets of both SLI and Crossfire hardware

(2) Your experience with your ATi Crossfire hardware has been horrible, while your experience with nVidia hardware has been nearly euphoric.

(3) Even so, you insist on running your Crossfire hardware anyway.

I think you'll agree that what you've said simply makes no sense. Possibly, for you, it could be that running your horrible Crossfire hardware, coupled with the experience of writing about how much your Crossfire hardware sucks, is simply more enjoyable for you than running your SLI hardware in 3D games.

If not, then perhaps it may be true that you don't in fact own any SLI hardware and that your entire SLI "experience" stems from WHAT YOU'VE READ ON THE INTERNET?

Suffice it to say that during the last 9-10 months running my 4850 Crossfire hardware, and playing lots of 3d games, that I've never been given any reason to (a) think it's horrible or (b) imagine nVidia SLI is so much better. In fact, my 4850 Crossfire experience to date has been so good that I don't pine for anything else, whether it's from ATi or nVidia.

I've yet to see anything using CUDA that is truly worthwhile for regular users. The two or three GPU encoders using CUDA produce videos with inferior quality compared to x264, and without much in the way of speed benefits.

AMD demoed Havok through OpenCL, and Nvidia has all but said they will support OpenCL for PhysX if things go that way. AMD also demoed OpenCL for workstation applications using their CPUs at SIGGRAPH.

CUDA seems to me like a lot of marketing hot air, with users waiting for a killer app that has not materialized.

Apple will wage a PR offensive with OpenCL and their next OS, and Microsoft will be pushing DirectX Compute Shaders in DX11, so CUDA will quietly fade away, just like Nvidia's own in-house high-level shader language (Cg) that came before DirectX 9's HLSL.

So I would not base any GPU buying decisions on whether it supports CUDA or not.

Geez, WaltC, sorry to get you on a bad hair day, or whatever nerve it is I have trod on.

I've never run SLI, but I don't recall making any comments about how great SLI was. I just know that ATI drivers are a pain, that the temps on my stock 4850 were insane, that I have always had ATI driver issues, and that whenever I've had Nvidia cards I've had far fewer issues. Of course SLI has issues, but everybody I've talked to who's run both has said it's less painful (note I am talking about PAIN FACTOR in messing around to get it to work, not performance once you have it working). Anecdotal, I know, but nevertheless.

The Crossfire is an 'accident'. As I stated, I didn't know whether I'd win with my RMA request, and I was hedging my bets: if I won, then I could try Crossfire, whereas if I'd bought an Nvidia card and the 4850 then came back, I'd have something useless on my hands. Also, maybe it was a bad one-off situation, etc.

YMMV, and you appear to have had a pain-free ATI experience. Good for you. I did state categorically that I AM HAPPY WITH THE PERFORMANCE AND VALUE. But geez, it took a lot of futzing around.

Don't even get me started on ATI drivers in Linux. They can't even get video to play properly with a compositing desktop (that's 'Aero' for you Wintel-only people). They've been working on the problem for over a year (since it became widely known in the *nix community) and they STILL haven't fixed it.

"...so I'll conclude both vendors are 100% reliable -- it makes exactly as much sense."

dave simmons, you see the issue with what you're saying? You make no logical sense. Semantics aside (vendors?), humans are imperfect, and humans designed... XD

What's the correlation between drives failing and not making a backup? What? Not related. Deathstar ring a bell?

I too won't go with ATI. My last good, great ATI card was a 9800; since then it's been Nvidia, because of the driver support. The Cats fucking suck, IMO. Hardware is so close in performance that, for me, it's about drivers and stability.

I purchased an HD 4870 just a few months ago. I'm pretty happy with it. Near as I could tell when pricing my GPU, these titans were very evenly matched, with heated debate on both sides as to which is better.

In my mind, nVidia dealt a huge blow lately with the advent of the GeForce 3D Vision (http://www.nvidia.com/object/GeForce_3D_Vision_Main.html) package. The system requires monitors that aren't readily (nor cheaply) available as yet; however, this technology ONLY works with high-end nVidia GPUs.

I can only hope that AMD will release an iteration that will work with my hardware eventually. If not, I'm afraid my pursuit of this will prohibit me from buying Radeon GPU's in the future.