Why not add more real-life computing tests like iray, which runs on both CUDA and OpenCL? Synthetic tests are really meaningless, as they depend more on the particular instructions used to do... ermm... nothing?

I have one question: why didn't you use two R9 290Xs with water cooling or two GTX 780 Tis with water cooling? I hate this marketing mumbo jumbo. If I wanted to pay this kind of money I would choose two of the cards above with water cooling, and with some OC work they would leave this card in the dust. For the same money I could also buy two R9 290s or two GTX 780s.

Emphasis on "moderately good". Years of operation will require maintenance on any LC solution. Moderately priced solutions such as Thermaltake's don't hold up as well: the liquid gets discolored, the tubing cooks and becomes brittle, pumps fail, etc. Liquid cooling isn't something to do on the cheap.

Hello, can someone please advise? I have a 4930K OC'd to 4.3 GHz and 16 GB of RAM, also OC'd (slightly), and I am planning on getting the R9 295X2. Will the power supply I have be sufficient for this card? My PSU is a Cooler Master V850 (850W). Thanks
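For what it's worth, here's a rough power-budget sketch for a system like the one described. Every figure below is an assumption or a typical value, not a measurement, so treat it as a ballpark only:

```python
# Rough power budget for an i7-4930K + R9 295X2 build.
# All figures are assumptions/typical values, not measurements.
budget_w = {
    "R9 295X2 (board power)": 500,  # AMD's official board power figure
    "i7-4930K @ 4.3 GHz":     180,  # 130W TDP plus overclock headroom (assumed)
    "Motherboard/RAM/drives":  80,  # assumed
    "Fans/pumps/USB":          30,  # assumed
}

psu_w = 850  # Cooler Master V850
total = sum(budget_w.values())
print(f"Estimated load: {total} W on an {psu_w} W PSU "
      f"({total / psu_w:.0%} of capacity)")
```

By this estimate an 850W unit is workable at stock, but it leaves little headroom once both GPUs and an overclocked CPU are fully loaded, so a higher-capacity unit would be the more comfortable pick.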

A pump, a reservoir, two waterblocks, a fan/pump controller, and a plethora of connectors and tubing will probably end up costing the difference between an R9 295X2 and two water-cooled R9 290Xs. Remember, with the R9 295X2 you get a free closed loop that requires no maintenance. After successfully running dual HD 6970s in a Koolance loop for 3 years, I can say a free closed loop is a great selling point for me.

It'll be interesting when a card like this has not 2x GPU + 2x 4GB but 2x GPU + 8GB. Eventually I guess we'll see the results of the HSA push trickle down (up?) to performance parts like this. Hey, maybe we'll see 4x GPU + 16GB...

No, they don't have the same limitations. Since you don't have to populate two PCIe slots, you can install the 295X2 in a microATX or ITX motherboard and case, or just use it in a regular ATX board that doesn't support x8/x8. For some, it could also matter not to have to populate a second PCIe slot and use additional PSU cables, even if their motherboard and PSU fully supported CrossFire 290Xs. The ability to use only one PCIe slot has always been the primary selling point of any dual-GPU card, ever.

If you want to pay $1500 for a video card, $2000+ for a 4K monitor, and $200+ for a power supply, then you're the kind of person who pays $1500 for a Core i7-4960X with a ROG X79 board. Why would you worry about x8/x8 motherboard lanes? Don't forget 32 GB of 2400+ DDR3 and two extra-fast SSDs. Friend, you don't need a case, you need a cabinet.

$2k+ for a 4K screen? Where are you wasting your money? In Norway, you can get a 4K screen for about 5,000 NOK, or roughly $850 USD, including tax! Also, why would you need a $1500 CPU when the 4930K is only 200 MHz slower, for half the price?

Also, WHY would you want 32GB of 2400MHz RAM? There is next to no improvement over 1600MHz!

As far as SSDs go, a single Samsung 250/500GB should be plenty; you've got 32GB of RAM to use as a buffer!

And if you want a "tight" system with insane performance, the 295X2 is the best choice ATM. Double the 290X performance, "half" the size.

Another difference is the way this card handles heat compared to any 290X CF setup apart from custom water cooling. The CLLC combines the benefits of reference GPUs - the ability to exhaust hot air externally rather than into the case - with the benefits of third party cooling - the ability to keep temperatures and noise levels lower than those of reference blower cards. A 290X crossfire setup using reference cooling is not even worth considering for anyone who cares about noise output, while third party 290X crossfire is restricted to cases with enough cooling capacity to handle the heat.

You are right, but keep in mind one big limitation with normal CrossFire/SLI is the space taken up by two big dual-slot GPUs; with this it is only one slot. Other than that, though, you might as well get two 290Xs.

Interesting. [H] seems to have done some pretty thorough testing, and the AMD card blows past 780 Ti SLI in every single case. Of course, [H] is testing at 4K resolutions/3-way Eyefinity exclusively--but that's where anyone who shells out this kind of money is going to be. 1080p? Don't make me laugh... ;)

Not seeing the bias--AnandTech is usually pretty fair. I think you have overlooked the fact that AMD is a sponsor, not NVIDIA. If anything, "slating" the Titan Z would be more consistent with your theory of "selling out."

http://www.anandtech.com/bench/product/1187?vs=107...
Two 780 Ti cards are cheaper than the 295X2, that's a fact.
Two 780 Ti cards consume much less power than the 295X2, that's a fact.
Two 780 Ti cards have better frame latency than the 295X2, that's a fact.
Two 780 Ti cards have nearly identical performance to the 295X2, that's a fact.

If someone was trying to decide between them, I'd recommend dual 780ti cards to save money and get similar performance. However, if that person only had a dual-slot available, it would be the 295x2 hands-down.

The Titan Z isn't really any competition here - the 790 (790ti?) will be the 295x2's real competition. The real question is will NVIDIA price it less than or more than the 295x2?

I don't think the target market for this stuff (295X2 or Titan Z) is people with a single GPU slot. As Ryan briefly mentioned, most people who are relatively poor (myself included) will go with 780 Ti x2 or 290X x2. These cards are aimed at quads.

AMD has priced it appropriately: roughly equal performance potential for $3k (dual 295X2) vs. $6k (dual Titan Z). Unfortunately, 4GB may not be enough for quads...

I've ventured into multi-GPU setups in the past, and I find they rely too much on driver updates (see how poorly the 7990 runs nowadays, and AMD will be concentrating their resources on the 295X2). Never again.

With respect, any decision on what to buy should be made based on what your application is. Paper facts are worthless when they don't hold up to (your version of) real-world tasks. Personally, I've been searching for a good single card to make up for Titanfall's flaws with CF/SLI. Point is, be careful with your recommendations if they're based only on facts. ;)

Sidenote: I managed to pick up a used 290x for MSRP with the intention of adding another one once CF is fixed with Titanfall. That price:performance, which can be had today, skews the results of this round-up quite a bit IMO.

By drawing that much power, won't it be a fire hazard? I've read multiple posts on bitcoin/scryptcoin mining forums about motherboards that caught fire due to using too many GPUs without a powered riser to lower the amount of power delivered through the PCIe slot.

Would AnandTech be willing to test AMD's claim by running the GPU at full load for a longer period of time in a fire-controlled environment?

Well, no, not exactly. One thing is not being PCIe compliant, and that's a thing I can understand. Another thing is going beyond the connectors' electrical power specifications. If they had put three connectors on the card I would have no problem with it. But as it is, they are pushing past component specifications, not just guidelines on maximum size and power draw.

If you look at the datasheet for the power connector (I'm guessing on the part number but the Molex part linked below should at least be similar enough), each pin is rated for 23 A and the housing can support a full load on each pin. Even if only 3 pairs are passing current, the connector can deliver over 800W at 12V.

The limiting factor for how much power can be drawn from that connector is going to be the copper width and thickness on the PCB. If AMD designed the board to carry ~20 A off each connector (which they presumably have), it won't cause a problem.
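To put numbers on that, here's a quick sketch of the connector ceiling versus the card's actual per-connector draw. The 23 A pin rating is the figure quoted above, not a value I've verified against a datasheet:

```python
# Back-of-the-envelope check of the connector figures discussed above.
# Assumptions (quoted from the comment, not a verified datasheet):
#   - each pin is rated for 23 A
#   - an 8-pin PCIe connector passes current on 3 +12V/ground pairs

PIN_RATING_A = 23     # per-pin current rating (assumed)
CURRENT_PAIRS = 3     # +12V pairs actually carrying current
RAIL_V = 12.0         # nominal rail voltage

ceiling_w = PIN_RATING_A * CURRENT_PAIRS * RAIL_V
print(f"Connector ceiling: {ceiling_w:.0f} W")  # 828 W, i.e. "over 800W"

# Compare with what the 295X2 asks of each of its two connectors
# if ~425 W comes through them in total:
per_connector_w = 425 / 2
print(f"Per-connector draw: {per_connector_w:.1f} W "
      f"({per_connector_w / RAIL_V:.1f} A split across {CURRENT_PAIRS} pairs)")
```

So on the connector side there is plenty of margin; as the comment says, the PCB copper is the more likely limiter.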

Most of the power will be coming from the PCIe power connectors, not the slot itself. If you have 5/6/7 cards in a single system, then yes, you might start to see issues without the appropriate motherboard power connectors.

You'll want to get the most efficient PSU you can get your mitts on, though.

Also, I would seriously consider a system that is kicking out 600 Watts of heat to be something you wouldn't want in the same room as you. Your AC will work overtime, or you'll be sweating your ass off.

A GPU for Siberia! But then, that's not really a downside as such, just a side effect of having a ridiculous amount of power pushing at the edges of this process node.

The card is watercooled, not aircooled like NVIDIA chose to do with their 500W Titan Z. It should run very quiet, and should not affect your internal temps much, as long as you mount the radiator exhausting out of your case.

Great article. I wish I had a spare grand and a half to replace my 7970 CF setup. BTW, it looks like you have an extra GPU load temperature chart on page 17 where the load noise chart should be.

I may have missed it in the article, but I don't think you mentioned whether or not it would be possible to add an additional fan to that Asetek cooler. This would surely bring down stock temperatures (albeit while increasing the noise) if one were thinking about overclocking further.

Yes, it's possible. You would need to come up with a matching fan and the screws to mount it, but there's nothing from a hardware perspective keeping you from mounting a second fan for push-pull. I don't know if it's easily visible in our pictures, but the fan power connector is exposed mid-way along the cable run, so you can split it there to get a second fan power header.

It's out of spec, as also discussed in the article, but very unlikely to be dangerous. 215W at 12V over 3 wires will draw about 6A per wire, still significantly below what the ATX standard considers safe: "Under normal or overload conditions, no output shall continuously provide more than 240 VA".
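The arithmetic behind those figures, as a quick sanity check (the 215W-per-connector figure is the one quoted above):

```python
# Per-wire current for one 8-pin PCIe connector, using the figures above.
connector_w = 215   # power drawn through one connector (assumed, from the comment)
rail_v = 12.0       # nominal +12V rail
wires = 3           # +12V conductors in an 8-pin PCIe cable

amps_per_wire = connector_w / rail_v / wires
print(f"{amps_per_wire:.1f} A per wire")  # ~6.0 A

# ATX guideline quoted above: no output continuously above 240 VA.
# On a 12 V rail that works out to 20 A - well above 6 A per wire.
atx_limit_a = 240 / rail_v
assert amps_per_wire < atx_limit_a
```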

The interesting question will be whether PSUs are up to the task of filtering the rapid switching between almost 0W and up to 425W on the rail which feeds the two connectors. But most modern high-power PSUs feed even more connectors from a single rail, so they should have little problem. And whoever mates a $1500 GPU with a cheap PSU has nobody to blame but himself.

You may want to look at those frames-per-second graphs more closely. Where provided, those graphs do not show good frame pacing timings, and where not provided, look to other sites' reviews. But then, I'm sure whoever spends $1500 on a gaming card has every reason to convince himself that the card is good nonetheless.

Sorry, but what kind of graph are you looking at? First, not all of the games reviewed here have a frame pacing graph. Second, look at the BF4, Crysis 3, and Thief graphs, for example. Where do you see this card being better than 780 Ti SLI? The FPS graphs are a mess for this 295X2. Ideally those graphs should be a thin line, not an area where the frames per second continuously oscillate.

But, well, as I said, once you spend (or want to spend) $1500 you have to convince yourself there are no problems. The same was true for 7990 buyers who glorified that card; see now what a crappy card it is. And it was also true for all those who denied that AMD CrossFire configurations had problems relative to NVIDIA's, before AMD tried to correct the problem with new drivers (which now somewhat work for DX11 games but not for DX9 and DX10 ones, and that's maybe why there are no frame pacing graphs for the older games...).

That would require either huge amounts of binning that drives the price right up (like the Titan Z), and/or significant reductions in clock speed to accommodate reduced voltage (almost certainly like the Titan Z), resulting in a card that's both overpriced and underpowered (like...). Of course, it's not really fair to compare a card that's with reviewers now and on the shelves in 2 weeks with a card that has only ever been seen as a mockup on one of NVIDIA's slides =).

Well, Arctic's 6990 cooler wasn't far off. The Arctic Mono is good for 300W, and it should be possible to fit two such heatsinks on one card. So it's possible. The resulting card would be absolutely huge, though, and wouldn't be nearly as popular with gaming PC boutiques (i.e. the target market).

Huh, looking at that board and the layout of the cooling setup, you could swap in two independent closed-loop coolers pretty easily and try to overclock it if you want. And since you're rich if you buy this, that's totally viable for any owner.

Word of warning: do not use daisy-chained PCIe power connectors (i.e. one connection to the power supply and two 8-pins to the graphics card). If AMD wasn't going over the per-connector power spec it wouldn't be an issue, but they are, which means you can melt the connector at the power supply end. Those daisy-chained PCIe connectors are meant for 300W max, not 425W.

We've been hearing about this from a bunch of partners and I believe end users should be warned.

A single cable is beyond spec for the connector. We've been hearing about connectors actually melting. "Crappy" isn't really relevant here; this is the *only* card on the market that causes these kinds of problems.

This is all very nice, but unless case space is at a premium, I fail to see the advantage of this card over two 290X cards with good coolers. The PowerColor PCS+ version of the 290X runs at 1050 MHz, is much quieter than the reference boards (40-42 dBA under load at 75cm), and is available for under $600. Is having a single-slot solution worth $300 extra? Not unless you really want to have everything in a small form factor case.

After some additional research on the web, it looks like the difference in temperatures between the two GPUs is only about 2 degrees under load, so I'm pleasantly surprised with how well the 120mm radiator handles the cooling.

The temperature readings come from MSI Afterburner, which is capable of reading the temperatures via AMD's driver API. And unless otherwise noted, the temperature is always the hottest temperature.

Did you misread the article? They are simply comparing the frame pacing of the old stuff to the new stuff. Unfortunately, most people are too stupid to properly comprehend English, which is pretty damn sad if you ask me. Thus, a lot of people either mistakenly think that this card has bad frame pacing, or that this review had anything to do with the frame pacing updates for GCN 1.0. NEITHER of those things is the case!

These two different things really shouldn't have been in the same article. It's confusing, unfocused, and comes off as taking cheap shots at AMD over an old product. Let's be honest: there weren't many 7990s sold in the first place, and anyone who bought one for gaming and was disappointed with it could have resold it during the mining craze and at least broken even, if not actually turned a profit. A review of a new product isn't the best place to say "old product X is still not perfect".

My question is: is there going to be a GTX 790, or is that plan gone? I would like to compare similar things and then decide what I'm replacing my 690 with. I've waited long enough; it's time for an upgrade.

I like what AMD has done here; paying more attention to high-end packaging is good to see. Hopefully this trickles down to the next generation on 20nm. Considering how far AMD has been behind NVIDIA in terms of power consumption, seeing this card average under 500W at full load is pretty good. That's under 250W per chip, pretty close to what a 780 Ti does. You pay a slight premium for the better packaging and CLC, and it's still better than what NVIDIA wants for the Z.

Shame there is still no HDMI 2.0 support on any consumer GPUs, including this one. Given that this is around $250 more than two custom-cooled 290Xs with similar noise profiles, it doesn't make much sense (at least to me) in standard or full-size cases, but it would be an interesting choice for an HTPC. The card is overkill for 1080p, but with new 4K TVs coming out that support HDMI 2.0, this would have made sense for that. As it is, while I can appreciate the design and performance, I don't see the value prop vs. a couple of individual 290Xs in CF.

What is the static pressure of the 120mm fan? Is it easily replaceable? I'm wondering if it would make sense to replace it with an SP120 HP edition, a Noctua, or any other high static pressure/low noise fan (even if it would require modding it in by cutting wires).

The review was excellent. Very informative and interesting. Took me a whole hour to read. Haha. I just wish the card was short enough to fit in the Obsidian 250D. Now for the Ares III!

Huh? No giveaway? Why do I read this stuff? OK, seriously, I love these reviews, but the thing I never understand is when they say the Titan is better, yet the charts seem to say the opposite, at least for compute.

The sad thing about all this is that the lowest resolution for these cards is considered to be 2560x1440 (for those who understand). A bigger disappointment yet: after so many years of high expectations, the GPU still stands as a huge piece of silicon inside the PC that's firmly chained by the IT industry to serve gamers only. Whatever the reason for the lack of such consumer applications, this is a crime, mildly put.

People should be warned: the performance of this card is nowhere close to what the benchmarks or limited tests suggest. Even on the newest ASRock motherboard, the PCIe 3.0 lanes bottleneck this by about 40%. If you're just going to sequentially transform the same data once it's on the card, then yes, you get this performance, which is impressive for the base cost, though entirely lousy in FLOPS/watt. But if you're going to be moving 8GB of data between the CPU and GPU continuously, this card performs only marginally better than the 290. The buses and bridge chips are going to need to get much faster for these cards to be really useful for anything outside purely graphical applications. It's pretty much a waste for GPGPU computing.
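For a sense of scale, here's a rough sketch of the PCIe 3.0 x16 transfer ceiling being described. This is the idealized link rate; real transfers see additional protocol and chipset overhead, so actual throughput is lower:

```python
# Idealized PCIe 3.0 x16 bandwidth and what it means for an 8 GB working set.
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding.
lanes = 16
transfers_per_s = 8e9          # 8 GT/s per lane
encoding = 128 / 130           # 128b/130b line-code efficiency

bytes_per_s = lanes * transfers_per_s * encoding / 8  # ~15.75 GB/s one way

data_bytes = 8e9               # the 8 GB of data mentioned above
transfer_s = data_bytes / bytes_per_s
print(f"~{bytes_per_s / 1e9:.2f} GB/s per direction; "
      f"moving 8 GB takes ~{transfer_s:.2f} s each way")
```

Compare that against the card's compute throughput and it's clear why workloads that continuously shuttle data over the bus see a fraction of the headline numbers.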

I'm trying to imagine what kind of person would have four of these, and why. Maybe Eyefinity with 4K? Even then, your CPU would bottleneck way before that; you would need some kind of motherboard with dual CPU sockets and a game that can take advantage of it.