
107 Comments

Didn't AMD originally supply an infographic (about a year ago, just prior to the release of their 7000 series cards) showing that the 7790 was to have been their lowest-priced 7000 series card that still provided a 256-bit memory interface? Any explanation for the downgrade now to a 128-bit interface?

AMD lies all the time and then fails to deliver, so the fanboys give them a thousand breaks, another chance, some more time: "in the future what AMD I bought today will be great, it's 'futureproof'" (even though it doesn't work correctly right now)... LOL. Goodbye AMD. So that beta driver, uhh, perma-beta driver, must have crashed out in Civ5 a lot.

I'm sorry, but if you insist on invalid statements you shouldn't be on this website, because smarter people will correct you on YOUR fanboyness. First of all, AMD beats NVIDIA at every price point (check Linus Tech Tips on his YouTube video card channel) and they have perfect drivers. I've been running the same AMD driver for 6 months with no crashes. In fact, I tried a new Crysis 3 driver and it didn't crash then either, and that was a beta driver. And my Radeon 7770 destroyed the NVIDIA GTX 650, which is at the same price point.

A roadmap is nothing but a projection of what is PLANNED for the future, not some kind of "promise" or "guarantee". Calling AMD people liars because the released product didn't match the projection is childish at best.

And before you slap the "fanboy" label on me, I prefer Nvidia generally speaking (but I'm not going to cut off my proverbial nose to spite my face in order to be brand loyal; if AMD has the current best solution for my purposes, I'm going to buy AMD).

They were going to use a cut-down Pitcairn (the 7870/7850 GPU), trimming the core to use excess chips that couldn't make the cut as 7870/7850s. They might have gone with 256-bit to simplify the product for AIB partners, who could just re-use their HD 7850 designs rather than needing a new design for a smaller-run product.

The 7790 as released is a new GPU designed to be cheaper to produce than Pitcairn (as it's smaller), and the fact that the memory can run at 6GHz is probably due in part to it being a new GPU rather than a cut-down Pitcairn.

I don't see a launch date in the whole article; it's NOT available. I guess that's another mystery freebie for AMD's products here. Didn't see the port config either, so what cabling do we have to buy to run 3 monitors when the Asus 650 Ti runs 4 out of the box, 3 with DVI and VGA only? Not impressed with the hugely AMD-biased game lineup either, so expect your mileage to be less than shown. No overclock talk really either, so it must blow at that. Other sites are reporting AMD's beta driver, so maybe they won't even have a release driver for this card when they release it, as AMD is often known to do, for like a year sometimes, or forever in terms of any sort of quality. LOL. Civ5 has only 1 bench rez; it must have crashed at the others. Crossfire? The article didn't say. Multi-monitor? No talk of that anymore since NVIDIA SPANKS AMD to death on that now. Hopefully you've fooled the internet tards again, because AMD is bankrupt, for good reason.

Did you even read the article?
- Launch date is mentioned on page 1: one and a half weeks from now.
- Ports are clearly visible and standard: 2x DVI + HDMI + DisplayPort.
- The lineup is consistent with every other review on AnandTech.
- There's an entire page on the new PowerTune and how it impacts overclocking; a single-sample OC investigation is irrelevant and best left for a dedicated vendor comparison.
- ... really?
Who's the real tard here?

No, that's what you do all the time. But thanks for the compliment, since you know I always read the articles completely, yet you think I didn't this time. WRONG. I've made a lot of money this past short week without a lot of rest, so I'll give you and dipsy doodle a point on the svengali launch date the article writer for the first time EVER declares "solid" before it even occurs. Oh wait, he always does that when it's AMD, but if it's NVIDIA he says we'll have to wait and see, as they are probably lying... ROFL. Who cares, the card sucks, AMD is dying, the drivers blow beta chunks, and AMD is way late to the party.

You must've missed the part about them simply not having as much time to test the 7790 as they'd have liked because they were at GTC. Other sites apologised for their lack of time as well.

There's a whole load of other reviews out there; only a few have overclocking results (Guru3D notably), and as far as I can see only AT, of the major sites, has both the 7790 and a factory-overclocked 7790 in the same test. Guru3D is alone in providing a CrossFire test, and though two 7790s perform about the same as a lone 670, there are no power readings.

There's a good number of different titles being benchmarked, so it's not strictly a list of AMD-says-test-these-titles; plus Tomb Raider, a Gaming Evolved title, performs better on NVIDIA hardware. There are a few bugs with the beta drivers used for the 7790 in these reviews, most notably with latencies (a bug that has already been fixed in the next Catalyst release, so yes, we will see new drivers soon); however, the latency values are so far ahead on average of what we used to see from AMD that this can hardly be classed as an issue.

Testing has generally centred on 1920x1080 because that's really the limit where cards like this are supposed to be performing. There's little point in 1024x768, and an equal measure of futility in trying for 5760x1080 or whatever; the former is ridiculously low-res and the latter is ridiculously ambitious even for a 7970 or 670/680.

Sapphire's blurb about multi-monitor usage via the TweakTown website:

"Working or gaming with multiple monitors is becoming increasingly popular, and the SAPPHIRE HD 7700 series supports this with AMD Eyefinity, now in its second generation. The SAPPHIRE HD 7790 OC Edition has two DVI ports (DVI-I and DVI-D), HDMI and a single DisplayPort output, supporting up to four monitors.

The SAPPHIRE HD 7790 OC Edition model supports the FleX feature, pioneered by SAPPHIRE, that allows three digital displays to be connected to the DVI and HDMI outputs and used in AMD Eyefinity mode without the need for an external active adapter. All four outputs can be used in AMD Eyefinity mode, but the fourth display must be a DisplayPort monitor or connected with an active adapter."

I've heard AMD's launch date for this is today; Guru3D has the following to say:

"But I need to add this little note alright; AMD's Never Settle Reloaded promotion continues. At participating retailers beginning 02 April, 2013, gamers will be able to receive a free copy of BioShock Infinite with a purchase of their new AMD Radeon HD 7790 graphics card. See, now that's great value. The Radeon HD 7790 series cards will be available in stores starting April 2, 2013"

I've run 2 FleX edition cards, you idiot. Have you? Run any MDT NVIDIA Galaxy cards, dummy? How about all-DVI outs so you don't have to have $100s of dollars of soon-to-die AMD dangly cables? Heck, a friend just got ANOTHER 6870; he usually runs 4 monitors, but that could only run 2 OOB, and he has loads of cables, so he had to buy another cable just to run a 3rd monitor. It took 2 weeks to arrive... ROFL. AMD SUCKS with Eyefinity / multiple monitors and NVIDIA DOES NOT. NVIDIA keeps the end user in mind and makes it INEXPENSIVE to set up 3 or 4 monitors!

AMD makes it cost-prohibitive. AMD SUCKS, they lost again, totally.

What planet do you come from? This card will run 4 monitors; Eyefinity has done this very well, forever, with discrete audio per monitor. NVIDIA is really getting handed its ass by AMD in this category. This card will spank its NVIDIA competition in Civ5, since Civ uses OpenCL, and NVIDIA sucks at OpenCL (and their current cards even suck at CUDA). Crossfire: there's a CrossFire port at the top, genius. It will obviously CrossFire. Too bad NVIDIA's 2D quality and video quality is such utter shit. I might have actually used that GTX 660 I bought instead of sending it back for a 7870.

You obviously don't have a clear understanding of GPU tech, so just stop blabbing your stupidity; even most NVIDIA-biased people can admit it. Check Linus Tech Tips, check your games: they all work much better on AMD with multi-monitor.

And no overclock talk? Maybe because AMD doesn't approve of people tampering with GPUs... and because they want it to seem so good that you don't need an overclock... And the specific hardware partners can make different port configs, so why would you say that? And maybe comparing the ASUS 650 Ti to this GPU is invalid because you didn't specify who made it, so the port config advantage is completely irrelevant.

AMD is not bankrupt because of their GPU business, and their CPU business isn't bad. I don't think that getting into the GPU and CPU of the top 3 consoles (PS4, Xbox, Wii U) is so bad either. And why would game biases not be true if AMD's drivers and games play better on AMD-based systems, e.g. Crysis 3? And saying that the Civ 5 benches crashed is completely stupid, because a good website like AnandTech doesn't normally disregard such things. And AMD didn't pay them off if they are bankrupt, right? Yes, it can CrossFire, because there are CrossFire connectors on the top; maybe they assumed things would be implied for the general crowd.

I think it's also worth mentioning that the 7850 is quite an excellent overclocker. At stock I think it's definitely not worth the extra $30, but once overclocking is taken into account, if you can afford the $30 you are crazy not to spend it (assuming you are comfortable with overclocking, of course).

Yeah, I'm curious what the pricing will look like on these a few weeks after introduction. I picked up a 7850 (2GB MSI Twin Frozr) for $170 AR a few weeks ago to put in a HTPC, and I've seen it at that price again already. It will be interesting to see if the regular sales on 7850s decrease once the 7790 is out. Kudos to AMD for offering BioShock Infinite with this.

Given we're talking about gaming cards here, I think it's worthwhile to add that only the 7800s and 7900s come with AMD's Never Settle game promotion. So, if you're interested in Tomb Raider and/or BioShock Infinite, the 7850 may have significantly more value to you. If you're not interested in them, people have been selling the coupons on eBay for about $50-60 each.

There's a small paragraph in the article explaining that this card _is_ part of the Never Settle Reloaded program. It's only getting BioShock Infinite since it sits at a lower price point, but that's still a nice addition. The bundles are a big part of the reason I'm curious to see how the pricing shakes out. I sold a TR/BS bundle and kept ~$50 in my pocket after fees, so I basically got a very nice 2GB 7850 for $120. You could obviously sell the BioShock code you'd get with a 7790, but if the prices for that card stay at MSRP for too long, they'll have some stiff competition from 7850s on sale. Unless of course the 7850 sales dry up, since it doesn't have to cover such a large swath of AMD's lineup now, price-wise.

Your chart shows Radeon HD 6870 FP64 performance as N/A. I think it's 1/20 of FP32 performance, but I'm not sure of that. It definitely can do FP64, as otherwise it wouldn't be able to claim OpenGL 4 compliance.

Yeah, I figure ~85W TBP / 105W TDP, because that would be smack between the 7770 and 7850 as well as leaving 20% headroom (which also allows another product to slot its TBP between there and the 7850's max TDP, with its max TDP above that and within 150W... i.e. ~120-125W/150W). IIRC, 80W is the PowerTune max (TDP) of the 7770 and 130W for the 7850; 85W is the stock operation (TBP) of the 7790.

I really, really dislike how convoluted this power game has become...can you tell?!

First it was max power. Then NVIDIA started stating typical power (so products were within PCIe spec) while AMD still quoted max, which made AMD look bad. Then we get this 'awesome' product segmentation, with the 7000 series having TBPs and max PowerTune TDPs to separate them, while NVIDIA quotes TBP and hides the fact that TDP limits for their products exist unless you deduce them from the percentage you can raise the boost power.

AAAAaaaarrrrrghhhhh. I miss when the product you had could do what you wanted it to, i.e. before software voltage control and multiple states; for products like this, it gives the user less control and the companies a ton of room to create segmentation. Low-end stock products may have been less than stellar back in the day, but with determination you could get something out of them without some marketing stating it should fit niche x, so give it max TDP y so it doesn't interfere with the market of product z.
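The midpoint guess in the comment above can be checked quickly. A minimal sketch, using the commenter's own assumed figures (80W/130W PowerTune TDPs and an 85W TBP are their recollection, not official AMD specs):

```python
# Sketch of the TBP/TDP guess above; all wattages are the commenter's
# assumed figures, not official AMD specifications.
tdp_7770 = 80    # assumed PowerTune max (TDP) for the HD 7770
tdp_7850 = 130   # assumed PowerTune max (TDP) for the HD 7850
tbp_7790 = 85    # assumed typical board power (TBP) for the HD 7790

midpoint = (tdp_7770 + tdp_7850) / 2          # halfway between the two TDPs
headroom = (midpoint - tbp_7790) / tbp_7790   # PowerTune headroom over TBP

print(midpoint)                  # 105.0
print(round(headroom * 100))     # ~24, i.e. roughly the "20% headroom" quoted
```

So a 105W PowerTune ceiling over an 85W TBP gives slightly more than the 20% headroom the commenter cites, but the numbers are in the right neighbourhood.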

Maybe so you couldn't blow the crap out of it then return it for another one, then another one, as "you saved money" and caused everyone else to pay 25% more since you overclock freaks would blow them up, then LIE and get the freebie replacement, over and over again.

Maybe they got sick of dealing with scam artist liars... maybe they aren't evil but the end user IS.

Initially I presumed this to be some "Optimus"-esque dynamic context-switching power-saving routine. However, the patent explicitly states, "This architecture lets a user run one or more GPUs in parallel, but only for the purpose of increasing performance, not to reduce power consumption." Which struck me as some kind of expansion on the nebulous "Hybrid CrossFire" tech that AMD has been playing with since they birthed the 3000 series / 780G IGP.

Based off of AMD's previous endeavors in this area on the PC side, I would be skeptical of the benefits/merits of pairing the comparatively anemic iGPUs of Kabini with a presumably Bonaire-derived GPU. As an aside: since SLI/CFX work by issuing frames to the next GPU available, if one GPU is substantially faster than the other(s), frames get finished out of order and the IGP/slower GPU's tardy frames simply get dropped, which may make the final rendered video stuttery/choppy.

Pairing an IGP with a disproportionately powerful discrete GPU simply does not work for realtime rendering.

It is certainly possible that, with the static nature of the console, and perhaps especially the unified nature of the GDDR5 memory pool/bank, performance gains could be had.

Oh well... my last point of arithmetic was simply that a fully enabled 4-core Kabini would, I suspect, have a 128-shader iGPU. Factor in the much-ballyhooed 8-core CPU in the PS4 and we would have two Kabinis (128+128=256) plus a Bonaire-derived 896-SP GPU, all on some kind of custom MCM-style packaging, a "semi-custom APU" (rumor had it that the majority of Sony's R&D contributions were in the stacking/packaging dept.).

Sony's previous two consoles (PS2 and PS3) have traditionally favored high-frequency/high-bandwidth proprietary interconnects between components (see Cell's EIB), so this is likely where the "secret sauce" Sony R&D came in, thus facilitating the 176GB/s. AMD was quoted (can't find the link) as saying Sony engineering would be excluded if/when they release a PC variant of said APU.

Very, very interesting indeed. It tallies well with the numbers. There was me thinking they had bolted Pitcairn onto the side of their CPUs, but this combo might make more sense (and yet also less sense).

AMD always releases 1GB models and 2GB models so the AMD fanboys can quote the 1GB model's cheapo PowerColor low-end price, claim it wins price/perf, then go on raging about how the 2GB model covers the high end...

ROFL. That's what they do; they even do it when comparing to a 2GB NVIDIA card, suddenly forgetting AMD makes crapster 1GB cards they swore off years ago, even though that's the screamer AMD fanboy price "they pay" because "it's such a deal! Man!"

Regarding noise measurement: the weighting used for absolute measurements may be the "A" scale, but a card is not a certain number of dB(A) louder than another card, it is a certain number of dB louder (3 dB more corresponds to double the sound power, measured under the same conditions and with the same weighting), since the weighting used to take the absolute measurements is the same.

This refers to all your statements along the lines of "... but over 3 dB(A) louder ..." etc.
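For anyone wanting to check the decibel arithmetic in the comment above, a quick sketch using the standard dB conversion formulas (nothing here is specific to this review's measurements):

```python
# Standard decibel conversions: dB differences map to ratios.
def power_ratio(db):
    """Ratio of sound power/intensity for a given dB difference."""
    return 10 ** (db / 10)

def pressure_ratio(db):
    """Ratio of sound pressure for a given dB difference."""
    return 10 ** (db / 20)

# +3 dB roughly doubles the sound power...
print(round(power_ratio(3), 2))     # 2.0
# ...while sound pressure doubles at +6 dB.
print(round(pressure_ratio(6), 2))  # 2.0
# A *perceived* doubling of loudness is conventionally put near +10 dB.
```

So "3 dB louder" means twice the sound power; it takes roughly 10 dB before most listeners would describe a noise as "twice as loud".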

Middle of the road filler products are so boring. They are usually a mishmash of memory from here, GPU features from there, all so confusing and boring. Just release a full line of new core and memory features, age down your older products according to how they perform compared to the new, and be done with it.

Right now I believe there would not be. What people are anticipating is an inflation in the size of game assets spurred on by the next generation of consoles. Some people want to keep these things for a few years, so it's a legitimate concern for a change!

LOLOLOLOL. Wow, how the fanboys change from just prior releases with 2GB or 3GB AMD crapcards; back then it was an ABSOLUTE WIN according to you, necessary and "future proof!", especially for "Skyrim mods!" LOL. You're a good laugh.

Congratulations Cerise, between this article and the HTC One article you have succeeded in injecting so much pointless crap that it is no longer worth the effort to sift your crap out to read the actual comments on the article.

It's good to see clock & voltage states become more fine-grained and their selection smarter. Ultimately this is how I'd like a GPU to work: set targets and limits for power use, temperature and noise... and then crank it up as far as it goes. Vary chips by different amounts of execution units, not frequencies.

This includes simple user settings for lowering power consumption (call it "green mode" or whatever), if people want to, which would automatically choose lower voltages to increase efficiency.

And of course something similar to NVIDIA's frame rate target: if performance is fine now, save power. And save some thermal headroom in case it's needed soon. Make smart use of the power budget. It's nice to see AMD making some progress!

Quote - "The Radeon HD 7790 runs at 1GHz, but is not going to be called a "GHz Edition" anymore. AMD feels that they have made the point about having 1GHz edition GPUs in the market in 2012, and did not feel a need to label this new one a GHz Edition. Therefore, it will just be known as Radeon HD 7790." - Brent @ HardOCP, Asus DCUII 7790 review

Don't see the point. As a PC gamer I want it all, not some laughably compromised card. Just over 30FPS (if that) at 1080p with the settings turned up? What's the point? Just buy a console. I'll stick with my 680.

While I agree with your sentiment, this card was not designed with us in mind; there is a large portion of people on budgets who can't go ahead and blow ~$450 on a graphics card. There is also a large portion of people who see no need to play games at Ultra with high AA, etc.

This card is a great lineup filler. I can see a use for this in a variety of budget gaming systems.

Well with a name like Hardcore69, of course you would want the best of the best of the best (with honors). But I'm a casual PC gamer, so this card looks pretty great to me. My 4870 is getting a little (ok, very) long in the tooth. Why would I buy a console and pay full price for games, only use a controller, and have to pay a monthly fee (X360) just to play casually, when I could pick up a $150 card and drop it into my computer?

The day will come, in the not-so-distant future, when 700W PSU requirements (or higher) for high-end gaming machines will come to an end (thankfully), and the whole system will consume no more than 100W and fit inside a mATX case or smaller (with no need for Godzilla-sized cooling fans anymore either).

"pulling 7W more than the 7770, a hair more than the 5W difference in AMD's TBP"

That 5W is not at the wall, though. Factoring in rounding and PSU efficiency, it's very possible that the cards really are only drawing 5W more. :)

"The Sapphire card, despite being overclocked, draws 6W less than our reference 7790."

Seeing how the Sapphire runs cooler in FurMark, that might explain a watt or two in reduced power draw; coupled with the efficiency of the PSU, it might explain three or four even. :)
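The wall-vs-card arithmetic above works out: any difference in DC power drawn by the cards is inflated at the wall by the PSU's conversion losses. A minimal sketch (the efficiency figures are assumptions for illustration, not measured values for the review's test PSU):

```python
# Power measured at the wall includes PSU conversion losses, so a small
# difference at the card shows up slightly larger at the wall socket.
def wall_delta(dc_delta_watts, psu_efficiency):
    """Wall-side power difference for a given DC-side difference."""
    return dc_delta_watts / psu_efficiency

# A 5W difference at the cards, through a PSU that is roughly 75-85%
# efficient at this load (assumed figures), lands in the ~6-7W range
# at the wall, consistent with the measured 7W gap.
print(round(wall_delta(5, 0.75), 1))  # 6.7
print(round(wall_delta(5, 0.85), 1))  # 5.9
```

In other words, a 7W gap at the wall is entirely compatible with a 5W gap at the cards once you account for the PSU sitting between them.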

"NVIDIA has for a long time set the bar on efficiency, but with the 7790 it looks like AMD will finally edge out NVIDIA."

What is your definition of a long time? As for efficiency standards, I consider AMD to be better for the end result when looking at the full definition and application of the word. See the spreadsheet I created here about 16 months ago to understand what I mean: http://forums.anandtech.com/showthread.php?t=21507...

So you just called Ryan a "dummy" without even checking the statement further down, which reads:

"For anyone looking to pick up a 7790 today, this is being launched ahead of actual product availability (likely to coincide with GDC 2013 next week). Cards will start showing up in the market on April 2nd, which is about a week and a half from now."

If YOU had read the article, blah blah dumb idiot blah blah. As you've not replied to anybody in particular, your mistargeted rants could be construed as being directed toward the staff themselves, so keep it up and you won't HAVE to worry about what AT is reviewing in future.

Bottom line - it's faster than the 650 Ti, it's looking to be more efficient than the 650 Ti, and oh look, both have 1GB of GDDR5 on a 128-bit memory interface, which you seem to have forgotten when you leapt down AMD's throat about the 7790, and when you went on your childish tirade about the 5770's 128-bit memory interface earlier.

As far as I recall, Ryan didn't mention anything about when Titan was available to buy, only that it had launched. Pretty much blows your theory of Ryan hating NVIDIA out of the water, doesn't it?

I'm not sure if I've said this before, and apologies to everybody else if I have, but I'm done with you, full stop. I can only hope everybody else here decides that not feeding the ignorance you perpetuate on every single AMD article would save them time they could be devoting to something far less boring instead.

To the staff - is there anything you can do to introduce an Ignore List? Thanks in advance for your response.

You got everything wrong again; you failed to read the article, not I, and you failed to read my reply addressing half your idiotic non-points, so you're the non-reader, fool. Now I have to correct you multiple times. And you're a waste.

The 650 Ti overclocks, and it's only faster in a few AMD-favor games, which are here, of course. Strike one for tardboy. The 650 Ti runs fine OC'd too, which it does well: "We pushed Gigabyte's GeForce GTX 650 as far as it'd go and achieved a maximum core overclock of 1125 MHz, with the GDDR5 memory operating at 1600. All it took was a 1.15 V GPU voltage." http://www.tomshardware.com/reviews/geforce-gtx-65...

The 128-bit bus: REPAYMENT for you FOOLS SQUEALING prior. What's so hard to understand? Did you forget all your WHINING? Did you forget backing up the FAILED theorists with the NVIDIA dual-speed memory? ROFL. You're up to strike 4 already.

"Ryan didn't mention anything about when Titan was available to buy, only that it had launched. Pretty much blows your theory of Ryan hating NVIDIA out of the water, doesn't it?" NO. So why would it be mentioned if he didn't want anyone to buy it? Why mention it? That would key in to save for the release date, right? Instead we get this gem first off, in BOLD, to start the article: "Who's Titan For, Anyhow?"

To Ryan and staff:

As a long-time admirer of AnandTech, I always enjoy reading pretty much every article you post, and have immense respect for all your writers. However, I am now utterly fed up with the direction the comment discussions have taken. The general pattern is that they start out as debates and end up as pretty nasty personal attacks that have nothing to do with the articles. You may say 'don't read the comments', to which I reply that they used to be an extension of the articles themselves, and were always a source of valuable information.

It pains me to say this, but if you don't start removing the trolls I will no longer come to this site at all, and I would guess I am not alone in having this opinion.

Well said, and thanks. I no longer visit DailyTech for the same reasons. I enjoy reading comments, since they can offer other perspectives from like-minded people, but unmoderated is worse than nothing at all. This used to be my favorite tech site, but the comments section here has slowly been pushing me to avoid it most of the time.

The 7790 reminds me of the 4770. Sure, that was on a new process node, but it's a late addition to the line designed to take advantage of tweaks, process improvements, etc.

There may be a lot of transistors in a GCN design, but I couldn't help feeling that there were power savings to be had. For this reason, I'd hope that their next flagship doesn't exceed the 7970GE's power draw whilst providing a decent performance boost.

For those who want the most power in the smallest package with the lowest power drain, look no further than the Radeon 7790. The only disappointment was the heat factor, but more or less the same performance as the 7850 at half the power; that's great. Also, I don't mind that AMD went the 6GHz VRAM route, because now there is even more reason to get the 2GB version, which is especially needed if you apply dozens or hundreds of mods to your games. Also, it's the 128-bit interface that kept the power low, so despite everyone's cussing, AMD made the right choices. I have a GTX 460, which easily uses at least 200 watts. This 7790 is almost twice as fast and uses 2.5 times less power. The pricing is acceptable; if they were to include 2GB by default, then why bother with the 7850? They still want people to buy that one.

Hey Anand, can you guys please do a video quality test? I haven't seen such a test on any website for over 3 years. So please, can you do a video quality test in movies and games, and please also use low-quality video as well, not just top-of-the-line 1080p videos that would look amazing even on a GeForce 3?