Wow, those specs are weird. 44 ROPs? No mention of geometry units. And did that chart say 800MHz base with boost up to 1,020? 512-bit memory interface?! REALLY AMD?!! Waste of die space.
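To be fair, a 512-bit bus isn't pure waste; it buys a lot of raw bandwidth. A quick back-of-the-envelope sketch (the GDDR5 data rate below is my assumption; the leaked chart doesn't state a memory clock):

```python
# Rough theoretical memory bandwidth for the rumored 512-bit bus.
# The 5 GT/s GDDR5 data rate is an assumed figure, not from the leak.
bus_width_bits = 512
data_rate_gt_s = 5.0  # effective transfers per pin, in billions/sec (assumed)

# bytes moved per transfer across the whole bus, times the transfer rate
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gt_s
print(f"{bandwidth_gb_s:.0f} GB/s")  # 320 GB/s at these assumed clocks
```

At those assumed clocks that's well clear of Titan's 288 GB/s, which may be the whole point of spending the die space.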

It trades blows with Titan and in some cases loses to the GTX 780, so I don't think it's safe to call this the "fastest", especially since a fully enabled Titan Ultra is rumored to be right around the corner.

Then again, as long as it's priced at ~$600 as initially speculated, it should still be a better value than Titan... but the GTX 770 and HD 7970 make more sense if you aren't rich enough to pay top dollar for hardware.

Also important to note that these results (if legit, which seems fairly likely) don't factor in the usual improvements that come with mature drivers. The Kepler and Tahiti cards it was tested against already have most of their performance improvements in place. This card is likely _more_ impressive than these results make it look.

Considering how long AMD's had to work on this refresh, it had better deliver some pretty serious horsepower.

I expect a 20-30% performance increase over the 79XX series. I might switch back to Radeons if Xbox One/PS4 ports prove to run better on AMD/ATI hardware. At the moment, PhysX and driver support make Nvidia the more attractive option, at least for me.

There have been a great many lies told about driver quality. Many of those lies have been spread by NVidia's team of compensated shills, not just by their army of loyal fanboys.

Truthfully, both AMD and NVidia have had problems with drivers. For example, recall that the largest cause of Vista blue-screens by far was NVidia's lousy drivers for the first year of that OS' introduction.

AMD's Graphics Core Next GPUs, especially the Tahiti-based Radeon HD7970, HD7950 and HD7990 cards, have seen significant, measurable improvements from driver updates this year. That's what you'd expect with a new architecture: the drivers for the newer chips still have low-hanging fruit to pick, while older-generation GPUs have already gotten most of the optimization they're ever going to see.

The benchmarks definitely look impressive, especially considering the price, and more competition is always good for consumers. I just don't have faith in the quality of their drivers (based on my own experience)... I might pick one up when it becomes available, though, just to see how it really performs. If it's bad, I'll just throw it in the closet (or give it away to someone) and keep using my Titan.

P.S: I just noticed - they list Titan's video RAM as "3GB"... Is that a typo or something?

JustAnEngineer wrote:spread by NVidia's team of compensated shills, not just by their army of loyal fanboys.

Actually, it's ATI users themselves who did a huge disservice to ATI/AMD, particularly the users of the Rage3D forum... You are probably too young or inexperienced to remember, but there used to be a HUGE sticky thread in the "Catalyst" section of Rage3D's forums. It was titled (yes, I remember the exact title) "Current Catalyst driver issues *POST NEW DRIVER ISSUES HERE!!!*", and it's where ATI card users compiled all of the issues (both officially known and "unknown") with various games and software, regardless of the current driver version.

You did not have to be a "compensated shill" or a "rabid fanboy" to troll some ATI fanboy back then - all you had to do was post the link to that thread whenever someone on another forum asked about ATI's driver quality or tried to claim that "their drivers are fine". Most of the issues were conveniently compiled on the first page of that thread.

After some time ATI got butthurt (especially when Nvidia started using the link to that very thread in their internal PR documents) and asked the moderators to remove the sticky. So they did, in 2006 (or 2007, I don't remember exactly). Now the only sticky threads that remain in that forum section are related to the most current beta/WHQL drivers, and those threads are also removed when new driver versions are released.

Of course, Nvidia's hired agency (Arbuthnot Entertainment Group) also used this thread for "astroturfing" (forum shilling) purposes, but the effectiveness of their work dropped significantly once that thread was finally removed.

Last edited by JohnC on Mon Sep 23, 2013 12:32 am, edited 1 time in total.


Airmantharp wrote:I just want them to fix the stuff Nvidia's gotten right for years, apparently. If they can't get 4k down, well.

Nvidia hasn't "gotten it right" for years; what they have gotten right is the PR battle, which AMD lost a decade ago when they decided to list fixes in their release notes.

I don't really get it -- are you implying that Nvidia DOESN'T have better drivers than AMD, and that AMD's only failing is in terms of PR? Because this is objectively and provably false.

clone wrote:When TR mentions that Nvidia called together several of us in the press last week, got us set up to use FCAT with 4K monitors, and pointed us toward some specific issues with their competition, which then leads to TR releasing an article titled "Here's why the CrossFire Eyefinity/4K story matters"... well, AMD's going to have a hard time improving their reputation.

My concern with this card is that it's brute-forcing the lead with an expensive 512-bit design and physically huge GPU, rather than architectural or process improvements that increase performance/cost.

I'm not really concerned about the top SKU; that's for the 1% of people who are interested in performance no matter the cost. I'm interested in the sweet spot where you get 80% of the performance for 50% of the cost, and that's unlikely to be cheap given how large these dies are and how 512-bit PCBs are likely to be more expensive to produce than 256-bit or 384-bit ones.
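The sweet-spot math is simple enough to sketch. The $600 flagship price is the speculated one from earlier in the thread; the 80%/50% split is my hypothetical, not a real SKU:

```python
# Hypothetical sweet-spot card: 80% of flagship performance at 50% of its price.
flagship_price = 600.0  # speculated flagship price, USD
flagship_perf = 1.0     # normalized performance

sweet_price = 0.5 * flagship_price
sweet_perf = 0.8 * flagship_perf

# Performance per dollar, relative to the flagship:
value_ratio = (sweet_perf / sweet_price) / (flagship_perf / flagship_price)
print(f"{value_ratio:.1f}x the perf/$ of the flagship")  # 1.6x
```

That 1.6x perf/$ ratio is why the cut-down parts matter more than the halo SKU for most buyers.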


Chrispy_ wrote:My concern with this card is that it's brute-forcing the lead with an expensive 512-bit design and physically huge GPU, rather than architectural or process improvements that increase performance/cost.

I wouldn't call the design brute force. At around 430 mm², it's roughly 23% smaller than Nvidia's 561 mm² behemoth GK110 while offering similar performance (based on the rumors).

Chrispy_ wrote:My concern with this card is that it's brute-forcing the lead with an expensive 512-bit design and physically huge GPU, rather than architectural or process improvements that increase performance/cost.

I'm not really concerned about the top SKU; that's for the 1% of people who are interested in performance no matter the cost. I'm interested in the sweet spot where you get 80% of the performance for 50% of the cost, and that's unlikely to be cheap given how large these dies are and how 512-bit PCBs are likely to be more expensive to produce than 256-bit or 384-bit ones.

AMD targets multiple market segments; it's that simple. AMD can make a monster card, sell it in low volumes for (hopefully) high margins, stake a claim to the world's fastest GPU, which is good for its reputation, and still deliver products to the markets below. This also gives AMD some pricing relief, I suppose; AMD will be able to set the price at the top of the product stack rather than having to react to Nvidia's pricing strategy.

My concern is not whether AMD wants to do a monster GPU. My concern is that AMD either does not have the talent, the financial resources, or simply the desire to develop its video card drivers to the level they need to reach. The constant drip-drip of news that yet another feature of AMD's driver is not working properly is a big, big turn-off.

Well, the leaked slides could all be fake, but I guess there isn't long to wait to find out. It looks like there could be some good performance to be had at a more competitive price, but until the announcement, who knows?

As far as Nvidia's drivers being absolutely superior, they've had a few flubs in their time as well; it's not a bed of roses on the green team either (any 560 Ti owners want to chime in about the problems with the last three driver releases?).

Don't some of the CrossFire issues in the recent press point to something more fundamentally architectural than mere driver issues, though? Or have I been reading it wrong? Presumably AMD are aware of the problem; what would it take to revamp CrossFire at a hardware level, and is there any reason they wouldn't?

Looking forward to the announcement, though. I wonder if $599 gets you a copy of BF4 as well...

I kind of wish they would bring some new features to the table, like HDMI 2.0 or game streaming to a TV dongle, but that's probably wishful thinking. A genuine question, though: what makes these cards any more expensive to produce than a Kepler (if the leaks are true)?

CityEater wrote:Don't some of the CrossFire issues in the recent press point to something more fundamentally architectural than mere driver issues, though? Or have I been reading it wrong? Presumably AMD are aware of the problem; what would it take to revamp CrossFire at a hardware level, and is there any reason they wouldn't?

Looking forward to the announcement, though. I wonder if $599 gets you a copy of BF4 as well...

I kind of wish they would bring some new features to the table, like HDMI 2.0 or game streaming to a TV dongle, but that's probably wishful thinking. A genuine question, though: what makes these cards any more expensive to produce than a Kepler (if the leaks are true)?

To answer the second part: die size, the 512-bit bus, etc. Essentially, the bigger and more complex the die, the poorer the yield per wafer.
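To put a rough number on the die-size point: under the classic Poisson yield model, the fraction of defect-free dies falls off exponentially with area. The defect density below is an illustrative guess, not a real foundry figure; the reference die sizes are GK104's 294 mm², the rumored ~430 mm² for this chip, and GK110's 561 mm²:

```python
import math

def poisson_yield(die_area_mm2: float, defect_density_per_mm2: float) -> float:
    """Fraction of dies with zero defects under a simple Poisson defect model."""
    return math.exp(-die_area_mm2 * defect_density_per_mm2)

D0 = 0.002  # defects per mm^2 -- an illustrative guess, not a real foundry number
for name, area in [("GK104", 294), ("rumored Hawaii", 430), ("GK110", 561)]:
    print(f"{name:>15} ({area} mm^2): {poisson_yield(area, D0):.1%} defect-free")
```

Whatever the real defect density is, the ordering holds: the bigger die always yields a smaller fraction of good chips, which is part of why a ~430 mm² part on a 512-bit board is unlikely to trickle down cheaply.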

For the first part, we don't really know if there is a "fundamental" architectural hurdle in the way of CrossFire and multi-GPU gaming. I think most of this is speculation based on the 4-megapixel frame limitation of the CrossFire bridge. However, at least for the present, there are no single displays greater than 4 megapixels, as all 4K displays are currently tiled (two displays in one), so at some point the image has to be split up regardless. It's outputting the frames to the display stream at the right time that seems to be the issue, which suggests a software timing solution. And if Nvidia can make it work reasonably well, then there's always hope for AMD...
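The megapixel arithmetic is easy to check; treating the press-reported bridge limit as 4 megapixels per frame (a rumored figure) and assuming the common two-tile 4K arrangement:

```python
BRIDGE_LIMIT_MPIX = 4.0  # rumored per-frame limit of the CrossFire bridge

def megapixels(width: int, height: int) -> float:
    """Frame size in megapixels for a given resolution."""
    return width * height / 1e6

print(megapixels(2560, 1600))  # 4.096 -- a 30" panel sits right at the limit
print(megapixels(3840, 2160))  # 8.2944 -- a full 4K frame, well past it
print(megapixels(1920, 2160))  # 4.1472 -- one tile of a two-tile 4K display
```

A full 4K frame is roughly double the rumored limit, which is presumably why the tiled-display split, and the timing of recombining the tiles, is where the trouble shows up.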
