I wonder why only the 1440p performance was updated for AMD cards (Vega 56 just got beaten in FHD by the 1070, but ultimately destroys it in 1440p, just like the 580 with the 1060). Vega 64 is actually within 10% of the 1080 Ti in 4K. And this is an NV-supported title. Absolutely terrific results for AMD (just like in Shadow of War).

After updating to the newest AMD drivers, I can't seem to hold 75 FPS in the open areas; I can see my GPU usage maxing out.
And that's when I notice frame drops. I don't think the previous driver did that; it was butter smooth all the way.

I noted that anomaly in the review. Not sure what's going on; I ran it three times for the 480 and even reinstalled drivers, but got the same odd results over and over again. The results are what they are, and I'm not going to sugar-coat that by leaving them out of the chart.

I hate being the conspiracy-theory guy here, but is there some sort of planned obsolescence built into Nvidia cards, or do they just not age well? These kinds of results, with the 1070 Ti on the cusp, are highly suspect to me. I admit it's been quite a while since I owned Nvidia cards. As soon as a new Nvidia card comes out, the older cards' relative performance begins to tank.

Edit: Also, is this the air-cooled Vega 64? If so, that means the liquid-cooled version is even closer to the 1080 Ti.

Ehm, no, because I already tested the card you mention with this title, and it's spot on where it needs to be. Secondly, if Nvidia did such a thing, it would show in all games. AMD found something clever that works for them; it's as simple as that.

And yes, this is tested with the air-cooled Vegas. AMD never shipped the liquid-cooled edition for review. BTW, it is a good question; I'll make some changes in future charts to denote clearly that we use the air-cooled versions.

Looks like some very suspicious results for AMD. The 580 is twice as fast as the 480 with just a 5% OC. Not buying that in the least. The jumps from 1080p to 1440p are too wide. Something is going on at the driver level that is excluding 4xx cards from optimizations, or AMD is disabling something via driver on the newer cards.
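For what it's worth, a quick back-of-the-envelope check of what a ~5% factory overclock could plausibly buy. The clock figures below are approximate reference boost clocks (assumptions, not measured values), and GPU performance scales at best roughly linearly with core clock:

```python
# Sanity check: a small factory OC cannot explain a 2x performance gap.
# Clocks are approximate reference boost clocks for each card (assumed).
rx480_boost = 1266  # MHz, reference RX 480 (approx.)
rx580_boost = 1340  # MHz, reference RX 580 (approx.)

# Best-case gain if performance scaled linearly with core clock.
expected_gain = rx580_boost / rx480_boost - 1.0
print(f"Expected gain from clocks alone: {expected_gain:.1%}")
# roughly 6% - nowhere near the ~2x gap shown in the charts
```

So whatever separates the 400 and 500 series here, it isn't clock speed.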

Hilbert, are you able to test 1440p and 4K at Highest settings but with DoF dropped to High? Right now on Pascal at 1080p, the difference between Highest and High is 5-7%, yet at higher resolutions, i.e. 1440p and 4K, it's up to a 33% performance difference, which is very significant. I suspect the same thing is happening to the RX 480.

I'd like to see comparisons in IQ between AMD and Nvidia. Call me old, but there was a time 15 years ago when Nvidia's FX series was a disaster in DX9, and they did everything possible to "tweak, optimize, fix, help, improve" the frames in those games by doing some rather shady stuff. AMD then got caught doing the same thing two or three gens later. Ever since then, regardless of company, when I see two pieces of hardware that are usually close to one another and a big difference like this appears, I tend to assume it's cheap optimizations. Yes, that makes me sound negative, but if you did any reading into these companies, their drivers, and the things they've done in even the last decade, you'd understand my assumptions.

Just seems too good to be true. And my grandpa taught me that if it sounds too good to be true, it usually is.

I assumed these would be common results for AMD vs Nvidia after AMD won the rights to BOTH big consoles. I assumed porting over through DX12 to AMD would be easier and would probably make for more real optimizations. But this has not been the case.

Console DX12 is different from PC DX12, according to several major dev studios and high-profile coders, so I would not automatically assume that the Xbox and PS4 using GCN helps AMD. It comes down to how the studio ports to PC, which can be anything from basic to poor to outright disastrous, or occasionally even good or almost excellent.

As for image quality and optimization: I remember both AMD and Nvidia being caught "cheating" with very aggressive tweaks for some games early on, but I doubt some minor, less-visible effect could yield whatever uncorked the performance here (and why not for ALL Polaris GPUs, then? That's a bit weird). More likely some other issue was blocking things, but who knows. A more detailed analysis of the game code and drivers could help, but since anti-cheat restrictions prevent attaching much to the game's process, that might be difficult to discern for anyone outside Blizzard.

5-10% performance can come from something like a tweak or a bug fix; more than that, as here, usually means something significant. Outside of outright killing effects, I doubt they could do anything that major by "optimizing" certain things, though again it's hard to tell.
(It could be things outside the player's view, like culling, edits to shaders, or just making use of whatever hardware improvements the Polaris and Vega cards have, but again, that should have carried over to the 400 series too if that were the cause.)

EDIT: I can't find it now, but an ex-Nvidia driver engineer made some interesting points about what goes into the display driver: workarounds for games breaking basic API usage and best-practice recommendations, full shader replacements, and just how much of the driver consists of compatibility fixes. They're pretty complex, to put it mildly.
(And going by people such as Durante with DSFix and other works, or Kaldaien with SpecialK, some games are a bit problematic, to put it very mildly; all sorts of issues get discovered by poking around a bit.)

But that's a different issue; Destiny 2 does seem to be running pretty well on PC without being super demanding.
(And newer drivers might improve this even further with some more work, ideally without murdering image quality in the process, though I doubt that's anything that needs to be worried about.)

Thanks for the post. I was not aware that console and PC DX12 were different. Kind of defeats the purpose of it, doesn't it?

As for the cheats: I'm not flat-out saying AMD is cheating. The problem is, both have cheated in the past, so it's logical to me to be hesitant when something out of the ordinary pops up like this. But by no means am I saying it's certainly a cheat. Maybe it's legit, clean, just plain better work by AMD's driver team. Maybe they had a better relationship with Bungie while working on the PC version.

I ask a lot of questions. It's in my nature, and I'm an annoyance in that way (and in many others, I'm sure). So I just wonder, is all. Maybe in future reviews we can get a page for IQ? I swear Hilbert used to do that?

Not sure why a large portion of the general populace on Reddit is fanboying over how well the game is optimized.
To me, this game looks like it relies on heavy amounts of post-processing in order to look good. If that is indeed the case, the performance results shouldn't be surprising at all (aside from those few GPUs showing anomalies).

Not saying the game is a sh!t port or that it looks bad. It looks pretty and it's what a port should be. Perhaps people have grown too used to crap ports.

I'm wondering what explains the massive difference in this game between the RX 400 and RX 500 series. The RX 480 scores 33 FPS at 1440p, but the RX 570 scores 50; that's a MASSIVE difference between those two.
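To put a number on that gap, using only the two FPS figures quoted above:

```python
# Quantify the RX 480 vs RX 570 gap from the FPS figures in the post above.
rx480_fps = 33.0  # RX 480 at 1440p, per the review chart
rx570_fps = 50.0  # RX 570 at 1440p, per the review chart

gap = rx570_fps / rx480_fps - 1.0
print(f"RX 570 is {gap:.0%} faster than the RX 480 here")
# roughly 52% - far beyond the small clock difference between the two cards
```

A ~52% lead between two cards built on the same Polaris silicon is well outside normal refresh-generation scaling, which supports the suspicion that the 400 series is being excluded from some optimization.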