Deus Ex: Mankind Divided DX12 Performance Review @ [H]

A LLAPI is always much harder. It's never going to be easier. And the people you need on the team have to be much better than average. Top money, top crop. Then you have to add a lot more time on top of that as well, not to mention future support.

DX12 will never be cheaper, less time consuming or easier than DX11.

I also doubt that, in a neutral setting, it will beat DX11 in performance. The only place DX12 will ever excel, should it ever happen, is when developers truly do something DX11 can't. But we haven't seen any of that, and we are not going to anytime soon. And by then we will have DX13 or whatever.

Even DICE can't make a good DX12 renderer. That's the reality.

DICE's DX12 results in BF1 are disappointing, but they're not done yet. Once I get a 1070 SLI setup I will do some testing; looks like BF1 is on sale for a rather great price now.

DX 12 will come of age when GPU power increases dramatically and DX 11 becomes the restriction. Start tripling, or even just doubling, the draw calls and you will kill DX 11. The more complex your scenes become from objects, special shaders, etc., the more restrictive DX 11 will be. At this time I would somewhat agree with you, since there is no clear example showing that DX 12 can do something, gaming-wise, beyond DX 11. Does that mean it can't? Not at all. Plus you have to consider that driver development incurs a cost as well, and that the more complex the GPU gets, the more a given API can become a restriction, blocking out hardware capability without a LLAPI. A LLAPI allows much easier access to new hardware capability, for example Async Compute (AMD's method) with DX 12, which is not available with DX 11: a 4%-7% gain in DX 12 for AMD in Gears of War.
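
The draw-call argument above can be put in rough numbers. The sketch below is a toy cost model, not a benchmark: the per-call costs, thread count, and frame budget are made-up illustrative figures, chosen only to show how a lower per-call overhead plus multithreaded command recording raises the CPU ceiling.

```python
# Toy cost model of CPU-side draw-call submission (all numbers invented).
# A DX11-style path pays a higher per-call driver cost on one thread;
# a DX12-style path records cheaper commands across several threads.

FRAME_BUDGET_MS = 16.7  # one frame at ~60 fps

def cpu_submit_time_ms(draw_calls, cost_per_call_us, threads=1):
    """CPU time spent submitting draw calls, split evenly across threads."""
    return draw_calls * cost_per_call_us / 1000.0 / threads

# Hypothetical costs: ~2 us/call single-threaded (DX11-like) vs.
# ~0.3 us/call recorded on 4 threads (DX12-like command lists).
for calls in (5_000, 10_000, 20_000):
    dx11 = cpu_submit_time_ms(calls, 2.0)
    dx12 = cpu_submit_time_ms(calls, 0.3, threads=4)
    verdict = "over" if dx11 > FRAME_BUDGET_MS else "within"
    print(f"{calls:>6} calls: DX11-like {dx11:5.1f} ms ({verdict} budget), "
          f"DX12-like {dx12:5.2f} ms")
```

With these made-up costs, the single-threaded path blows the 16.7 ms budget somewhere between 5,000 and 10,000 calls, while the multithreaded low-overhead path stays far under it; that is the sense in which doubling the draw calls "kills" DX 11.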

Even Microsoft says DX12 will never replace DX11. That's the entire case behind DX11.3. DX12 is for the sub-1% of developers to begin with.

And every time you talk about the async gains, remember the power increase. Not to mention the work needed is most likely not worth the 5% or whatever gain there is, and requires fat sponsorships.

With DX12 you ask developers to do the job Nvidia, AMD or Intel does once. Developers need to do it every time, including retroactively for existing games when new GPUs come out. At least if you exclude the constant reuse of old uarchs.

That is why Nvidia and AMD will need to support developers more for these optimizations. I am sure Nvidia is very active in this (they have the money); not sure about AMD. Now, it is not as if developers were not making optimizations anyway with DX 11 and other APIs, because they have at times maintained different paths for different vendors. The problem comes when you have too many sufficiently different hardware designs or platforms that need to be specifically programmed for. If AMD's GCN architectures from 1.1 and up are virtually the same from a programming standpoint, then it should not take much effort there. Nvidia Maxwell and Pascal? So far it looks like Pascal can do DX 12 just fine, but going back to Kepler and Fermi may be wasted effort anyway for LLAPIs at this stage.

Would like to know what some developers think of DX 12 and Vulkan; everything I've heard seems to be more positive than negative.

DX10 was also the perfect API: praised by developers when asked in public, hated by developers when asked in private. And we all know the rest of that history.

You are pretty much saying that the entire success of the API depends on sponsorship money, which often skews the result in favour of the sponsor. I remember when some people blindly thought it was as easy as a checkbox in the game engine. Oh, how the times have changed.

What happens when Volta arrives, in terms of older games and DX12? What happens if AMD ever moves on from the same GCN it has reused since 2012? Even between GCN versions with tiny changes it can go bad fast, as we saw with GCN 1.2 and Mantle.

Once a developer has code that works, or a game engine, it no longer is an ongoing, time-consuming effort. We have also been looking at averages and not at the scenes or viewpoints where it makes a bigger difference in percentage and experience. A 1% average increase could mean a 20% performance increase in one area of a game, where it is now smooth instead of jerky, and zero increase in other areas. Averages can be misleading if you don't consider what makes up the average.
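
The point about averages can be shown with a little arithmetic. The numbers below are invented for illustration: ten benchmark scenes, nine unchanged and one draw-call-heavy scene that gains 20%, still only move the scene average by about 1%.

```python
# Ten benchmark scenes (made-up fps numbers): nine are unchanged and one
# draw-call-heavy scene goes from a jerky 50 fps to a smooth 60 fps.
dx11 = [100.0] * 9 + [50.0]   # fps per scene under DX11
dx12 = [100.0] * 9 + [60.0]   # only the heavy scene improves under DX12

avg11 = sum(dx11) / len(dx11)  # 95.0
avg12 = sum(dx12) / len(dx12)  # 96.0

print(f"average gain:     +{(avg12 / avg11 - 1) * 100:.1f}%")        # about +1.1%
print(f"heavy-scene gain: +{(dx12[-1] / dx11[-1] - 1) * 100:.0f}%")  # +20%
```

The headline number says "about 1%", while the one scene that actually struggled is 20% faster and now playable.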

Now, other hardware features that Nvidia has can also be used if need be. DX 12 will allow Nvidia to expose those new capabilities much more easily than with DX 11.

DX12 won't allow you to expose new features more easily. DX is a defined standard of features. The features would have to be in a new DX12 version.

And yes, I have seen the DX12 stutter festival. Or how DX12 makes the game slower if it is heavy on game logic. Huge success there!

DX 12 is a LLAPI where you have access, if you want it, to hardware features. Even new ones, meaning it allows Nvidia, for example, to really push new hardware features and supporting software, a.k.a. GameWorks, to get that access.

No it is not: feature levels 11_0 and 11_1 are the minimum for DX 12 hardware, with 12_0 and 12_1 as optional hardware feature levels. Many of the features within a given feature level can also be optional. In other words, it allows Nvidia and AMD to experiment with hardware features in the future without being tied down as hard as with previous DX versions.

You can't add a feature to DX12 if it's not supported by MS. Plain and simple. DX12 changes nothing in that respect compared to previous DX versions.

It is a bit more nuanced these days with GPUOpen from AMD; ironically, all those defending AMD as an open standard and using GPUOpen as the example gave me facepalm moments.
Here is a link explaining AMD GPU Services (AGS), which is part of GPUOpen just like the Vulkan shader extensions used by AMD: http://gpuopen.com/gaming-product/amd-gpu-services-ags-library/

It is currently disabled for DX12 again, but that has more to do with driver compatibility.
Still, it is much more limited compared to the Vulkan extensions, and just like the Vulkan extensions this is now getting back to the old days of both manufacturers upping the ante on proprietary coding-to-hardware development and performance.
Cheers

You are intentionally being misleading here. What you posted isn't closed source the way GameWorks is. No one will contest that GPUOpen benefits AMD's GCN far better than any other architecture, but that still doesn't make it closed. And it being open means any other manufacturer can in fact make changes and understand what the code does on their hardware. SO NOT THE SAME. And for the love of all that is holy, stop being obtuse with your statements.

He didn't say anything about closed source; he is talking about proprietary coding, or extensions that are IHV-specific. AMD thinks just the same way nV does when it comes to these things; they are not holier-than-thou. Please stop trying to say that because they are open source they are better; that is BS. The only reason AMD is going open source is that they got their ass handed to them so many times and have no market pull. They had to do it, otherwise they would be nowhere.

If they were truly open, they would not use any IHV-specific extensions. That would make their libraries equal across platforms, but that isn't the case. With shader intrinsics we see the same thing: it's a way to take advantage of their own hardware.

While I feel there is nothing wrong with that, I do feel that claiming they don't do things for the benefit of their hardware and their business at the programming level is just wrong. They are doing it and will continue to do it any way they can.

Yep, spot on Razor.
Unfortunately, context gets lost when it comes to defending AMD vs Nvidia and how some think AMD's technology and solutions are helping the industry, when in reality they are in the same position as Nvidia and going for any advantage they can get.
"Open standard" in my context, as you rightly read, means usable and implementable by other companies in a workable, defined framework, or as part of a separate standard/committee. The GPUOpen functions-services-workflow can never be implemented on anything but GCN, due to the proprietary nature of their hardware and the proprietary low-level hooks GPUOpen provides for that architecture.
And as you say, in the context you describe it is not even open source, since developers cannot collaborate (part of the definition of true open source, along with the capability to modify it from its original design) due to the low-level, architecture-specific hooks GPUOpen provides in a rigid structure.
AMD spread a lot of FUD when they decided to attack Nvidia with their "closed source is hurting us" campaign.

Just to add: AMD with GPUOpen will up the ante again over what Nvidia does, IMO; it means Nvidia will be justified in making some of their tools truly low level as well.
The benefit going forward for AMD is that they can "sync" the tech between consoles and PCs to a certain extent, but I do think we may now see a more aggressive tech approach from Nvidia in response.
I am not sure how I feel about this going forward; we have seen how such escalation of low-level proprietary tech solutions impacted gamers in the past. I cannot fault AMD for making great use of their console footprint and agree it makes sense, but they have now pushed this to the point where Nvidia is off the leash.
GameWorks was more of a high-level "plug-in" effects/post-processing tool and suite, but I can see this changing heavily going forward.

This hypocrisy always amuses me, too. It's an "open standard" so long as you make hardware exactly the way AMD demands.

I've finally gotten around to testing this game. I will post detailed performance data when I can, but for now let me just say I was shocked to see the MASSIVE impact the following settings have on performance: Motion Blur, Chromatic Aberration, and DoF. AMD CHS is also ridiculously demanding, and it looks pretty bad; turning it down a notch from Ultra helped massively.

It's slightly tempting to play devil's advocate and spin this as evidence of an AMD-sponsored game gimping NV hardware, because seriously, that's what would be happening if the roles were reversed. Overall I'm slightly pleasantly surprised by the graphical fidelity. I still believe the skin shaders are hideous and the facial animations are unnerving, but hey! To each his own.

To tell the truth, I never understood the purpose of motion blur if you have high fps. Motion blur was developed for 24fps-type frame rates, as in film (which has natural motion blur anyway), to smooth out the motion. If your motion is fluid, your eye adds natural motion blur on its own. DOF is a camera simulation, not an eye simulation, unless you are near-sighted. I can see DOF for video-type scenes, but for actual gameplay in an FPS it is more like uncorrected eyesight or someone needing glasses. Now, if DOF had eye tracking and knew what you were looking at, that would be more realistic. To me the game has better IQ and playability with motion blur and DOF off. The shadows, I think, sort of suck in this game; some objects don't cast shadows at all while sitting right next to others in the same scene that do. AMD CHS sucks, at least in this game, unless they have fixed it.

I am also tired of developers trying to simulate cameras instead of eyes. Chromatic aberration is the worst of these effects.

I don't even understand why chromatic aberration exists in game settings. It's essentially a camera flaw that degrades picture quality. If a high-fidelity picture is what you are after, it has very much the opposite effect.
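
For what it's worth, in games chromatic aberration is typically a post-process that samples the color channels at slightly different offsets, and the "camera flaw" point is easy to see in a toy example. The 1-D scanline below is illustrative only: shifting the red and blue samples one pixel apart turns a clean edge into colored fringes, i.e. the image gets less sharp by construction.

```python
# A toy 1-D grayscale "scanline" with a crisp bright-to-dark edge.
edge = [1.0, 1.0, 1.0, 0.0, 0.0, 0.0]

def sample(shift):
    """Resample the scanline with a per-channel offset, clamping at borders."""
    n = len(edge)
    return [edge[min(max(i + shift, 0), n - 1)] for i in range(n)]

# Chromatic aberration as offset channel sampling: red pulled one pixel
# left, blue pushed one pixel right, green left in place.
r, g, b = sample(-1), sample(0), sample(+1)

for i, pixel in enumerate(zip(r, g, b)):
    print(i, pixel)  # pixels 2 and 3 pick up yellow and red fringes
```

Where the original scanline went straight from white to black, the shifted channels now produce (1, 1, 0) and (1, 0, 0) pixels: a smeared, tinted edge instead of a sharp neutral one.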

I spent $600, mostly on digital lenses for my glasses, to make everything sharp regardless of distance or which way I am looking, without skewing the world; then I pop in a game and add aberrations, blur, etc., at a rather large performance cost? No thanks! I also wonder why this is even considered an IQ improvement.

Blur can disguise aliasing, poor textures, objects popping into existence, and maybe other things, but to me the cure is worse than the problem. Still, having those options, which some may find useful, is good.

We also present Pure Hair, an evolution of the well-known TressFX hair simulation and rendering tech, developed internally by Labs. Compared to the previous version, we have significantly improved rendering, employing PPLL (per-pixel linked list) as a translucency solution. We have also significantly enhanced simulation and utilized async compute for better workload distribution.
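
The PPLL idea mentioned in that quote can be sketched on the CPU for intuition. This is a simplified toy, not the actual GPU implementation: real PPLL builds the linked lists in a pixel shader with atomic counters, and color here is a single grayscale float. The point it shows is the technique's shape: translucent fragments arrive per pixel in arbitrary order, get linked to that pixel, and are depth-sorted and blended back-to-front at resolve time.

```python
from collections import defaultdict

# pixel -> list of translucent fragments, appended in arbitrary order
# (the CPU-side stand-in for the per-pixel linked lists).
fragments = defaultdict(list)

def emit(pixel, depth, alpha, color):
    """Rasterization stage: link a fragment to its pixel; order is irrelevant."""
    fragments[pixel].append((depth, alpha, color))

def resolve(pixel, background=0.0):
    """Resolve stage: sort far-to-near, then alpha-blend toward the viewer."""
    color = background
    for depth, alpha, frag_color in sorted(fragments[pixel], reverse=True):
        color = alpha * frag_color + (1.0 - alpha) * color
    return color

# Three hair-strand fragments land on one pixel, out of depth order:
emit((4, 2), depth=0.2, alpha=0.5, color=0.9)  # nearest strand
emit((4, 2), depth=0.8, alpha=0.5, color=0.1)  # farthest strand
emit((4, 2), depth=0.5, alpha=0.5, color=0.5)

print(resolve((4, 2)))  # blended result, approximately 0.5875
```

Because the sort happens per pixel at resolve time, the strands can be drawn in any order, which is exactly what makes thousands of overlapping translucent hair fragments tractable.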

This game is really not improving much in performance. In DX 12 SLI on the FX 9590 with 2x 1070s, overall scaling sucks, and in DX 11 SLI scaling is virtually zero for me. I was hoping to see some improvement in this game before I continue. I will probably complete the game on the Nano rig with the FreeSync monitor; that gives the best gameplay in the end.

I just wanted to chime in with my findings playing DE:MD in SLI with two 980 Tis. As an SLI gamer I like to maximize the utilization of both GPUs and max out whatever fps I can get at maxed-out ultra settings. In DX11 I noticed both GPUs hit 95-99% utilization at 2560x1440 and the game ran smoothly at around 60-90 fps. However, in DX12 I saw about 55-75% GPU utilization, the fps actually went down 40%, and the game played better on just one GPU with SLI off! I made sure I wasn't going over 6 GB of VRAM usage as well. So I figured the GPUs must be waiting on the CPU to compute the scene before drawing, as I have read can happen with DX12, and used Afterburner to monitor CPU usage on all cores. You would expect CPU utilization to be maxed out if that were the case, but what I saw was a mixed bag of numbers with nothing really maxed out at all. So it makes me wonder whether Deus Ex's engine is multi-threaded or not. My feeling is that, either at the driver level or in the game code, SLI still isn't optimized for DX12. Got to love DX11, though, where I'm getting my money's worth.

Well, I don't think that's true anymore with regard to the AMD advantage, considering the game has been out for a while and drivers should have caught up by now. FWIW, the release notes for the new 378.92 drivers say they added another SLI profile for DE:MD. Got to test them tonight. Thanks.