Speaking on the hardware enthusiast forum Overclock.net, a developer from Oxide Games shared some interesting information regarding the use of Async Compute in modern games. The studio is the first to produce a game that supports DirectX 12 from the ground up.

According to the developer, Oxide Games' use of Async Compute in the PC- and DX12-exclusive Ashes of the Singularity pales in comparison to some of the things that PS4 and Xbox One developers are doing with the technology. He went on to say that console developers taking advantage of Async Compute are gaining as much as 30% additional GPU performance in their games.

Our use of Async Compute, however, pales in comparison to some of the things which the console guys are starting to do. Most of those haven't made their way to the PC yet, but I've heard of developers getting 30% GPU performance by using Async Compute.

He further added that Nvidia's Maxwell GPUs don't offer native support for Async Compute, and that things could get pretty disruptive in a year or so, when graphics engines built around and optimized for AMD's GCN architecture start making their way to the PC.

Too early to tell, of course, but it could end up being pretty disruptive in a year or so as these GCN built and optimized engines start coming to the PC.

AFAIK, Maxwell doesn't support Async Compute, at least not natively. We disabled it at the request of Nvidia, as it was much slower to try to use it than to not.

Whether or not Async Compute is better is subjective, but it definitely does buy some performance on AMD's hardware. Whether it is the right architectural decision for Maxwell, or is even relevant to its scheduler, is hard to say.

Nvidia's PR had previously put the blame for the company's less-than-stellar performance in Ashes of the Singularity under Microsoft's latest graphics API, DX12, on Oxide Games. The developer, however, assures that there is no dispute between Oxide Games and Nvidia. He believes the initial confusion between the two stemmed from Nvidia's request that the studio disable certain settings in its benchmark, which Oxide declined.

Let us know what you think about this information in the comments below.