Hitman to use DX12 and feature Async Compute

This year's Hitman is confirmed to support DirectX 12 and will make heavy use of asynchronous shaders, which should make it a great showcase of the performance improvements DirectX 12 can offer over DirectX 11.

Right now AMD is the only GPU vendor with support for asynchronous shaders, allowing its GPUs to handle heavier workloads and making sure that no part of the GPU idles between workloads in DirectX 12 games that support asynchronous compute. AMD says that Hitman has the best implementation of asynchronous compute yet, likely giving AMD a major advantage in this title.

Below is a statement from AMD regarding Hitman and its use of asynchronous shaders.

AMD is once again partnering with IO Interactive to bring an incredible Hitman gaming experience to the PC. As the newest member to the AMD Gaming Evolved program, Hitman will feature top-flight effects and performance optimizations for PC gamers.

Hitman will leverage unique DX12 hardware found in only AMD Radeon GPUs—called asynchronous compute engines—to handle heavier workloads and better image quality without compromising performance. PC gamers may have heard of asynchronous compute already, and Hitman demonstrates the best implementation of this exciting technology yet. By unlocking performance in GPUs and processors that couldn’t be touched in DirectX 11, gamers can get new performance out of the hardware they already own.

AMD is also looking to provide an exceptional experience to PC gamers with high-end PCs, collaborating with IO Interactive to implement AMD Eyefinity and ultrawide support, plus super-sample anti-aliasing for the best possible AA quality.

This partnership is a journey three years in the making, which started with Hitman: Absolution in 2012, a top seller in Europe and widely critically acclaimed. PC technical reviewers lauded all the knobs and dials that pushed GPUs of the time to their limit. That was no accident. With on-staff game developers, source code and effects, the AMD Gaming Evolved program helps developers to bring the best out of a GPU. And now in 2016, Hitman gets the same PC-focused treatment with AMD and IO Interactive to ensure that the series’ newest title represents another great showcase for PC gaming!

Using asynchronous shaders and AMD's asynchronous compute engines, AMD will likely see a significant performance boost from DirectX 12, which I would guess to be in the region of 10%. Asynchronous shaders will also reduce GPU pipeline latency, again helping to increase performance.

"With async shaders, we can fill parts of the GPU that [would] otherwise be forced to idle. It's one of those features we wish we had on every GPU." - Dan Baker, Oxide Games

Right now Nvidia do not have asynchronous compute capability in their current lineup of graphics cards, whereas AMD have had it since their GCN architecture was introduced in 2012.

Right now it is unknown whether Nvidia have been able to add this capability to their next-generation Pascal silicon, or whether we will have to wait another generation for Nvidia to catch up.

The system requirements for IO Interactive's upcoming Hitman title have been announced, revealing that the game requires 8GB of system memory, a quad-core CPU and a minimum of a GTX 660 or HD 7870.

Hitman is an episodic title, meaning the game will be sold on a level-by-level basis, with the first episode releasing on March 11th and the final episode expected in holiday 2016 or early 2017.

On the GPU side the game requires a minimum of a GTX 660 or an HD 7870, which are fairly evenly matched cards, though the game's recommended requirements look very Nvidia-friendly, placing a GTX 770 on par with an R9 290.

I'd rather people take notice more than anything, so they can see AMD aren't as bad as people make them out to be. Performance should be very good in this upcoming title.

Indeed, I'd still have my Fury X if it didn't leak from one of the tubes. At the time nowhere had them in stock and the retailer said the wait time would be around six weeks, hence I ended up getting the 980 Ti as I couldn't do without a GPU due to working from home.

Ya I don't blame you for not waiting. Personally I probably would have dropped down to a 390X in Xfire or similar.
