Demystifying DirectX 12 support in Windows 10: What AMD, Intel, and Nvidia do and don’t deliver

This site may earn affiliate commissions from the links on this page. Terms of use.

Now that Windows 10 is finally shipping, DirectX 12 compatibility is moving from a marketing bullet point to a tangible issue for users. For more than a year, AMD and Nvidia have advertised that various older GPU families would support DirectX 12 at launch. Recently, however, there's been confusion over what level of support Intel, AMD, and Nvidia will offer for the new API, and which products will run the upcoming games that rely on it. The confusion seems to stem from comments by AMD's Robert Hallock, who acknowledged that the various AMD GCN-class GPUs support different feature levels of DirectX 12. This has been spun into allegations that AMD doesn't support "full" DirectX 12. In reality, Intel, Nvidia, and AMD all support DirectX 12 at various feature levels, and no GPU on the market today supports every single optional DirectX 12 capability.

DirectX feature levels and point updates are not the same thing

The first thing to understand is that DirectX feature levels aren't the same thing as DirectX point updates. A point update (DirectX 10.1, DirectX 11.1 / 11.2) is an additional set of standardized capabilities that lets developers perform certain tasks more efficiently or access specific new features. DirectX 10.1, for example, implemented new standards for visual quality, new shader instructions, and support for cube map arrays. It wasn't a significant enough update to define an entirely new version of DirectX around it, but it was a large enough step to warrant its own extension.

A DirectX feature level, in contrast, defines the set of hardware capabilities a GPU exposes within a given version of the API. Feature levels were first introduced in DirectX 11. Microsoft defines a feature level as "a well defined set of GPU functionality. For instance, the 9_1 feature level implements the functionality that was implemented in Microsoft Direct3D 9, which exposes the capabilities of shader models ps_2_x and vs_2_x, while the 11_0 feature level implements the functionality that was implemented in Direct3D 11."

The chart below is a partial example of DirectX 11 feature levels, just to illustrate the point:

The purpose of feature levels is to let developers target a single API and still have a game run smoothly on multiple generations of GPU hardware, rather than writing separate code paths for DX12, DX11, DX10, and so on. A DirectX 11 GPU with feature level 9_3 couldn't magically perform DirectX 11 effects, but it could run games in DirectX 9 mode without the developer needing to write an entirely separate engine implementation to allow for it. That's how games like Civilization V were able to run in either DX11 or DX9 modes from a common code base.

AMD, Intel, and Nvidia: Who supports what?

One of the problems with identifying which GPUs support which features is the confusion between DirectX API support and feature level support. This support page from Nvidia, for example, details how Fermi and Kepler GPUs can support DirectX 11.1 at feature level 11_0. The reason Kepler and Fermi don't support DirectX 11.1 at feature level 11_1 is that two of the capabilities required for 11_1 aren't implemented in those GPUs. Nvidia goes to some pains to point out that the DirectX 11.1 update actually adds support for some capabilities Fermi introduced in 2010.

The following Microsoft slide details exactly which DirectX 12 feature levels are supported by which hardware iterations:

It's not clear why Microsoft lists Kepler as supporting feature level 11_1 while Nvidia's own slide (below) shows it limited to 11_0, but either way, the point is made: DirectX 12 support is nuanced and varies between card families from every manufacturer. AMD's GCN 1.0 chips (Cape Verde, Pitcairn, and Tahiti) support feature level 11_1, whereas Bonaire, Hawaii, Tonga, Fiji, and possibly Oland will all support feature level 12_0. Nvidia's various 4xx, 5xx, 6xx, and 7xx families will all support DirectX 12 at the 11_0 or 11_1 feature level, with the Maxwell-based GTX 750 Ti offering FL 12_0 support. Note that Oland is an odd sort of hybrid chip: it may have feature level 12_0 support, but it lacks features like TrueAudio and possibly XDMA support.

The issue has been further confused by claims that Maxwell is the only GPU on the market to support "full" DirectX 12. While it's true that Maxwell is the only GPU that supports feature level 12_1, AMD is the only company offering full Tier 3 resource binding and asynchronous shaders for simultaneous graphics and compute. That doesn't mean AMD or Nvidia is lying; it means that certain features and capabilities of various cards are imperfectly captured by feature levels, and that calling one GPU or another "full" DX12 misses this distinction. Intel, for example, offers Rasterizer Ordered Views (ROVs) at the 11_1 feature level, something neither AMD nor Nvidia can match.

Why DirectX 12 looks the way it does

One common concern from gamers is that if their cards only support DirectX 12 at feature level 11_1 or 12_0, they'll miss out on what DirectX 12 has to offer. It's important to remember that the multi-threading and multi-GPU capabilities of DirectX 12 that we've seen previewed to date (and demonstrated via Mantle) are available at every feature level. Kepler and older GCN GPUs will absolutely benefit from the new capabilities DirectX 12 delivers. With that said, there are some specific capabilities baked into feature levels 12_0 and 12_1 that gamers with older cards won't have access to, but as the charts above show, this isn't a problem unique to AMD, Nvidia, or Intel. No current Intel IGP supports feature level 12_0, while on the Nvidia side, only Maxwell hardware supports 12_0 or 12_1.

This recent slide from the GTX 980 Ti launch implies Kepler is limited to feature level 11_0, not 11_1.

To understand why Microsoft built DirectX 12 the way it did, consider the alternative. Prior to DirectX 11, every new DirectX version was tied to new hardware requirements. From time to time, AMD or Nvidia might implement a specific feature in hardware before it became part of a future DirectX standard, but graphics cards were fixed to the DirectX APIs they supported at launch. Without the flexibility afforded by feature levels, the only gamers who could take advantage of DX12 would be those who purchased either a GCN 1.1, 1.2, or Maxwell GPU. Everyone else, including the millions of people with slightly older cards, would’ve been left out in the cold.

Adding feature levels and implementing them as part of DX12 means that millions of people will see significant benefits from adopting the new API in the here and now. No, older GPUs may not support every single DX12 feature, but no one is going to end up having to choose between a game that looks great in DX11 or a half-assed DX12 version due to graphics card implementation issues. When AMD, Nvidia, and Intel talk about supporting DirectX 12 on older hardware, they’re talking about the features that matter most — lower-overhead APIs, better CPU utilization, and multi-GPU functionality. The actual feature levels that define 12_1 as being different from 11_0 are interesting and useful in certain scenarios, but they aren’t the capabilities that will truly shape how gamers experience gaming with the API.

Just as there are very few games that require DirectX 11.2 or 11.1 (offhand, I can't think of any), there are going to be very few DirectX 12 titles that mandate feature level 12_0 or 12_1. I'm not saying such games will never happen, but that's going to be years from now, long after current GPUs have been replaced by newer hardware. If you own a GCN 1.0, Fermi, or Kepler card, you're going to get the DirectX 12 features that matter most. That's why Microsoft created feature levels that older GPUs could use; if Fermi, Kepler, and GCN 1.0 cards couldn't benefit from the core advantages of DirectX 12, Microsoft wouldn't have qualified them to use it in the first place. The API was purposefully designed to allow for backwards compatibility in order to ensure developers would be willing to target it.
