Windows Store App and 'buggy' Direct3D 11 video cards (Intel GPU)

Question

I now have a couple of apps using Direct3D 11, and I'm starting to get some 1-star reviews about issues that are, in my opinion, related to display drivers.

Some users report issues like launching the app and getting a black screen, or launching the app and seeing serious graphics problems such as objects not rendering.

All those apps passed Windows Store certification, were tested on graphics cards such as the AMD Radeon 6800 as well as on the Microsoft Surface tablet (and also on Windows Phone 8), and run with no warnings or errors under the D3D11 debug layer.

My theory is that those users are running a Direct3D 11 graphics adapter (usually an Intel GPU) that claims to be Direct3D 10 capable (it returns D3D feature level 10) but in fact does not have enough memory to create a Direct3D 10 device, so the application fails to start.

How should I approach this problem? Is it possible, for example, to add a user setting to the app's charm-bar settings pane that forces Direct3D feature level 9? Should I contact the vendors?

I'm looking for options.

EDIT: Apparently one good option is to not request D3D_FEATURE_LEVEL_10_1 / D3D_FEATURE_LEVEL_10_0, which filters out the 'legacy' GPUs:
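A minimal sketch of what that filtering could look like. The enum values are copied from `<d3dcommon.h>` so the snippet is self-contained; in a real Windows build you would include that header instead and hand the array to `D3D11CreateDevice`:

```cpp
#include <vector>

// Values as defined in <d3dcommon.h>, reproduced here so the
// sketch compiles outside the Windows SDK.
enum D3D_FEATURE_LEVEL {
    D3D_FEATURE_LEVEL_9_1  = 0x9100,
    D3D_FEATURE_LEVEL_9_2  = 0x9200,
    D3D_FEATURE_LEVEL_9_3  = 0x9300,
    D3D_FEATURE_LEVEL_10_0 = 0xa000,
    D3D_FEATURE_LEVEL_10_1 = 0xa100,
    D3D_FEATURE_LEVEL_11_0 = 0xb000,
    D3D_FEATURE_LEVEL_11_1 = 0xb100,
};

// Feature levels to request, highest first, deliberately skipping
// 10_1 / 10_0 so that 'buggy' D3D10-class adapters fall back to a
// lighter 9_x device instead of a 10_x one they can't sustain.
std::vector<D3D_FEATURE_LEVEL> requestedFeatureLevels() {
    return {
        D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_9_3,
        D3D_FEATURE_LEVEL_9_1,
    };
}
```

On Windows you would pass `levels.data()` and `levels.size()` as the `pFeatureLevels` / `FeatureLevels` arguments of `D3D11CreateDevice`; the runtime picks the first level in the array the adapter supports.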

All replies

Are you sure, though, that the users leaving 1-star reviews aren't actually on feature level 9_1? All Windows 8 tablets have just feature level 9_1, which is very restrictive. Note also that binary shaders must be built specifically for the device's feature set, so if you plan to support all of them you need shaders for feature levels 9_1/9_3/10_0/10_1/11_0, or just target 9_1 alone if your shaders aren't complex.
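To make that shader-set point concrete, here is a hedged sketch of mapping a feature level to the pixel-shader compile target you would feed to fxc.exe / `D3DCompile`. The profile names (`ps_4_0_level_9_1` etc.) are the real fxc targets; the helper function itself is hypothetical:

```cpp
#include <string>

// Hypothetical helper: pick the pixel-shader compile target for a
// given Direct3D feature level (major.minor). There is no
// dedicated 9_2 target, so 9_1 and 9_2 share the level_9_1 profile.
std::string pixelShaderTarget(int major, int minor) {
    if (major == 9 && minor < 3)   return "ps_4_0_level_9_1";
    if (major == 9)                return "ps_4_0_level_9_3";
    if (major == 10 && minor == 0) return "ps_4_0";
    if (major == 10)               return "ps_4_1";
    return "ps_5_0";  // feature level 11_0 / 11_1
}
```

A build step would compile one blob per target it plans to ship, then load the right blob at runtime based on the device's reported feature level.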

You can also try out feature level 9_1 on your 11_1 device by leaving only D3D_FEATURE_LEVEL_9_1 in the featureLevels array. If the runtime gives you a feature level 11_1 device, you'll never see the 9_1 debug errors :).

No, the users had Intel HD Graphics, which is Direct3D 10 class. We had tested on the Microsoft Surface device, which uses the Direct3D 9.1 profile.

We've just rolled out an update that excludes Direct3D 10.0 and 10.1 and sticks to 11.1, 11.0, 9.3, and 9.1. The problem is that when we detect Direct3D 10 or better, our app enables more demanding content (high-resolution textures, more complex shaders, post effects...); that takes more memory, and those old GPUs can't 'stand the heat'.
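The gating described above could look something like this sketch; the `AssetTier` names and the new 11_0 threshold are assumptions for illustration, not the app's actual code:

```cpp
// Hypothetical asset-quality gate: only enable the heavy content
// path on true Direct3D 11 hardware, and keep D3D10-class adapters
// (the problematic Intel parts) on the lean 9_x path.
enum class AssetTier { Lean, Rich };

// featureLevel encoded as in <d3dcommon.h>, e.g. 0xa000 == 10_0.
AssetTier chooseAssetTier(unsigned featureLevel) {
    // Previously: featureLevel >= 0xa000 (10_0) enabled Rich and
    // overwhelmed low-memory Intel GPUs; now require 11_0 (0xb000).
    return featureLevel >= 0xb000 ? AssetTier::Rich
                                  : AssetTier::Lean;
}
```

With the 10.x levels no longer requested at device creation, this check mostly becomes a second line of defense, but it keeps the content decision in one place.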