Any general suggestions for D3D11 compatibility bits? I can find a lot of info for D3D9 titles, but I'm not seeing a lot for D3D11.

At present, I am fixing various issues in .hack//G.U., and because of the way its render targets work, normal driver MSAA isn't happy. Probably the biggest problem I'm facing here is that the game continuously switches between BGRA and RGBA color render targets, and I cannot simply blit the output of one as the input to the other in order to do the MSAA resolve. I'm hoping like hell there's a driver hack to allow that.

I'm trying to avoid hacking their engine to bits, making all render targets multi-sampled and doing a manual resolve, but unless I can find a good resource for D3D11 AA compat bits, that may be my only choice.
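In case it helps anyone hitting the same wall, here is a rough sketch of why the straight resolve fails and what the manual path would roughly look like. This is not the game's code; device, ctx and msaaTexBGRA are made-up names for illustration:

```cpp
#include <d3d11.h>
#include <wrl/client.h>

// Sketch only: resolve a BGRA8 MSAA render target, then hand the result to an
// RGBA8 target with a fullscreen pass. 'device', 'ctx' and 'msaaTexBGRA' are
// hypothetical names, not anything from the game.
void ResolveBgraMsaa(ID3D11Device* device, ID3D11DeviceContext* ctx,
                     ID3D11Texture2D* msaaTexBGRA)
{
    // ResolveSubresource only accepts src/dst formats from the same typeless
    // family, so the resolve has to stay BGRA8 -> BGRA8; it can't cross over
    // into an R8G8B8A8 texture directly.
    D3D11_TEXTURE2D_DESC desc = {};
    msaaTexBGRA->GetDesc(&desc);
    desc.SampleDesc.Count = 1;
    desc.SampleDesc.Quality = 0;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;

    Microsoft::WRL::ComPtr<ID3D11Texture2D> resolved;
    if (FAILED(device->CreateTexture2D(&desc, nullptr, &resolved)))
        return;

    ctx->ResolveSubresource(resolved.Get(), 0, msaaTexBGRA, 0,
                            DXGI_FORMAT_B8G8R8A8_UNORM);

    // Getting from here into the game's RGBA8 target can't be a plain
    // CopyResource either (same typeless-family restriction), so it has to be
    // a fullscreen draw that samples 'resolved' and writes into the RGBA8 RT.
}
```

Which is exactly the kind of per-target busywork I'd rather have a driver compat bit handle for me.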

To me that proves the validity of at least 2-way SLI with a modern PCIe 3.0 x16/x16 system, and its potential to remain with us for a long time to come.
But honestly the fact that most of the more reputable GPU reviewers and miscellaneous tech sites don't even bother to test with more than PCIe 3.0 x8/x8 and then call it quits is just ridiculous.
The amount of misinformation and pure BS on some sites and forums about SLI is getting completely out of hand.

I personally still have hope for improved SLI support in 2018, but it all hinges on better cooperation between NVIDIA and game developers, combined with more competent and honest information for the "general public" from GPU reviewers and the like.
Shifting responsibility and playing the blame game won't help anyone.

Anyway with that in mind, I wish all of you guys on the Guru3D forums a happy and fruitful New Year

^^ This

The reaction to SLI recently is a bit strange. Inspector is easy, 90% of games work fine, and good luck running anything at 4K or using VR at any level of detail without it. That the response is so often 'just buy a single card and tone down the details' is the absolute antithesis of a) striving for the best gaming experience and b) PC gaming full stop. PCMR, people. Why bother playing a game half-assed? Go all in: max res, max details, max fps. SLI lets you do that. Anything less is a waste of time.

Also, thanks for all your work here and on 3DCenter (I browse that thread sometimes with Google Translate!).

GuruKnight, I can see from the 3DCenter master thread that people are aware of the issue with SLI + G-Sync in Star Wars Battlefront 2... but since it uses the same engine build as Battlefield 1, do you know of any "unknowns" that could possibly fix the issue?

There is nothing that can be done about that, but you can improve overall performance in SLI mode by disabling Temporal AA (change GstRender.AntiAliasingPost from "1" to "0" in the profile options).
However, the performance cost with SLI + G-Sync still remains and cannot be bypassed.
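For anyone searching for the exact file: Frostbite games normally keep these profile options in a plain text file named PROFSAVE_profile, which for Battlefront 2 should be under Documents\STAR WARS Battlefront II\settings (double-check the path on your own install). After the change the line should simply read:

```
GstRender.AntiAliasingPost 0
```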

Yeah, and unfortunately you can't really accomplish anything useful with them either. Only a small handful of games work with it. I wonder if it would work if it just had a few extra functions added to bring it on par with DX9.