I did mention OpenGL, but the main focus of that API is professional graphics environments first and foremost. Some early developers supported OpenGL in their first games for its multiplatform potential (Macs, Linux), and also because back then DirectX sucked hard; even DirectX 5 didn't support multitexturing in a single pass. We had to wait for DirectX 6 for that one.

Today though, it's actually the DirectX 11 standard that leads in graphics feature support compared to the OpenGL standard, so the latter's real selling point is its multiplatform reach, plus the fact that it still allows custom graphics features through vendor-specific OpenGL driver extensions, something DirectX stopped allowing once DX10 rolled in.

The examples I quoted relate to features that aren't part of the DirectX standard in the first place, and that's where both companies place their emphasis to make their hardware look special in end users' eyes. The graphics effects capabilities are the same for both, since they have to follow the DX11 specification to the letter there, so there's no way to make either card look "special" in that area.

That's my main point.

While today, yes, OGL is mostly professional, had it kept pace with MS and DX, more devs would still be using it. JC still does, and he's about the only one who still supports the OGL standard.

And while DX is a standard, a lot of what's in DX today is there because of the design efforts of Nvidia and ATI. Both still come out with features and ideas that aren't in DX and eventually end up being adopted into future DX specs.

GPU-based physics, expanded resolution support, and 3D support are all things being pushed by ATI and Nvidia that could very well end up in an update to DX11 or be a part of DX12. The point being: without these extra features they work hard to bring to us gamers, nothing would ever get added to any API's standard.