It could potentially be linked to console cycles. DX10 came out at the end of 2006, so the PS3 and 360 hardware was already set. Any game that was going to span the consoles and PC would have to be based on features all three could handle. PC would eventually get DX11 support, but DX9 would still be required for backwards compatibility with the older hardware that made up the majority of the user base. Perhaps?

It has bugger all to do with consoles and more to do with Windows operating system attach rates.
DirectX 10 was only available on Vista and above, and at the time it released the vast majority of users were still on XP, so making games with DX10 would only have targeted a very small percentage of gamers.

DX12!? Time for newer new GPUs... if you're not on the XBone... Oh wait, no one's gonna use DX12 till the PS6 and Xbox One3 are out.

That's true. We still don't have tessellation in games on PS4 and X1. Any new DX11/12 features will also show up in OpenGL, but it will require a lot of optimization to even free up the resources.

For the X1, developers have to optimize for 1080p first and then optimize even further to free up resources for any new features. I find it unlikely, unless a new feature replaces an old one and does the job much more cheaply and efficiently. Then it would be a win for both consoles.

It takes more power to run a game at 900p than it does to run at 1080p. I don't understand what's up with this 1080p debacle.
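For reference, the raw pixel counts tell the opposite story; a quick sketch (assuming "900p" means 1600x900 and "1080p" means 1920x1080):

```python
# Compare raw pixel counts of common render resolutions.
# Assumes "900p" = 1600x900 and "1080p" = 1920x1080.
resolutions = {"900p": (1600, 900), "1080p": (1920, 1080)}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# 1080p pushes more pixels, so it needs MORE power, not less.
ratio = (1920 * 1080) / (1600 * 900)
print(f"1080p pushes {ratio:.2f}x the pixels of 900p")  # 1.44x
```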


"We don't provide the 'easy to program for' console that (developers) want, because 'easy to program for' means that anybody will be able to take advantage of pretty much what the hardware can do, so then the question is, what do you do for the rest of the nine-and-a-half years?"
--Kaz Hirai, CEO, Sony Computer Entertainment

It takes more power to run a game at 900p than it does to run at 1080p. I don't understand what's up with this 1080p debacle.


I'm sure you are joking, but if not... what?!
If you are serious, by that logic just imagine how powerful the PS1 must have been. It renders at what, 480p? You should get one and call it the "newest next-gen".

Who knows, but if the XB1 can do DX12, that should mean the GPU is not what we thought it was. That's my guess.

Preparing some hot crow soup?
Are we back at the "secret sauce" level? I'm certain that if the GPU was not what "we" (more like you) thought it was, someone would have rubbed the wizard and exposed the $#@! out of it by now.

After 4 months out on the market, it's nice to see some fans keep the fairytale alive! Keep on fighting the good fight!

Preparing some hot crow soup?
Are we back at the "secret sauce" level? I'm certain that if the GPU was not what "we" (more like you) thought it was, someone would have rubbed the wizard and exposed the $#@! out of it by now.

After 4 months out on the market, it's nice to see some fans keep the fairytale alive! Keep on fighting the good fight!


I wouldn't call that secret sauce. It's not a surprise that DX12 was coming, but I don't think people knew that the XB1 would have it. If the rumors of it being in the XB1 are true, that would explain the 3 billion dollar deal they have with AMD. I thought the GPU in the XB1 was only DX11.1 or something like that. It's just my thoughts on it; I could be wrong.

DX12 doesn't make up for the ESRAM thing. That's still that much more specialized coding/optimization that has to be done. At the same time, however, that's not to say it won't help; how much remains to be seen. For MS's sake, it had better make up for the console not being as close to the metal as a closed box should have been from conception, let alone out of the gate.
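To give a feel for why the ESRAM demands that specialized optimization, here's a rough back-of-the-envelope sketch (assuming the Xbox One's 32 MB ESRAM and a plain 4-bytes-per-pixel render target; real engines juggle multiple targets, formats, and tiling strategies, so this is only illustrative):

```python
# Rough check of how many full-size 32-bit render targets fit in
# the Xbox One's 32 MB ESRAM (illustrative simplification only).
ESRAM_BYTES = 32 * 1024 * 1024      # 32 MB on-die scratchpad
BYTES_PER_PIXEL = 4                 # e.g. an RGBA8 color target

def target_size(width, height, bpp=BYTES_PER_PIXEL):
    """Uncompressed size of one render target in bytes."""
    return width * height * bpp

size_1080p = target_size(1920, 1080)   # ~7.9 MiB
size_900p = target_size(1600, 900)     # ~5.5 MiB

print(f"1080p target: {size_1080p / 2**20:.1f} MiB, "
      f"fits {ESRAM_BYTES // size_1080p} in ESRAM")
print(f"900p target:  {size_900p / 2**20:.1f} MiB, "
      f"fits {ESRAM_BYTES // size_900p} in ESRAM")
```

With only a handful of full-resolution targets fitting at once, developers have to hand-pick what lives in ESRAM versus slower main memory, which is exactly the extra work being described.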

DirectX 11 comes out with tessellation? OpenGL comes right back out with it.

The GPU cores on the Xbox One and PS4 are exactly identical; the PS4 just has more of them. So needless to say, the idea that some secret sauce was baked into the Xbox One GPU to support DX12 is just grasping at straws.
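That lines up with the widely reported specs; a quick sketch of the theoretical compute numbers (using the commonly cited figures of 12 GCN compute units at 853 MHz for the Xbox One versus 18 at 800 MHz for the PS4, with 64 shader lanes per CU and 2 FLOPs per lane per clock for fused multiply-add):

```python
def peak_gflops(compute_units, clock_mhz, lanes=64, flops_per_lane=2):
    """Theoretical peak: CUs * lanes per CU * FMA (2 FLOPs) * clock."""
    return compute_units * lanes * flops_per_lane * clock_mhz / 1000

xbox_one = peak_gflops(12, 853)   # ~1310 GFLOPS
ps4 = peak_gflops(18, 800)        # ~1843 GFLOPS

print(f"Xbox One: {xbox_one:.0f} GFLOPS")
print(f"PS4:      {ps4:.0f} GFLOPS ({ps4 / xbox_one:.2f}x)")
```

Same core design, different counts and clocks; no API change alters that arithmetic.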

Seriously, Xbox folks need to stop crying out for secret sauce. It really is getting old and just makes them look even more desperate.

Originally Posted by D3seeker

DX12 doesn't make up for the ESRAM thing. That's still that much more specialized coding/optimization that has to be done. At the same time, however, that's not to say it won't help; how much remains to be seen. For MS's sake, it had better make up for the console not being as close to the metal as a closed box should have been from conception, let alone out of the gate.

From the GSIV Wiz

The Xbox One was codable to the "metal" from the beginning. This generic API will not improve optimization beyond "coding to the metal".

Preparing some hot crow soup? Are we back at the "secret sauce" level? I'm certain that if the GPU was not what "we" (more like you) thought it was, someone would have rubbed the wizard and exposed the $#@! out of it by now. After 4 months out on the market, it's nice to see some fans keep the fairytale alive! Keep on fighting the good fight!

Agreed. And when this DX12 wizard $#@! fails to save the day, just like the ESRAM, the cloud, and the magical drivers failed to save the day before it, what will be the next Great White Hope? lol

Originally Posted by snooper71

Preparing some hot crow soup? Are we back at the "secret sauce" level? I'm certain that if the GPU was not what "we" (more like you) thought it was, someone would have rubbed the wizard and exposed the $#@! out of it by now. After 4 months out on the market, it's nice to see some fans keep the fairytale alive! Keep on fighting the good fight!

Not really. It's possible that there are things about the hardware that aren't known. I think MS did something similar with the Xbox 360.


PlayStation Universe

Copyright 2006-2014 7578768 Canada Inc. All Rights Reserved.
