It appears that the steady stream of bad news surrounding the Xbox One simply won’t come to a halt. The recent revelation that Call of Duty: Ghosts and Battlefield 4 both run at a native resolution of 720p on the console has come as a disappointment to many next-gen enthusiasts. Developers of multiplatform titles are currently struggling to maintain parity between the next-gen console versions of their games.

The Xbox One’s complex memory architecture is rumored to be the primary hurdle, and an improvement may be seen in the second wave of titles released for the console. However, that may not be the case, based on what we have heard from an alpha tester for Bungie’s upcoming first-person shooter Destiny. The individual, who wishes to remain anonymous due to NDA restrictions, was kind enough to share a few things about the differences between the two next-gen console versions.

While the individual in question did say he isn’t really an authority on the matter, to him it was “clearly evident” that, in its existing state, the PlayStation 4 version looked the crisper of the two. He believes the difference would be obvious to anyone who gains access to the forthcoming beta program on both platforms, as it’s unlikely that either version will receive major improvements over the builds he last saw. It certainly wouldn’t surprise us if the Xbox One version ends up running at an internal resolution of 900p, i.e. 1600×900 pixels. Given the game’s relatively modest target frame rate of 30fps (based on the E3 showcase), it wouldn’t be far-fetched to assume the difference won’t be as drastic as the 720p/1080p split between the Xbox One and PS4 versions of Call of Duty: Ghosts.
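
For context, the raw pixel counts behind those resolution figures can be checked with a quick back-of-the-envelope calculation (a sketch for illustration, nothing more):

```python
# Pixels per frame at the three resolutions discussed.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# 1080p pushes 2.25x the pixels of 720p, but only 1.44x those of 900p,
# which is why a 900p/1080p gap is far less drastic than 720p/1080p.
print(1920 * 1080 / (1280 * 720))   # 2.25
print(1920 * 1080 / (1600 * 900))   # 1.44
```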

We look forward to giving our readers a hands-on with both versions of Destiny once the beta is underway sometime early next year. Stay tuned for more on Bungie’s brand new first-person shooter.

After @xboxp3 interview I’ve been asking devs about impact of new Xbox sdk on perf. Bungie says it will get Destiny to 1080p/30fps on XB1.

justerthought

Of course Destiny will be sub-par on XB1. That is the reason why Bungie broke away from MS. They are creating an open world game this time and know that XB1 has a problem running open world games. They were not going to restrict themselves to a platform that does not allow the game to run properly, or they would lose sales.

To understand why XB1 has trouble running open world games, you have to look at RAM and how the GPU accesses it. Open world games have to stream data and world assets on the fly, in real time, wherever the player goes. That means in order to draw the frames of animation quickly, the GPU has to be extremely powerful and have rapid access to lots of fast RAM that can quickly fill itself with the required data.

PS4 has lots of fast GDDR5 RAM available to all data at all times, and a very powerful GPU to process it into frames. The perfect combination for open world games. Sony designed that architecture because they realised that open world games are the future of gaming.

XB1 cut costs to focus on Kinect-controlled multimedia by using slower DDR3 RAM that is not fast enough for the requirements of a GPU streaming open world data in real time. MS tried to compensate for this by adding a fast ESRAM cache to run a rapid small stream to tile data into the GPU. But the ESRAM cache can only hold 32 MB at any given point in time and cannot help fill the slow DDR3 RAM any faster with volume data. Adding ESRAM and its move engines required that it sit on the APU chip with the CPU and GPU, so the GPU had to be trimmed in size, gimping it down to 12 compute units, whereas the PS4 has 18 compute units.
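
A rough calculation shows why that 32 MB figure keeps coming up in this debate: a single 1080p render target with 32-bit colour plus a depth buffer already consumes about half the cache, and a deferred renderer with extra G-buffer targets no longer fits at all (a simplified sketch assuming plain 32-bit formats; real engines mix formats and alignments):

```python
MB = 1024 * 1024

def target_bytes(width, height, bytes_per_pixel):
    """Size of one render target, ignoring padding/alignment."""
    return width * height * bytes_per_pixel

color = target_bytes(1920, 1080, 4)   # 32-bit colour
depth = target_bytes(1920, 1080, 4)   # 32-bit depth/stencil
print((color + depth) / MB)           # ~15.8 of the 32 MB ESRAM

# A deferred renderer with three additional G-buffer targets overflows:
gbuffer = 3 * target_bytes(1920, 1080, 4)
print((color + depth + gbuffer) / MB)  # ~39.6 MB > 32 MB
```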

Hence the PS4 has 50% more compute units, and when combined with the faster RAM, that power advantage goes straight to the games without much effort from the devs.

All this adds up to the XB1 having a bottleneck, with the GPU waiting for data before it can render frames. That bottleneck rears its ugly head to the max with open world games. Devs are left with four choices in order to give the GPU less work. They can reduce the pixel count (the size of each frame), reduce the number of frames (the frame rate), allow more pop-up with a short draw distance so that objects appear when the RAM is ready to deliver them, or introduce screen tear by allowing half-drawn frames to be output, like we see on current gen.

So Destiny will definitely be gimped in some form on XB1. The devs have no choice.

Kreten

Just like eDRAM was too small to make a difference? Even though ESRAM is 3-4ns and GDDR5 is 300-400ns, which means ESRAM can do 30-40 things while GDDR5 does one?
You just have no understanding of how this stuff works, so stop commenting.

Kreten

Simply not true. The CPU on X1 is coherent, with higher bandwidth to memory by 10GB/s, and it operates at a higher frequency.
The GPU on PS4 does not have move engines and has to compensate for the lack of a full audio chip.
ESRAM is not too small. How do you think they did 1080p with the x360's 10MB of eDRAM, which is just downgraded ESRAM? Why is it that you Sony fanboys and these fake journalists never do your own research? They could do 1080p with 29GB/s DDR3 (512MB) + 60GB/s eDRAM (10MB), but can't with 68GB/s DDR3 (5GB) + 204GB/s ESRAM (32MB)? Where is your common sense?
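
The 10 MB eDRAM comparison can be put in numbers. On the x360 a 720p colour-plus-depth target just fit in eDRAM, but turning on 4x MSAA overflowed it, which is why the hardware supported predicated tiling: the frame was split into pieces rendered one at a time (a rough sketch of the arithmetic, assuming simple 32-bit colour and depth formats):

```python
import math

EDRAM = 10 * 1024 * 1024  # Xbox 360 eDRAM capacity

def frame_bytes(w, h, bpp_color=4, bpp_depth=4, msaa=1):
    """Colour + depth footprint; MSAA multiplies per-pixel storage."""
    return w * h * (bpp_color + bpp_depth) * msaa

print(frame_bytes(1280, 720) / EDRAM)                # ~0.70 -> fits in one pass
print(math.ceil(frame_bytes(1280, 720, msaa=4) / EDRAM))  # 3 tiles needed
```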

Not A PC Gamer

All the technical nonsense you spewed aside, you don’t have a brain in your head if you are still supporting these clowns at Microsoft. I keep reading your big-mouth comments bashing the PS4 when it’s just so much better in every regard than all the nonsense features and gimmicks of the X1.

In the future, rest assured you will be paying for another XBL price hike for those corporate criminals to try to justify more free apps, and you can enjoy that DRM which is set to return later this year.

I guess some people are that naive and gullible to still trust the Xbox brand after all the damage they have done. You are definitely one of these biscuit-brain corporate slaves that I’m referring to.

Even Major Nelson can’t be trusted, but morons still believe MS has their best interest at heart when they don’t. Seriously, get a friggen clue.

justerthought

The move engines on XB1 are there to give the ESRAM data cache coherency with the main RAM when data is fetched by either the CPU or GPU, similar to hUMA. They do not boost speed; they just manage data flow and integrity, and decide which data is best served by the ESRAM speed bump. Only 32 MB of data at any given period in time can benefit from the fast ESRAM high speed cache, so it’s limited to rapid small data streams or rapidly tiling larger data structures.

The PS4 has full hUMA cache coherency between its fast GDDR5 RAM and the GPU. The high speed transfer rates are available to all data at all times, without resorting to lag-inducing tiling methods. Even fast tiling takes time. Games such as open world titles that require a lot of data very quickly, on the fly, will suffer from the slow large-volume data in DDR3 RAM alongside fast tiled data from an ESRAM cache.

I have to ask where your common sense is when all the proof is in the games. Every new cross-platform game that comes out is proving my point. They are not proving your point, are they? Who benefits from the numbers you’re quoting? Definitely not the gamers.
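
The tiling cost described above can be illustrated with a toy sketch: a working set larger than the fast 32 MB scratch has to be processed in chunks, each chunk adding a copy pass that a single large fast pool would not need (hypothetical sizes, purely illustrative; the pass count is a stand-in for the extra latency being argued about):

```python
MB = 1024 * 1024
ESRAM = 32 * MB  # size of the fast on-chip scratch

def count_tile_passes(asset_bytes, tile_bytes=ESRAM):
    """How many copy-in/work/copy-out round trips a working set needs."""
    full, rem = divmod(asset_bytes, tile_bytes)
    return full + (1 if rem else 0)

# A hypothetical 200 MB streaming working set needs 7 round trips through
# the cache; a unified fast pool (the GDDR5 case) needs none.
print(count_tile_passes(200 * MB))  # 7
```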

Guest

And that article is ONLY correct when using ONLY eSRAM. The Xbox One is designed for combined use of DDR3 and eSRAM, and when MS finishes the SDK it will outperform the PS4 in the framebuffer department. The One Eighty screwed up their development timeline pretty badly.

angh

not really, because GDDR5 is much faster than DDR3. The ESRAM simply can’t hold enough data for a frame at 24 bits per pixel, so it’s not good enough, and while that data is kept there, the rest of the communication goes through the standard DDR3 controller. The PS4 has direct communication between the GPU and GDDR5, and that can’t be outperformed in full HD by using a too-small ESRAM.

Guest

Incorrect again: you can combine the framebuffer over both memory pools. MS spokesperson Albert Penello made it very clear that the system allows it. Their software APIs just aren’t there yet to make that happen, and until then developers either have to code it themselves (very hard to do), drop to lower resolutions/colour depth, or wait for the XDK to be updated. Both systems have DMA, not just the PS4.

datdude

Stop your nonsense man. 720p for next gen titles is unacceptable, period. It really shouldn’t be difficult to achieve 1080p on the hardware regardless of the architecture utilized. The fact that there are problems indicates Microsoft may have made a misstep in design, because it’s not only third party devs having problems, but Microsoft’s own first party studios are also struggling with it. I assumed Microsoft would throw all that xbox live money they’ve been raping their customers for to ensure hardware superiority heading into next gen, but I was way off. Instead, it’s been Sony that has stepped it up and come to the party with the significantly superior hardware.

Dakan45

You stop your nonsen dipshit, running game under 1080p is unacceptable therefore bf4 is SHIT on ps4.

Your logic you sony fag, nothing less than your shitty logic.

Now die bitch.

datdude

I don’t carry on conversations with tools who can’t even spell, sorry. Go back to class toddler, you’re wasting my time.

Dakan45

says the faggot who trolls like a motherfucker all the time.

How about you go suck some more sony cock.

datdude

I reiterate, go back to class toddler, you’re wasting my time. And because you’re an ignoramus, I’ll define reiterate. It means to say something again, to emphasize and clarify for the mindless among us. That would be you.

Dakan45

Now i reallize why you never post anything, because you sound like a fucking pretentious retard who thinks himself as a scholar.

Sorry for wasting your time sire, might as well enjoy being a pretentious duchebag like a sir that you allways go on about.

datdude

I reiterate, go back to class toddler, you’re wasting my time. And because you’re an ignoramus, I’ll define reiterate. It means to say something again, to emphasize and clarify for the mindless among us. That would be you.

Dakan45

spoken like a true duche, what were we arguing about, oh yes that you are a gigantic sonytard that insulted the “dude” as you called him and yet now you pretend to be some high class duche.

Nope, not you are not getting out of it pal.

datdude

This poor bastard can’t even spell douche. What an asshat. Good luck with your obvious mental deficiencies.

Dakan45

you are not putting much of an effort in your trolling kiddo.

I doubt you even know what a typo is, it MUST be spelling mistakes.

Fucking retard.

angh

If you combine ESRAM with DDR3 it won’t be a framebuffer anymore. I’ve heard quite a few spokespersons from MS over the last few months, and they have been fired (sorry, voluntarily changed jobs because of a ‘better offer’) quite frequently, so I will wait for a real application to see if he is right.
And yes, DMA is supported by both systems, but on Xbox One you have to go through an OS layer to access memory, as on a normal PC; on PS4 you can access a RAM page directly by its address.

Kreten

How do you think they were accomplishing 1080p with 10MB of eDRAM? Maybe by using both memories for each frame? Which gives you way higher bandwidth.
The memory access is low level, so I’m not sure what you mean by direct access. You still need the driver between the API and the hardware, otherwise devs have to code for each memory block separately, which would be very complex. A PC does not need the OS to access memory; it needs a driver, the exact thing the X1 is using, and those are two different things.

neko working

lol, you believe a PR ? what a joke.

Guest

Resolutions are fine, just drop to 16bpp if you only use ESRAM. If you combine it with DDR3, then there is NO problem. The only problem NOW is that MS doesn’t have a properly finished XDK for developers to do this. The load balancing is the developers’ to implement until MS finishes all the software pipelines.
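
The 16bpp suggestion (and the “65k colours” objection to it in the reply below) is easy to sanity-check: 16 bits per pixel gives 2^16 representable colours, and halving the bytes per pixel halves the render-target footprint in ESRAM (a simplified sketch ignoring alignment and extra targets):

```python
MB = 1024 * 1024

# Colour depth: bits per pixel -> number of representable colours.
print(2 ** 16)  # 65,536 colours at 16 bpp
print(2 ** 24)  # 16,777,216 colours with 24-bit colour

def render_target_mb(w, h, bytes_per_pixel):
    """One render target's footprint in MB."""
    return w * h * bytes_per_pixel / MB

print(render_target_mb(1920, 1080, 4))  # ~7.9 MB at 32 bpp
print(render_target_mb(1920, 1080, 2))  # ~4.0 MB at 16 bpp
```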

angh

16bpp is 65k colours. It’s like the Amiga computers of ages ago. They won’t go to 1080p; they will stay at 720p to keep up speed. Don’t forget that the Xbox One is using a less powerful GPU as well.
Surely we will find better software solutions with time to cope with the ESRAM limitations. But similar solutions will improve the competitor’s console too. Microsoft was sure that Sony would go with 4GB of RAM, and Sony changed it to 8 at the very last moment. That is what changed the balance.
And again, the impact on games won’t be huge. A few more jaggies on Xbox One, but in the heat of a fight it won’t be noticeable. Xbox has other features apart from games that will make up for it anyway.
I’m getting a PS4 myself, not because of the hardware difference but because of the games line-up. I like to play a good JRPG from time to time, and the last 4 years clearly show that this type of game is rare on x360 (though the few that were released on it were quite nice). Most multiplatform games I will play on PC anyway, and I’m not watching TV at all, so those extra Xbox One features are not for me.
I’m just glad that the competition is there, and thanks to that we are getting good prices and features included. I hope it stays that way for a long time.

Kreten

They did 1080p on x360 with 10MB of eDRAM, and the X1 memory layout is pretty much the same. You don’t have to have the whole render target in ESRAM; it can be split across the two memories, or they can use tiled resources, which is what the machine was made for. The X1 API simply was not out as long as the PS4’s, due to it having Win8 app stuff, cloud stuff, ESRAM stuff etc., but some big devs working on both consoles say the PS4 is more complex due to its API, whereas the Xbox One’s is very similar to the DirectX they use for PCs.

angh

They surely can do it. They simply don’t have to put the frame buffer into ESRAM. But they lose a lot of bandwidth doing it otherwise.
And the cloud stuff is simply a web database: upload data, download data, what does it take to program that? Both Sony and MS started this generation’s console development at the same time, and MS is a software company, yet they can’t create tools for their own machine? That’s a joke. And I would like to see who said the PS4 has the more complicated API.

chris

another loser writing another loser article

HalfBlackCanadian

I’m not a tech head (when it comes to PC rigs), and frankly I don’t care until we’re talking sub-720p or sub-30fps, but IF I took these exact same specs and put them in a case with a fan and power supply, is it absurd to think that I could play these games at 1080p? I hope people understand what I’m asking. I just find it difficult to believe that the Xbox One will have that many issues achieving 1080p/60fps (or close to parity with the PS4) for its entire lifetime.

Maxcer Maxcer

the problem with that logic is that as developers get around the hurdles of developing on XB1, the same will be happening on PS4. I think we’ll see both improve at the same rate, so the gap will most likely stay the same.

Guest

Not once MS finishes the SDK; then it’s the 192-200+ GB/s of the Xbox One >>> the 176 GB/s of the PS4.
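
The figures being thrown around here come from adding the Xbox One’s two pools together versus the PS4’s single pool. The sum is trivial to compute; whether it is ever achievable in practice is exactly what this thread is disputing (peak numbers as quoted in the thread, not measured sustained bandwidth):

```python
DDR3 = 68          # GB/s, quoted Xbox One DDR3 bandwidth
ESRAM_PEAK = 204   # GB/s, theoretical ESRAM simultaneous read+write peak
GDDR5 = 176        # GB/s, quoted PS4 GDDR5 bandwidth

print(DDR3 + ESRAM_PEAK)          # 272: the sum behind the "outperform" claims
print(DDR3 + ESRAM_PEAK > GDDR5)  # True on paper...
# ...but the sum only applies to the 32 MB resident in ESRAM at any moment;
# the remaining gigabytes see 68 GB/s against the PS4's uniform 176 GB/s.
```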

Maxcer Maxcer

keep telling yourself all that theoretical power will actually be used one day. It could be, say, 5-6 years down the line, by a team dedicated to figuring it out. Just in time for new hardware to be released.

Face it, MS didn’t target the higher-end hardware this time; they wanted an all-in-one media box, and that’s what you got.

incendy

The Xbox One is capable of achieving the same if not better bandwidth, but it will take developers optimizing their memory loading and caching around the smaller amount of ESRAM. The PS4 is using a model similar to today’s PCs, so I think with PC ports especially you will see a difference in the beginning. Once developers come up with reusable methods that take away the complexities of using ESRAM, we will probably see very similar results across the platforms.
Also, DX11 has features like Tiled Resources, probably too new for most engines to have incorporated yet, that heavily benefit the Xbox One architecture.
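
For readers unfamiliar with the Tiled Resources feature mentioned above: it maps textures in fixed 64 KB hardware tiles, so only the tiles actually made resident consume memory. A quick count shows the granularity (a sketch; the actual texel dimensions of a tile depend on the format):

```python
TILE = 64 * 1024  # D3D11.2 tiled resources use 64 KB tiles

def tiles_needed(width, height, bytes_per_texel):
    """Number of 64 KB tiles spanning a fully resident texture."""
    return (width * height * bytes_per_texel + TILE - 1) // TILE

# A 4096x4096 RGBA8 texture spans 1024 tiles; streaming in only the
# visible subset is what makes very large virtual textures affordable.
print(tiles_needed(4096, 4096, 4))  # 1024
```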

Joseph Lan

The PS4 has much greater graphics processing capability than the Xbone. On top of that, the PS4’s memory setup is more flexible and easier to utilize. These are the reasons why the Xbone will never achieve parity with its PS4 multiplatform counterparts, unless developers purposely gimp the PS4 version. The latter wouldn’t likely be the case with Destiny, since the PS4 is Bungie’s lead platform for development.

Right now, the most likely scenario for the future of Xbone games is that we’ll see mostly sub-1080p games, with only a few rare exceptions.

Kreten

Dude, you have no idea what you are talking about. The PS4 has 14+4 CUs (14 graphics, 4 GPGPU) at a lower MHz, and it doesn’t have a full audio chip or move engines, which puts graphics rendering on par. As for memory, DDR3 + ESRAM is faster than GDDR5 by about 100GB/s. Have you not seen some major devs say that overall the PS4 is more complex due to its API? And that the X1 is very similar to a PC in that aspect, using pretty much the same tools?

Maybe the reason for sub-1080p was less time with the X1 API? It took MS longer to make, as it has 10x more things in it than the PS4 API. So quit throwing nonsense around and use your own head to think. Comparing a console to a PC GPU will never work: on PC you need 2x the spec to hit the same result, because there is no optimisation; consoles have only one setup while PCs have thousands. And even the PC GPU you specified is more than capable of 1080p at 60fps. COD is not even a demanding game; Forza is more demanding, yet it’s 1080p/60fps. BF4 at 900p is more crisp on X1 and has more detail than the PS4 version, and it has a higher draw distance, where on PS4 you see fog. The X1 does have more aliasing, though. But resolution is not the only thing that makes a game; the IQ and FPS are even more important. And COD is only running properly on X1 and x360; it’s even having issues on PC.
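
For reference, the GPU figures the two sides keep citing reduce to a single throughput number: each GCN compute unit contains 64 shader ALUs doing one fused multiply-add (two FLOPs) per clock, so peak throughput is CUs × 64 × 2 × clock. This is public spec-sheet arithmetic, not a measured benchmark, and it says nothing about memory or API effects:

```python
def peak_tflops(compute_units, clock_ghz, alus_per_cu=64, flops_per_alu=2):
    """Theoretical peak single-precision throughput of a GCN-style GPU."""
    return compute_units * alus_per_cu * flops_per_alu * clock_ghz / 1000

print(round(peak_tflops(18, 0.800), 3))  # PS4:      1.843 TFLOPS
print(round(peak_tflops(12, 0.853), 3))  # Xbox One: ~1.31 TFLOPS
```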

Guest

MS just needs to finish their DirectX implementation on the XDK pronto, then the problems for developers will be gone.

justerthought

That is a software issue that can be cured with an update. Hardware issues are cast in stone for the rest of the console’s lifespan. The XB1 has a lot of lame hardware issues.

Jedi Master Domi

DX12 unlocks the X1 CPU cores; it’s not going to boost graphics as much as they say, sorry. Graphics are GPU-based.

Vartazian

But DX12 will only reduce CPU overhead and increase the number of calls the CPU can make at once. It will not and cannot improve the graphics processor itself. It may help the XB1 get a few more frames due to less CPU overhead, but the graphics processor is still the same. Nothing magic here. *flies away*