Thief reboot runs at 1080p on PS4, 900p on Xbox One

"You really need good eyes to see the difference," says developer.

The bad news for the Xbox One's graphical capabilities just keeps on coming this week. Just one day after Konami announced that Metal Gear Solid: Ground Zeroes will run at 1080p on the PS4 and 720p on the Xbox One, Square Enix has confirmed to Eurogamer that the upcoming reboot of Thief runs at a native resolution of 1080p on the PS4 and 900p on the Xbox One. Both versions reportedly run at 30fps.
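For context, here is a quick back-of-the-envelope comparison of the two reported resolutions, assuming the usual 1600x900 framebuffer for "900p" (Square Enix only quoted the vertical figure):

```python
# Rough pixel-count comparison of the reported render resolutions.
# 1600x900 for "900p" is an assumption; only the vertical resolution was stated.
ps4_pixels = 1920 * 1080   # native 1080p
xb1_pixels = 1600 * 900    # 900p, upscaled to the display's 1080p output

print(f"PS4: {ps4_pixels:,} pixels per frame")   # 2,073,600
print(f"XB1: {xb1_pixels:,} pixels per frame")   # 1,440,000
print(f"PS4 renders {ps4_pixels / xb1_pixels:.2f}x as many pixels "
      f"({1 - xb1_pixels / ps4_pixels:.0%} fewer on Xbox One)")
```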

The difference mirrors similarly sized cross-platform resolution gaps in the PS4 and Xbox One versions of games like Battlefield 4 and Assassin's Creed 4. But game director Nicolas Cantin told Eurogamer that he thinks the slight variation "doesn't make much of a difference."

"You really need good eyes to see the difference," Cantin told the site, adding that the Xbox One game "is as good as the PS4 version." He went on to say that the resolution difference was interesting more as an engineering issue than a gameplay issue, and he noted that both the PS4 and Xbox One are vast improvements over the editions on older systems.

Whether or not Thief looks noticeably worse in practice on the Xbox One, the steady stream of Xbox One games that look worse on paper than their PS4 counterparts is not what Microsoft needs in order to set its system apart from the competition.

Promoted Comments

"You need really good eyes" to see the difference between 1080p and 900p? I guess my friend, who has glasses, has "really good eyes", along with every other gamer I know, for that matter.

It's like a scratch on a new car... to the owner, who knows where the scratch is and has a vested interest in the vehicle, the tiniest scratch can look like someone hit the car with an axe. To everyone else, the scratch may as well not exist because they won't see it. Of course, every time the owner walks near the scratch, it will look like a huge glaring gash because his/her eyes will automatically gravitate to it.

Am I the only one who doesn't give a damn what resolution a game runs at, as long as it looks good on the TV and doesn't suffer from performance problems? Art direction, art assets, etc. have a much bigger influence on 'graphics' than ~180 lines of resolution.

Oh, and gameplay. That matters just a tiny bit too.

Based on these articles and comments, it seems like I could publish a TURD of a game and sell it to gamers as long as it boasted 1080p and 120 FPS.

Unless you're using a home theater projection setup and have a 100"+ screen, do those extra 180 lines of resolution really matter? For most folks sitting across a living room from a 40-55" flat panel, I can't imagine they do.

This isn't news. The Xbox One is graphically weaker than the PS4 and every new game keeps confirming it. Whether that matters or not is up to the buyer, but that fact is not even up for discussion anymore.

Because otherwise they would have to downscale to 720p; most console games run at 30fps. It also doesn't help that TVs tend to refresh more slowly than PC monitors, just from the size difference alone.

Unless you have a high-end TV, you will see some tears if you try to hammer it with a constant 60fps.

You have to have a rather decent TV to make use of the 1080p or 900p resolution anyway. I can't believe any of them would really suffer from being fed 60 FPS. TVs are often slower latency-wise, but not refresh rate wise.

"You need really good eyes" to see the difference between 1080p and 900p? I guess my friend, who has glasses, has "really good eyes", along with every other gamer I know, for that matter.

Bullshit. There is actual science here that demonstrates that you're largely full of it. Go to the part about Human Visual System Limitation. Yes, there are definitely screen sizes and viewing distances where the difference between 1080p and 900p can be noticeable, but unless you have precisely set up your gaming seat to sit in that range, you won't be able to tell the difference. Why? Because the physiology of your eye is the limiting factor.

That said, it's disappointing the XB1 isn't getting 1080p, but all the other games on mine look like a big step up from last-gen, and so far that's been plenty.
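A rough sketch of that viewing-distance argument, using assumed numbers (a hypothetical 55" 16:9 TV viewed from 8 feet) and the common rule of thumb that 20/20 vision resolves roughly 60 pixels per degree:

```python
import math

def pixels_per_degree(vertical_px, diagonal_in, distance_in, aspect=(16, 9)):
    """Vertical pixels per degree of visual angle for a flat 16:9 panel."""
    panel_height = diagonal_in * aspect[1] / math.hypot(*aspect)  # height from diagonal
    px_per_inch = vertical_px / panel_height
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

# Assumed setup: 55" TV at 8 feet (96"); ~60 px/degree is the usual 20/20 acuity figure.
for lines in (1080, 900):
    print(f"{lines}p: {pixels_per_degree(lines, 55, 96):.0f} px/degree")
# Roughly 67 px/degree for 1080p and 56 for 900p -- right around the acuity limit,
# which is why the visibility of the gap depends so heavily on screen size and seating distance.
```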

Because otherwise they would have to downscale to 720p; most console games run at 30fps. It also doesn't help that TVs tend to refresh more slowly than PC monitors, just from the size difference alone.

Unless you have a high-end TV, you will see some tears if you try to hammer it with a constant 60fps.

I don't own any current-gen consoles so I'm not in the know, I guess, but my assumption about the current gen was that it was going to regularly achieve 60 fps at 720p/1080p resolutions. Decent computers have been doing that for a while now, so I figured it was the consoles' turn.

I don't really know the issues of TV vs. monitors but I would imagine that even if the TV is the chokepoint you'd at least want to *say* that your game can hit 60 FPS and maybe just lower the framerate for syncing issues.

This isn't news. The Xbox One is graphically weaker than the PS4 and every new game keeps confirming it. Whether that matters or not is up to the buyer, but that fact is not even up for discussion anymore.

It's actually not graphically weaker. The Xbox One uses a different type of memory setup that is similar to what the Xbox 360 had. The PS4, on the other hand, is pretty close to a stock PC. It's simply that developers need to learn how to use the Xbox One a bit more, whereas for the PS4 version they can basically expect normal architecture. This is a problem that will disappear with time and experience.

*sigh*

Yes it is. The XB1 GPU has less shading power, less texturing power, and fewer render outputs. It may have a slight memory bandwidth edge in some contrived cases (which optimization may make more common), but in all the cases where bandwidth isn't the limiting factor, as well as those where the on-chip memory is too small, it is slower.

Because otherwise they would have to downscale to 720p; most console games run at 30fps. It also doesn't help that TVs tend to refresh more slowly than PC monitors, just from the size difference alone.

Unless you have a high-end TV, you will see some tears if you try to hammer it with a constant 60fps.

That's... not really accurate.

30 FPS requires less than half of the power you need to hit 60 FPS (as the amount of power you need to hit 60 FPS versus 30 FPS is NOT linear due to fixed per-frame overheads). As such, developers can pull off a lot more graphical niceties at such a framerate than if they shoot for the full 60 FPS. Of course, you can pull off even more stuff if you drop the resolution further and accept a sub-30 FPS framerate like in the case of The Last of Us... personally, that's going a bit too far for my tastes.
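A toy illustration of that budget argument (the fixed overhead figure is made up purely for the example):

```python
# Hypothetical fixed per-frame cost (game logic, draw-call submission, etc.) -- an assumed number.
FIXED_OVERHEAD_MS = 5.0

for fps in (30, 60):
    frame_time_ms = 1000.0 / fps
    render_budget_ms = frame_time_ms - FIXED_OVERHEAD_MS
    print(f"{fps} fps: {frame_time_ms:.1f} ms per frame, {render_budget_ms:.1f} ms left for rendering")
# 30 fps leaves ~28.3 ms of rendering headroom versus ~11.7 ms at 60 fps --
# more than twice the budget, not merely double.
```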

The screen tearing occurs if V-sync isn't enabled due to frames being updated in the middle of a frame being sent to the TV. Aside from G-sync (which is proprietary to Nvidia at the moment and only on specific screens), there is no perfect solution to this problem as V-sync increases input latency. Whether or not your TV is high-end, or whether or not the frame rate is 60 is immaterial - if there is no V-sync, there WILL be tearing.
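A minimal simulation of that point: with no v-sync, buffer flips land mid-scanout essentially every frame, no matter how good the TV is (all timing numbers below are assumptions for illustration):

```python
import random

SCANOUT_MS = 1000.0 / 60.0   # 60 Hz display refresh interval
BLANKING_MS = 0.5            # assumed vertical blanking window at each refresh boundary

random.seed(0)
flip_time, tears, frames = 0.0, 0, 300
for _ in range(frames):
    flip_time += random.uniform(14.0, 20.0)   # unsynced frame times hovering around 60 fps
    # Without v-sync the buffer flips immediately; a flip outside the blanking window tears.
    if flip_time % SCANOUT_MS > BLANKING_MS:
        tears += 1

print(f"{tears}/{frames} flips landed mid-scanout (visible tear lines)")
```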

Because they can't make it run at 60 fps at the resolution that they want. And bouncing somewhere between 30 and 60 introduces all the fun hitching and tearing that we PC gamers have been dealing with for ages. Locking it at 30fps is a perfectly adequate solution on consoles.

"You need really good eyes" to see the difference between 1080p and 900p? I guess my friend, who has glasses, has "really good eyes", along with every other gamer I know, for that matter.

It's like a scratch on a new car... to the owner, who knows where the scratch is and has a vested interest in the vehicle, the tiniest scratch can look like someone hit the car with an axe. To everyone else, the scratch may as well not exist because they won't see it. Of course, every time the owner walks near the scratch, it will look like a huge glaring gash because his/her eyes will automatically gravitate to it.

This isn't news. The Xbox One is graphically weaker than the PS4 and every new game keeps confirming it. Whether that matters or not is up to the buyer, but that fact is not even up for discussion anymore.

It's actually not graphically weaker. The Xbox One uses a different type of memory setup that is similar to what the Xbox 360 had. The PS4, on the other hand, is pretty close to a stock PC. It's simply that developers need to learn how to use the Xbox One a bit more, whereas for the PS4 version they can basically expect normal architecture. This is a problem that will disappear with time and experience.

Memory is only one part of the equation, and the PS4 has a better GPU with more shaders.

Am I the only one who doesn't give a damn what resolution a game runs at, as long as it looks good on the TV and doesn't suffer from performance problems? Art direction, art assets, etc. have a much bigger influence on 'graphics' than ~180 lines of resolution.

Oh, and gameplay. That matters just a tiny bit too.

Based on these articles and comments, it seems like I could publish a TURD of a game and sell it to gamers as long as it boasted 1080p and 120 FPS.

"You really need good eyes to see the difference," Cantin told the site, adding that the Xbox One game "is as good as the PS4 version." He went on to say that the resolution difference was interesting more as an engineering issue than a gameplay issue, and he noted that both the PS4 and Xbox One are vast improvements over the editions on older systems.

You know there are going to be people saying things like they can tell the difference between crème, beige, bone-white, and eggshell.

But really, these two consoles are not even one year old, so there are going to be optimization issues until the developers get used to the new hardware, even though they are made of PC-like components.

"You need really good eyes" to see the difference between 1080p and 900p? I guess my friend, who has glasses, has "really good eyes", along with every other gamer I know, for that matter.

Bullshit. There is actual science here that demonstrates that you're largely full of it. Go to the part about Human Visual System Limitation. Yes, there are definitely screen sizes and viewing distances where the difference between 1080p and 900p can be noticeable, but unless you have precisely set up your gaming seat to sit in that range, you won't be able to tell the difference. Why? Because the physiology of your eye is the limiting factor.

That said, it's disappointing the XB1 isn't getting 1080p, but all the other games on mine look like a big step up from last-gen, and so far that's been plenty.

I agree with you that to most this difference won't be noticeable in practice. But that's not really the important thing. To the "core" audience that both consoles appeal to, what matters is the perception. Microsoft had already dug itself into a hole with marketing missteps, and now as it becomes obvious there is a real and persistent performance gap, it will have a hard time digging itself out.

Xbox is selling well early based on brand loyalty and pent up demand, but future sales growth depends on appealing beyond the die-hard fans. Titanfall notwithstanding (which it appears they are very, very lucky to have), Microsoft could lose the segment of the "core" audience that it had captured away from Sony in the last generation. Who wants a "next-generation" game console that can't even do full HD games?

There's been a lot of negativity surrounding this game, but I'm pretty excited for it. I love stealth games; the more, the merrier. And there hasn't been a good light-and-shadow stealth game in a very long time.

Am I the only one who doesn't give a damn what resolution a game runs at, as long as it looks good on the TV and doesn't suffer from performance problems? Art direction, art assets, etc. have a much bigger influence on 'graphics' than ~180 lines of resolution.

Oh, and gameplay. That matters just a tiny bit too.

Based on these articles and comments, it seems like I could publish a TURD of a game and sell it to gamers as long as it boasted 1080p and 120 FPS.

There's been a lot of negativity surrounding this game, but I'm pretty excited for it. I love stealth games; the more, the merrier. And there hasn't been a good light-and-shadow stealth game in a very long time.

And I will, of course, be picking this up for the PS4.

Wikipedia says this game is coming out for the PC, so I'll probably actually end up picking it up as well. I loved Deus Ex and people kept telling me to get the Thief series but I never bothered. So I feel obligated to try out this one.

... once Steam has a flash sale for it because I'm just so spoiled now.

Well, the important thing is PS4 owners get to play "Downvote the XB1 explanations." Makes a change from ResoGun, I suppose.

I'm not a PS4 owner; I'm exclusively a PC gamer these days. But the "explanations" from XB1 fans seem to be more like rationalizations. The PS4 is simply a more powerful system, period. It isn't like in generations past, where the systems all used different architectures and it was thus hard to compare; the XB1 and the PS4 have very similar architectures, and in fact the architecture is pretty similar to an AMD-based PC. Both systems are using CPUs and GPUs from the same family, but the PS4 has a faster CPU and more cores on the GPU. That the PS4 is more powerful should surprise absolutely nobody, and no amount of optimization for the XB1's memory architecture is going to change that.

Both Sony and Nintendo learned the lesson of the Wii: you don't need to spend billions on hardware development to sell a console.

Yeah, well, Nintendo is learning that you should spend a little more than they did, and that "U" isn't a proper designation for a sequel/follow-up console. If it had been called Wii 2, I'm sure sales would have been 2-3x what they are now.

"You need really good eyes" to see the difference between 1080p and 900p? I guess my friend, who has glasses, has "really good eyes", along with every other gamer I know, for that matter.

Bullshit. There is actual science here that demonstrates that you're largely full of it. Go to the part about Human Visual System Limitation. Yes, there are definitely screen sizes and viewing distances where the difference between 1080p and 900p can be noticeable, but unless you have precisely set up your gaming seat to sit in that range, you won't be able to tell the difference. Why? Because the physiology of your eye is the limiting factor.

That said, it's disappointing the XB1 isn't getting 1080p, but all the other games on mine look like a big step up from last-gen, and so far that's been plenty.

Two reasons why this is bunk:

1. Visual hyperacuity is a thing, and we are still not even close to the point where we have hit the limit of our perceptual abilities. Angular resolution is one thing, but that does not account for the ability of the human brain to discern even minute changes in motion, where our ability to discern detail is remarkably sensitive (likely due to how much it aids in survival, naturally).

2. We're not talking about video, we're talking about video games. Unlike video, where the source is sampled either from reality or from offline-rendered CG with dozens, thousands, or millions of samples per pixel, video games often take one and only one sample per pixel and thus suffer from huge aliasing problems, including moiré patterns, stair-stepped edges, shimmering, and even outright missing details. Where 1080p is a very nice resolution when it comes to camera-sourced video, 1080p is awful when you're rendering out computer graphics, unless you take at least a few more samples from every pixel (and hardly any console games do this anymore, favoring extra graphical features instead of better image quality). Again, taking offline-rendered CG as an example, most CG movies will take a huge number of samples for every pixel. Cars, as one example, renders out at a resolution of 2048x1536, and each of those pixels is sampled over 30 times (since it's ray-traced rather than rasterized, it's hard to give an exact number, but that should give you a general idea). This application of sheer brute force in the rendering eliminates all the aliasing problems that still plague even the highest-resolution console games today. In the absence of powerful anti-aliasing methods like TXAA or (the god of all anti-aliasing methods) SGSSAA, the only other solution is to simply raise the resolution until those artifacts are gone, well past the point of where our visual acuity can discern changes in the pixel grid.

When talking about resolution it is extremely important that we acknowledge the differences between normal video and rasterized, real-time computer games. The two are simply too different in actual practice.
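To put the sampling point above in rough numbers (the film figures are the ballpark ones quoted in the comment, not exact):

```python
# Samples-per-frame comparison: a typical console game vs. offline-rendered film CG.
game_samples = 1920 * 1080 * 1    # one shading sample per pixel, no supersampling
film_samples = 2048 * 1536 * 30   # ~30+ samples per pixel, per the Cars ballpark above

print(f"Console game frame: {game_samples:>12,} samples")
print(f"Offline CG frame:   {film_samples:>12,} samples")
print(f"That's roughly {film_samples / game_samples:.0f}x more samples per frame,")
print("which is why offline CG at a similar resolution shows none of the aliasing games do.")
```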

Kyle Orland / Kyle is the Senior Gaming Editor at Ars Technica, specializing in video game hardware and software. He has journalism and computer science degrees from the University of Maryland. He is based in the Washington, DC area.