A certain corner of the game-focused Internet has been busy counting pixels this week, scrutinizing statements, screenshots, and videos for evidence that either the PlayStation 4 or Xbox One is providing a clearly superior graphical experience at launch. After examining all the available evidence, it seems clear that the PlayStation 4 versions of launch games like Battlefield 4 and Call of Duty: Ghosts enjoy a slight graphical advantage over their Xbox One cousins. It also seems equally clear, to me, that the difference just isn't that big a deal—unless you plan on playing games while looking through a magnifying glass.

The brouhaha really got going on Tuesday, when Digital Foundry posted an analysis of the footage it captured from the PC, PS4, and Xbox One versions of Battlefield 4 during a recent review event. Their capture setup determined that the Xbox One version was running at 1280×720, compared to a 1600×900 resolution for the PS4 version, both at 60 frames per second. While these weren't the final release candidate versions of the game being tested, the resolutions are likely to be consistent in the final games despite an earlier promise by DICE to target "equal performance" on both consoles.

(Members of the PC master race will be happy to know that the Windows version of the game ran at 1920×1080 resolution on "Ultra" settings, besting both consoles handily. The console versions were most comparable to the PC game running at the PC's "High" graphics quality, Digital Foundry said.)

The next volley in the resolution wars came yesterday, when Infinity Ward's Mark Rubin confirmed over Twitter that the Xbox One version of Call of Duty: Ghosts will be running at "1080p upscaled from 720p," compared to a "native 1080p" resolution of the PlayStation 4. These tidbits seem to have a small but loud corner of the Internet convinced that Microsoft's system is overpriced and underpowered, incapable of keeping up with the PlayStation 4.

The reports certainly sound like a big deal, with Sony's system pushing 50 to 100+ percent more native pixels than Microsoft's on identical launch games. Try as I might, though, I can't get too worked up over what seems like an incredibly minor difference in practical graphical output.
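The "50 to 100+ percent" figure falls straight out of the raw pixel counts. Here's a quick Python sanity check using only the resolutions reported above:

```python
# Raw pixel counts for the launch resolutions reported above
xbox_one = 1280 * 720     # 921,600 pixels (BF4 and Ghosts on Xbox One)
ps4_bf4 = 1600 * 900      # 1,440,000 pixels (BF4 on PS4)
ps4_ghosts = 1920 * 1080  # 2,073,600 pixels (native 1080p Ghosts on PS4)

print(f"BF4: PS4 pushes {ps4_bf4 / xbox_one - 1:.0%} more pixels")      # 56%
print(f"Ghosts: PS4 pushes {ps4_ghosts / xbox_one - 1:.0%} more pixels") # 125%
```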

Before we continue, you should probably watch the direct Battlefield 4 comparison video Digital Foundry posted (embedded above). Make sure you push it to full screen and full resolution to get an accurate comparison. It should be clear that you can make out differences in the two images if you're looking for them, especially if you get up close to your monitor and focus on the edges of certain objects.

Examining the video a foot or two away from a PC monitor doesn't really mimic the way console gamers play games, though. For that, you're going to have to back up from your monitor at least a couple of long paces. Watch the video again from this farther vantage point. Can you still make out the differences? Even if you can, are they as significant?

Whether a gain in output resolution is noticeable to the human eye depends on three things: the pixel count, the screen size, and, crucially, the distance from the screen. The value of an increase in raw pixels goes down as the screen size gets smaller and as you get farther from the display.

Digital Trends has calculated the distances and screen sizes where various resolutions actually matter. If your living room TV is 10 feet away from your seat, you need a TV a bit larger than 50 inches to notice the difference between 720p and 1080p. If you're 12 feet away, you need a screen larger than 60 inches.
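The underlying geometry is easy to check yourself. Below is a rough Python sketch using the common one-arcminute visual-acuity rule of thumb; note that published charts, including the one Digital Trends used, bake in somewhat different acuity assumptions, so their exact cutoffs differ from this back-of-envelope version:

```python
import math

# ~1 arcminute: a common rule of thumb for the smallest angle
# a person with 20/20 vision can resolve
ARCMIN = math.radians(1 / 60)

def max_useful_distance(diagonal_in, vertical_pixels, aspect=(16, 9)):
    """Farthest viewing distance (inches) at which one pixel still subtends
    about an arcminute -- beyond this, extra pixels stop being resolvable."""
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)  # screen height from diagonal
    pixel_in = height_in / vertical_pixels          # physical size of one pixel
    return pixel_in / math.tan(ARCMIN)

for res in (720, 1080):
    d = max_useful_distance(50, res)
    print(f'50-inch {res}p: pixels resolvable out to ~{d / 12:.1f} feet')
```

Under this (stricter) assumption, a 50-inch set's 720p pixels stop being individually resolvable somewhere around 10 feet, and 1080p's around 6.5 feet; the qualitative point is the same as the chart's, even if the exact numbers move with the acuity assumption.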

Unless you sit really close to your really large TV, the difference between 720p and 1080p just isn't that noticeable.

Viewing distances aside, we're reaching a point of somewhat diminishing returns when it comes to improving a gaming image just by throwing more pixels at it. Back in the '80s, the jump in resolution between the Atari 2600 and the NES was about the same pure pixel ratio as the jump from 720p to 1080p, but it provided a much more noticeable effect on image quality (even if you discount the NES' wider simultaneous color palette and larger character sprites). The jump from 720p to 1080p is much less noticeable, even up close, than the jump from 480p to 720p that made Wii games look like muddy, washed-out relics compared to their Xbox 360/PS3 brethren.

It's hard to look at the Xbox One's technically "inferior" 720p output with the same kind of practical concern as those inter-console resolution comparisons of the past. Resolution aside, the games look practically identical, with similar textures, apparent polygon counts, frame rates, and particle effects (like smoke). The small aliasing difference due to the resolution pales in comparison to the similarities in the overall look and feel of both versions.

That's not to say the differences aren't there; it's just that they're not all that significant to my eye. As Digital Foundry put it in its analysis, "the Microsoft console manages to hold up despite the undeniable, quantifiably worse metrics in terms of both resolution and frame rate."

You should be wary of reading too many long-term performance concerns into any resolution differences in launch titles, as well. The evolution of graphical quality on a system has at least as much to do with optimization and allocation of developer resources as it does with raw hardware specs (where the Xbox One and PS4 are quite similar). Furthermore, the overall look and feel of a game depends more heavily on art direction and craft than on raw numbers anyway. Games like Super Mario 64 and Shadow of the Colossus still hold up to this day, despite being in standard definition, thanks to the strong aesthetic sense behind them.

None of this is to ignore the actual differences in resolution between the PS4 and Xbox One versions of at least a couple high-profile, multiplatform launch games. If you're the kind of person who isn't happy unless his gaming rig generates the highest raw benchmark numbers, the PS4 seems to be your console of choice for the time being (though, really, a high-end PC still wins out on this score). If you're the kind of person who values actual gameplay, though, choose your next console based on the games. You can feel secure in the knowledge that, graphically, there doesn't seem to be much practical, noticeable difference in performance.

Promoted Comments

The bottom line is that the resolution of each game is largely an artistic decision made by the developer, not a limitation of the hardware.

On PC, you have your graphics sliders and resolution setting. As a PC player, you decide which combination of resolution and graphics settings gives you the best visuals and framerates.

On console it's the same, except it's the developer tweaking the settings, not the player.

So if it's 720p, it's because the developer decided that resolution netted the best combination of visuals and framerate for that hardware. They could run BF4 on Xbox One at 900p or even 1080p if they wanted, but they'd have to "move some of the graphics sliders left a bit". Or have a reduced framerate.

We know the PS4's GPU is better than the Xbox One's. But the Xbox One has a better video scaler, so it does a better job upscaling lower-resolution rendering. That means developers can get away with running the game at a lower resolution and turning the graphics sliders up a little bit.

Kyle, you seem to be someone who usually goes out of his way to get the details right and makes sure to point out flaws that other people gloss over, so I'm not sure where this article came from. You seem to be trying really hard to ignore a pretty large performance difference between the two consoles.

You say that this being an early release means the importance isn't that great since things will be better optimized later on, but I don't get that logic. With both consoles having extremely similar hardware, I think it's safe to assume that both consoles will have similar optimization headroom later on in their life. If that's the case, the PS4 would still be getting much better performance in the late stages of the console's life.

Also, the whole "720p to 1080p only makes a big difference if you are really close" argument is just complete rubbish. A LOT of people sit within five feet of their TVs, especially while gaming. Lower resolutions can really take you out of the game, especially in slower paced areas when you have time to really look at things.

I'm not trying to ignore it... I point out multiple times in this article that it is real and is there. I'm just stating my personal opinion that, practically, it does not make a huge difference in my subjective appreciation of the two scenes, based on what I have seen.

Whether both consoles get "similar optimization" depends largely on which console is more popular, I think. The Xbox 360 got a lot more attention from a lot of developers because of its sales lead in the West, and it was therefore able to stretch a lot farther on objectively worse hardware for quite a while when compared to the PS3 (Sony's hardware advantage has arguably won out in the highest-end late-generation games, though).

Also, I consider 5 feet to be "really close" for gaming. A quick, informal Twitter survey I just did found almost everyone who answered was 8-10 feet or more from their TVs when playing. Not at all scientific or anything, but it suggests 5 feet isn't the norm. In any case, the fact that some people play close doesn't mean the "720p to 1080p only makes a big difference if you are really close" argument is invalid, just that it doesn't apply to those people.

25 years later, and people are still arguing which console has more sprites, more colors and more parallax scrolling. Except now it requires detailed side-by-side comparisons, professional video captures and the scrutinizing of individual pixels on a 27-inch monitor.

I almost miss the days when the technical differences actually resulted in gameplay and playability differences (SF2, anyone?). At least then, there was a point to that discussion.

353 Reader Comments

I feel like this is an oversight. I don't care a whole lot about the resolution itself - more what it implies. Microsoft talked up balance and whatnot, but the resolution and framerate differences make it very clear the hardware is significantly different. The resolution may be the only difference now, but later that may mean better effects, antialiasing, etc. at the same resolution (i.e., when both render 720p or 1080p in the same game, one will have much more power per pixel). Let alone first-party games.

Yes, optimizing around the eSRAM may be a factor, but the ROPs, TMUs, compute units all speak for themselves.

And oh, before anyone else is confused: that sharpness you see in the XBO version? That's actually from a reduced RGB range in the game, which will be fixed shortly. It may look punchier, but it's actually crushing blacks (dark grey becomes total black), so the extra pop is really just removing detail.

I'm curious about the power consumption of the two systems. I couldn't make out much of a difference (other than colors, which actually seemed more vivid on the XB frames), so if the PS4 is wasting a ton of juice to squeeze out those extra pixels, it might even be a detriment.

Not that it matters. My mind is already made up on this console generation.

I think it will basically come down to bragging rights. Does it sound like a dumb thing to brag about? Yes. But given the maturity of a few vocal players of certain FPS games, any point will suffice.

I suppose it's possible that a truly professional gamer may be able to see a teeny tiny difference? Or at least claim to?

Personally, I've been happy with where graphics and resolution are. If they get better, that's nice, but I'm not asking for it.

A sober perspective on console resolutions. As someone who primarily plays on the PC and very little on consoles, I don't have a dog in this fight, but I appreciate measured reasoning about experiences in an industry obsessed with benchmarks, metrics, and big numbers.

It may not matter much now, but there is no way having better graphics performance is a negative, and it's one more positive for the PS4 compared to Xbox One - as if it needed any more to convince gamers.

If the chart is accurate, then for a borderline case like a 50-inch TV at 9.9 feet, 1080p "may or may not" be noticeable.

But if you take the distribution into account, that actually means roughly 50 percent of people will notice a difference. For a case like a 50-inch TV at 11 feet, you cannot argue that 1080p is unnoticeable to everyone, even if it falls in the 720p zone according to the graph. There is no magic line that flips the switch for everyone: some people can notice a difference from farther away, and some cannot notice one no matter how close they sit.

A 50-inch TV 10-12 feet away is not an extraordinary setup. If you tell me I have a 30-50 percent chance of noticing a difference, I will definitely pick the one that can produce 1080p if they are otherwise equivalent (price, features, etc.). If you have two people who each have a 50 percent chance of noticing the difference, then there is a 75 percent chance that at least one of them will. For four people, that becomes 94 percent.

It's just like how a $100 difference probably means nothing to Bill Gates, but to many people out here it does mean something.

While I know that the graph was intended to address television viewing distances and screen sizes, I can only assume that it would apply equally well to understanding what is "worth it" when considering optimal PC monitor resolutions. And what that tells me is that Ultra HD would, in fact, make a significant difference for the PC gaming experience (contrary to what I've heard that the difference would not be noticeable).

"If you're the kind of person who values actual gameplay, though, choose your next console based on the games. You can feel secure in the knowledge that, graphically, there doesn't seem to be much practical, noticeable difference in performance"

If graphics were my main factor in picking a console, I would not have picked the PS1, PS2, Dreamcast, Wii, etc. What a great catalog of games I would have missed out on because of that. I believe that for avid gamers, graphics are not the sole reason; they're important, but they aren't the only deciding factor. Good games are, and both consoles will have great games. I like good graphics as much as any other gamer, but skipping a console just because it has a graphical disadvantage is really silly in this day and age, especially when indie games, PSN, and arcade games, which are not graphical powerhouses, clearly show that artistic direction can make good-looking games without raw power.

I'd also like to mention that cel-shaded games are the ones that have stood the test of time, better than the most impressive-looking games of each generation, so presentation and artistic design matter as much as raw power. P.S. I will pick both, alongside my Wii U.

The general impression I got from viewing the Xbox One vs PS4 specs is the PS4 is more straightforward, while the Xbox One requires more tailored tinkering to optimize it for that platform.

If that's the case, then it makes sense the launch window titles will favor the more straightforward console--I would expect the performance will prove to be more similar as developers get more familiar with how to eke out better performance from each box. I don't know if they'll gain perfect parity, but I think they'll get closer--and close enough is good enough for me!

Looking at the video it was pretty obvious that the PS4 version had some sort of extra filter or overlay that the XO didn't have. I can't tell if this effectively masked some of the digital artifacts that were already present (and visible on the XO), or if this represents a better computational ability.

It's not about the difference between 720p and 1080p, which doesn't matter. It's about the fact that if the PS4 can output the same quality as the Xbox One but at a higher resolution, imagine what it could do at the same resolution: more effects, higher detail, better models (more models before you have to cull), etc. There is no way you can spin this as anything other than a tangible benefit.

For me, having seen the videos and coming from a 360, I can't wait to get the Xbox One. These small differences aren't a big deal to my eyes. Plus, I really want to play Titanfall.

Personally, I'm more jazzed about the change in gameplay options for both consoles. Sure, the graphics are better, but a game like Battlefield 4 will now have feature parity with the PC. That's where the real fun of this next generation will come from.

I would go with a PC but it hurts like heck when I try to play a PC game. After working all day at a keyboard and mouse, my hands just need a change.

I'm also in the minority of core gamers that actually want the Kinect.

If it bothers you that much, then they're all straight lines. I'm sure you can extrapolate to 1000" screens if you wish.

Anyway, getting back to the plot, the hardware of the two machines is so damn similar that differences are going to be down to how much overhead is incurred in each system's APIs, and how well the developers have gotten to grips with programming against each API.

In the long term, it'll probably prove to be a wash unless one system's API is woefully inefficient.

For me, having seen the videos and coming from a 360, I can't wait to get the Xbox One. These small differences aren't a big deal to my eyes. Plus, I really want to play Titanfall.

Personally, I'm more jazzed about the change in gameplay options for both consoles. Sure, the graphics are better, but a game like Battlefield 4 will now have feature parity with the PC. That's where the real fun of this next generation will come from.

I would go with a PC but it hurts like heck when I try to play a PC game. After working all day at a keyboard and mouse, my hands just need a change.

I'm also in the minority of core gamers that actually want the Kinect.

Same here. Sure, I'm certain that there's some stuff the PS is better at than the XO, but in the end, I just don't care enough. And, I love my Kinect.

Oh, so since the Xbox 1 pushes significantly fewer pixels, I guess it must be a bit cheaper than the PS4 right?

...wait, what do you mean no??

It is actually pretty disappointing that these new consoles can't even render a game like BF4 at 1080p natively. I think the bigger story here, as has been mentioned before, is that the graphical capabilities of both consoles are lagging the historical trend. A more meaningful comparison would be console BF4 vs. PC BF4 (on a 50" TV, if you prefer). That would be a better test of whether graphics are hitting diminishing returns, or whether both the PS4 and Xbox One are underpowered (with the PS4 somewhat less so).

If the graphics REALLY don't matter then maybe Sony should have used cheaper hardware and hit a $250 price point or something.

As a PC gamer, this is largely irrelevant to me. That being said, I wonder if the superior technical performance of the PS4 may become more significant 4-5 years from now when, as in the final days of the last console generation, developers will have to squeeze every last possible bit of performance from the respective consoles for their games to appear graphically "modern."

Perhaps the latest Xbox will have a shorter "acceptable performance" lifespan as a result.


If anything, the Xbox One presentation overall was actually slightly sharper than the PS4 despite the reduction in pixel count. Microsoft claims their scaler on the One is better than that on the 360, and as someone who runs a 360 into a Sony Bravia at 720p because the (much newer) TV does a better job scaling to 1080p, I believe this.

However, there are more visible aliasing artifacts in the Xbox One presentation. This is disappointing, since I had hoped that AA would become a standard feature of next gen titles, instead of the decidedly hit or miss nature of the current generation. Hoping this (and the resolution limits) are just teething pains for a new system that may be a bit harder to develop for than the PS4.

Much more concerning to me are the visibly crushed blacks in the One. This could be due to capture issues, but I do see this from time to time today on the 360 (most heart-achingly, in Mass Effect 3, the presentation of which is almost unplayable on the Xbox). I never have this issue in PS3 games, and Sony generally just seems to get HDMI, which is not a surprise given their long history of making TVs and Blu-Ray players. Really hoping this is not a characteristic of the One itself.

We currently run both 360 and PS3, and we have two Xbox Ones on pre-order because for us, the integration of all the TV and entertainment stuff really is important. I still haven't decided on whether to get a PS4 yet, but certainly this stuff would not be the main driver, it would be games like Uncharted and the Last of Us, that must be played.

Personally, I'm more jazzed about the change in gameplay options for both consoles. Sure, the graphics are better, but a game like Battlefield 4 will now have feature parity with the PC. That's where the real fun of this next generation will come from.

Sure, but isn't it likely that if BF4 didn't have a console version and had remained PC only (like the older Battlefields) that it would have more features than it does now? It's quite likely simplified (or "dumbed down" if you prefer) just to "fit" on consoles, likely both for hardware considerations and gamer playstyles. I think you could say the same for BF3.

And what that tells me is that Ultra HD would, in fact, make a significant difference for the PC gaming experience (contrary to what I've heard that the difference would not be noticeable).

Not to derail the thread too hard, but you are correct. Check AnandTech for more information. They have several write-ups on 4K gaming (and the commensurate hardware required to make it anywhere near playable...brace yourself).

It may not matter much now, but there is no way having better graphics performance is a negative, and it's one more positive for the PS4 compared to Xbox One - as if it needed any more to convince gamers.

Higher resolutions require higher resolution textures which require a shit ton more work on the development side which means more expensive games, less low-budget titles, longer development times, and all manner of problems. On the customer side the system will almost certainly eat more power and run hotter, which has a (very) small possibility of raising the console failure rate (if you recall from the 360's RROD problems, heat and consoles don't get along well).

This has nothing to do with either console, honestly - I don't plan on getting either for at least a year after release and I am in fact leaning towards the PS4. But to say there's no possible drawbacks is just plain false.

I recently had an opportunity to trial a 4K TV, and I would have to disagree with the chart on that one. It seemed to me that at any distance, the image was so dense and so rich that I was honestly surprised. Perhaps some of the experience was due to the vivid color saturation the unit gave, but things looked so danged real, at any distance. I was watching 4K source material too, so I'm sure that was a big part of it. It made me want to lay out the cash for one, though, where I wouldn't have considered it before, because I had thought of it as a straight-up resolution increase. It's not.

Kyle Orland / Kyle is the Senior Gaming Editor at Ars Technica, specializing in video game hardware and software. He has journalism and computer science degrees from University of Maryland. He is based in Pittsburgh, PA.