A certain corner of the game-focused Internet has been busy counting pixels this week, scrutinizing statements, screenshots, and videos for evidence that either the PlayStation 4 or Xbox One is providing a clearly superior graphical experience at launch. After examining all the available evidence, it seems clear that the PlayStation 4 versions of launch games like Battlefield 4 and Call of Duty: Ghosts enjoy a slight graphical advantage over their Xbox One cousins. It also seems equally clear, to me, that the difference just isn't that big a deal—unless you plan on playing games while looking through a magnifying glass.

The brouhaha really got going on Tuesday, when Digital Foundry posted an analysis of the footage it captured from the PC, PS4, and Xbox One versions of Battlefield 4 during a recent review event. Its pixel-counting analysis determined that the Xbox One version was running at 1280×720, compared to 1600×900 for the PS4 version, both at 60 frames per second. While the versions tested weren't final release candidates, the resolutions are likely to hold in the shipping games despite an earlier promise by DICE to target "equal performance" on both consoles.

(Members of the PC master race will be happy to know that the Windows version of the game ran at 1920×1080 resolution on "Ultra" settings, besting both consoles handily. The console versions were most comparable to the PC game running at the PC's "High" graphics quality, Digital Foundry said.)

The next volley in the resolution wars came yesterday, when Infinity Ward's Mark Rubin confirmed over Twitter that the Xbox One version of Call of Duty: Ghosts will be running at "1080p upscaled from 720p," compared to "native 1080p" on the PlayStation 4. These tidbits seem to have a small but loud corner of the Internet convinced that Microsoft's system is overpriced and underpowered, incapable of keeping up with the PlayStation 4.

The reports certainly sound like a big deal, with Sony's system pushing roughly 56 percent more native pixels than Microsoft's in Battlefield 4 and 125 percent more in Call of Duty: Ghosts. Try as I might, though, I can't get too worked up over what seems like an incredibly minor difference in practical graphical output.

Before we continue, you should probably watch the direct Battlefield 4 comparison video Digital Foundry posted (embedded above). Make sure you push it to full screen and full resolution to get an accurate comparison. It should be clear that you can make out differences in the two images if you're looking for them, especially if you get up close to your monitor and focus on the edges of certain objects.

Examining the video a foot or two away from a PC monitor doesn't really mimic the way console gamers play games, though. For that, you're going to have to back up from your monitor at least a couple of long paces. Watch the video again from this farther vantage point. Can you still make out the differences? Even if you can, are they as significant?

Whether a gain in output resolution is noticeable to the human eye depends on three things: the pixel count, the screen size, and, crucially, the distance from the screen. The value of an increase in raw pixels goes down as the screen size gets smaller and as you get farther from the display.

Digital Trends has calculated the distances and screen sizes where various resolutions actually matter. If your living room TV is 10 feet away from your seat, you need a TV a bit larger than 50 inches to notice the difference between 720p and 1080p. If you're 12 feet away, you need a screen larger than 60 inches.
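If you want to check those figures yourself, here's a rough back-of-the-envelope sketch of my own (not Digital Trends' exact methodology), assuming the standard 20/20-vision figure of about one arcminute per pixel and a 16:9 panel:

```python
import math

def max_useful_distance_ft(diagonal_in, vertical_pixels, arcmin_per_pixel=1.0):
    """Farthest distance (in feet) at which individual pixels are still
    resolvable for 20/20 vision (~1 arcminute per pixel) on a 16:9 screen.
    Beyond this distance, extra resolution is invisible."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # panel height
    pixel_in = height_in / vertical_pixels           # one pixel's height
    theta = math.radians(arcmin_per_pixel / 60)      # visual acuity limit
    return pixel_in / math.tan(theta) / 12           # inches -> feet

for size in (50, 60):
    print(f'{size}": 720p saturates the eye past '
          f"~{max_useful_distance_ft(size, 720):.1f} ft")
# 50" -> ~9.8 ft, 60" -> ~11.7 ft, in line with the distances above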

Unless you sit really close to your really large TV, the difference between 720p and 1080p just isn't that noticeable.

Viewing distances aside, we're reaching a point of somewhat diminishing returns when it comes to improving a gaming image just by throwing more pixels at it. Back in the '80s, the jump in resolution between the Atari 2600 and the NES was about the same pure pixel ratio as the jump from 720p to 1080p, but it provided a much more noticeable effect on image quality (even if you discount the NES' wider simultaneous color palette and larger character sprites). The jump from 720p to 1080p is much less noticeable, even up close, than the jump from 480p to 720p that made Wii games look like muddy, washed-out relics compared to their Xbox 360/PS3 brethren.
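A quick sanity check on that ratio claim, using the commonly cited display resolutions for each machine (the Atari 2600 figure in particular is a simplification of some very odd display hardware):

```python
# Commonly cited resolutions for each machine
atari_2600 = 160 * 192    #  30,720 pixels
nes        = 256 * 240    #  61,440 pixels
hd_720p    = 1280 * 720   # 921,600 pixels
hd_1080p   = 1920 * 1080  # 2,073,600 pixels

print(f"2600 -> NES:   {nes / atari_2600:.2f}x the pixels")   # 2.00x
print(f"720p -> 1080p: {hd_1080p / hd_720p:.2f}x the pixels") # 2.25x
```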

It's hard to look at the Xbox One's technically "inferior" 720p output with the same kind of practical concern as those inter-console resolution comparisons of the past. Resolution aside, the games look practically identical, with similar textures, apparent polygon counts, frame rates, and particle effects (like smoke). The small aliasing difference due to the resolution pales in comparison to the similarities in the overall look and feel of both versions.

That's not to say the differences aren't there; it's just that they're not all that significant to my eye. As Digital Foundry put it in its analysis, "the Microsoft console manages to hold up despite the undeniable, quantifiably worse metrics in terms of both resolution and frame rate."

You should be wary of reading too many long-term performance concerns into any resolution differences in launch titles, as well. The evolution of graphical quality on a system has at least as much to do with optimization and the allocation of developer resources as it does with raw hardware specs (where the Xbox One and PS4 are quite similar). Furthermore, the look and feel of a game rests more heavily on art direction and craft than on raw numbers anyway. Games like Super Mario 64 and Shadow of the Colossus still hold up to this day, despite being in standard definition, thanks to the strong aesthetic sense behind them.

None of this is to ignore the actual differences in resolution between the PS4 and Xbox One versions of at least a couple high-profile, multiplatform launch games. If you're the kind of person who isn't happy unless his gaming rig generates the highest raw benchmark numbers, the PS4 seems to be your console of choice for the time being (though, really, a high-end PC still wins out on this score). If you're the kind of person who values actual gameplay, though, choose your next console based on the games. You can feel secure in the knowledge that, graphically, there doesn't seem to be much practical, noticeable difference in performance.

Promoted Comments

The bottom line is that the resolution of each game is largely an artistic decision made by the developer, not a limitation of the hardware.

On PC, you have your graphics sliders and resolution setting. As a PC player, you decide which combination of resolution and graphics settings gives you the best visuals and framerates.

On console it's the same, except it's the developer tweaking the settings, not the player.

So if it's 720p, it's because the developer decided that resolution netted the best combination of visuals and framerate for that hardware. They could run BF4 on Xbox One at 900p or even 1080p if they wanted, but they'd have to "move some of the graphics sliders left a bit". Or have a reduced framerate.
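To see why in rough numbers (a hypothetical sketch, not DICE's actual render budget): at a fixed frame rate, the shading work available per pixel shrinks in direct proportion to the pixel count, so every step up in resolution means some "slider" has to come down somewhere.

```python
# Hypothetical illustration: at a locked 60 fps, the per-pixel effects
# budget scales inversely with how many pixels you render each frame.
FRAME_MS = 1000 / 60  # ~16.7 ms of GPU time per frame at 60 fps

resolutions = {"720p": 1280 * 720, "900p": 1600 * 900, "1080p": 1920 * 1080}
base = resolutions["720p"]

for name, pixels in resolutions.items():
    print(f"{name}: {pixels / base:.2f}x the pixels -> "
          f"{base / pixels:.2f}x the per-pixel budget in {FRAME_MS:.1f} ms")
```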

We know the PS4's GPU is better than the Xbox One's. But the Xbox One has a better video scaler, so it does a better job upscaling lower-resolution rendering. So developers can get away with running the game at a lower resolution and turning the graphics sliders up a little bit.

Kyle, you seem to be someone who usually goes out of his way to get the details right and makes sure to point out flaws that other people gloss over, so I'm not sure where this article came from. You seem to be trying really hard to ignore a pretty large performance difference between the two consoles.

You say that this being an early release means the importance isn't that great since things will be better optimized later on, but I don't get that logic. With both consoles having extremely similar hardware, I think it's safe to assume that both consoles will have similar optimization headroom later on in their life. If that's the case, the PS4 would still be getting much better performance in the late stages of the console's life.

Also, the whole "720p to 1080p only makes a big difference if you are really close" argument is just complete rubbish. A LOT of people sit within five feet of their TVs, especially while gaming. Lower resolutions can really take you out of the game, especially in slower paced areas when you have time to really look at things.

I'm not trying to ignore it... I point out multiple times in this article that it is real and is there. I'm just stating my personal opinion that, practically, it does not make a huge difference in my subjective appreciation of the two scenes, based on what I have seen.

Whether both consoles get "similar optimization" depends largely on which console is more popular, I think. The Xbox 360 got a lot more attention from a lot of developers because of its sales lead in the West, and therefore was able to stretch a lot farther on objectively worse hardware for quite a while when compared to the PS3 (Sony's hardware advantage has arguably won out in the highest-end late-generation games, though).

Also, I consider 5 feet to be "really close" for gaming. A quick, informal Twitter survey I just did found almost everyone who answered was 8-10 feet or more from their TVs when playing. Not at all scientific or anything, but it suggests 5 feet isn't the norm. In any case, the fact that some people play close doesn't mean the "720p to 1080p only makes a big difference if you are really close" argument is invalid, just that it doesn't apply to those people.

25 years later, and people are still arguing which console has more sprites, more colors and more parallax scrolling. Except now it requires detailed side-by-side comparisons, professional video captures and the scrutinizing of individual pixels on a 27-inch monitor.

I almost miss the days when the technical differences actually resulted in gameplay and playability differences (SF2, anyone?). At least then, there was a point to that discussion.

At any rate, this article strikes me as terribly disingenuous and lacking on several key points.

What's good for video resolution is NOT necessarily good for video game resolution. It really, really bothers me that that chart keeps getting passed around willy-nilly with minimal understanding of the context behind it. Yes, you can't discern individual pixels once they go beneath your angular resolution, but that's not the only problem you have to deal with in video games. You also have to deal with aliasing, especially temporal aliasing, which is a HUGE problem in games (to the point where Nvidia created their proprietary TXAA solution to completely eliminate temporal aliasing, albeit at the cost of a blurrier overall image).

Even at resolutions far in excess of the point at which you can no longer discern individual pixels, temporal aliasing can remain very, very obvious. The only way to get around this is with better and more sampling. The whole reason CG movies can look so clean is because they have literally limitless amounts of processing time to make dozens, hundreds, thousands, or even millions of samples for every pixel. Imagine 100x anti-aliasing in your video games! We're not even close to that point yet, and if trends continue, we likely never will reach that point as devs continue to decide to use power for rendering more effects instead of increasing image quality.
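Here's a toy one-dimensional version of that idea (a sketch of my own, not anything from the article): render a pattern finer than a pixel with one sample per pixel and you get aliased garbage; average many sub-samples per pixel, which is what film renderers do at enormous cost, and it settles to a clean, stable value.

```python
import numpy as np

def shade(x):
    # A "scene" with stripes much finer than a pixel
    return (np.sin(400 * x) > 0).astype(float)

pixels = 32
centers = (np.arange(pixels) + 0.5) / pixels
aliased = shade(centers)          # 1 sample/pixel: essentially random 0s/1s

sub = 64                          # 64 sub-samples per pixel (supersampling)
grid = (np.arange(pixels * sub) + 0.5) / (pixels * sub)
antialiased = shade(grid).reshape(pixels, sub).mean(axis=1)  # ~0.5 gray

print(aliased[:8])                # jagged, unstable under motion
print(antialiased[:8].round(2))   # smooth, stable values near 0.5
```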

On a related (though off-topic) note, this is similar to why motion blur, particularly per-object motion blur, is necessary in video games, especially video games with a less-than-stellar frame rate. Temporal resolution is as much a thing as spatial resolution, and even 60 FPS isn't cutting it for really fast (screen-clearing) movements. There are many games where you will have a character leap from one frame to the next with nothing in between; that's all lost visual information, and it's another thing that differentiates video from video games. There are good reasons why so many games use trails to denote character movements (in the absence of good per-object motion blur). I'm honestly shocked and appalled that so many people find 30 FPS perfectly acceptable in this day and age, but then again, many of these people are in the "blissfully ignorant" category, so perhaps once we finally improve our standards, they, too, will understand the glory of gaming at 60 FPS with proper per-object motion blur and strong anti-aliasing.

I'd also point to Kyle's own commentary about the importance of resolution to the Oculus Rift.

I guess he doesn't expect we'll have VR on consoles?

We will, and you'll want all the pixels you can get then, I can assure you.

It's technically embarrassing to advise people not to care about this stuff when shelling out for systems that are to last 6+ years and that will see all kinds of new tech and demands in their lifetime.

Then you've missed my point, which is that the resolution a game runs at is not decided by the limitations of the hardware.

Then your point is wrong, because resolution *is* determined by the limitations of the hardware. You think DICE wouldn't run BF4 on both consoles at 1080p with all the bells and whistles if they could? If it's an "artistic decision," then why aren't they running at the same resolution on both consoles? The simple fact is, they chose to run it at as high a resolution as they could afford on both consoles, because the higher the resolution, the better the game looks.

If they wanted to run it at the highest possible resolution, they could do that easily. It's just a switch in a config file. Of course, by doing so they sacrifice some frame rate or some visual effects. They made the artistic choice to sacrifice resolution in order to keep a high level of effects and frame rate.

That situation and that choice exists no matter how powerful the hardware is.

Both consoles can do 1080p easily. DICE specifically chose not to do it. That's my point.

I don't really give a shit about these minor differences. If you actually have to sit there and look for a difference, then who cares... I was going to buy a PS4 first, but I changed my mind because it doesn't read CDs or play MP3 music, and the launch games suck compared to the Xbox One's... So you can sit there all day looking at minor differences or play the damn games. For gaming at launch, the Xbox One wins hands down. I love Sony and have been with them for years, but only hardcore fanboys would disagree. So in my eyes the Xbox One is going to be the better system at launch anyway.

That situation and that choice exists no matter how powerful the hardware is.

And if that choice was "artistic," then it would be justified by them running the same resolution on both consoles. They're not. They're running at a higher resolution with the same effects on the PS4 because it looks better, and they can, because the PS4 hardware has more GPU horsepower and isn't as limiting to the developers' desires.

Sacrificing resolution to cope with a technical limitation while simultaneously removing dynamic range and effects in a game which is supposed to be photorealistic is still a technical, not an artistic manoeuvre, and it sounds as though you are trying to confuse the issue by continuing to use the word "artistic" in the wrong context.

A PC video card from a year or two ago can comfortably beat them, it seems. I get that the consoles are cheap, but you do pay a premium on games compared to the PC versions.

How much did you pay for that video card from a year or two ago? How much did you pay for the rest of the machine to run it? How much do you pay for the case to make it look acceptable in your living room?

FWIW, there are a lot of unknowns still. The hardware between the two units seems fairly close, with the advantage always edging in the PS4's favor. Is this because the Xbox One tools are just that far behind? Is this because the Xbox One OS and other features are taking up more resources than on the PS4? We just don't know, and perhaps the performance gap can be at least partly narrowed by MS as things move forward.

It's because the PS4 has a distinctly more powerful GPU. That is a known fact. Microsoft had to stick that 32MB of ESRAM on the main SoC to make up for going with DDR3 instead of GDDR5 for main memory. They had less die space for the GPU. Sony had more. Hence developers can run their games with an equivalent amount of effects at a higher resolution on the PS4. It's not going to change.
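For what it's worth, the memory gap the commenter describes is easy to put numbers on, using the widely reported launch specs for both consoles (effective transfer rate times bus width):

```python
# Main-memory bandwidth = effective transfer rate (GT/s) x bus width (bytes).
# Figures are the widely reported launch specs for each console.
def bandwidth_gbps(transfer_gt_s, bus_bits):
    return transfer_gt_s * (bus_bits / 8)

xbox_one_ddr3 = bandwidth_gbps(2.133, 256)  # ~68.3 GB/s
ps4_gddr5     = bandwidth_gbps(5.5, 256)    # ~176 GB/s

print(f"Xbox One DDR3: {xbox_one_ddr3:.1f} GB/s")
print(f"PS4 GDDR5:     {ps4_gddr5:.1f} GB/s")
# The Xbox One's 32MB of ESRAM adds a much faster but tiny scratchpad
# that developers must manage explicitly to close part of that gap.
```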

Honestly, sometimes I wish developers would simply expose the graphics sliders and resolution options in the console versions of their games. That way, everyone could pick the balance that they prefer, and people wouldn't have reason to complain about things like "1080p" and "60fps", etc.

And if that choice was "artistic" then it would be justified by them running the same resolution on both consoles...

Not at all. Resolution is just one more dial they can tweak to get the best result. For one platform, it could be higher, for another, lower.

And the best result is, get this, "limited" by the hardware.

No. It isn't. The hardware is what it is. Xbox One could be ten times as powerful; there's still going to be a tradeoff between resolution and effects quality, and the developers could still choose to sacrifice resolution to get better effects. Ditto for PS4 or PC.

If it was 10 times as powerful, then it would have commensurately higher limits, and if they chose to run at anything higher than 720p, then we cannot argue that running at that resolution is an "artistic choice"; rather, it's a choice forced upon them by trying to make the game they want within the limits of the hardware they have to work with.

It's artistic because they could easily choose to sacrifice effects instead. Shorter draw distance, fewer shader passes, fewer lights, coarser surface maps, fewer polygons. It's a balance, and it is very much an artistic decision as to what effects to keep and what to remove. Or whether to keep everything and have it run at 25fps. Or whether to drop down to 1440p instead of 900p or 720p.

It is absolutely an artistic decision. The hardware will do whatever the developers tell it to do.

Everything is an artistic choice. If the hardware were ten times more powerful, it would still be their choice whether to go with 1080p or stay at 720p and just pack in a million more lights and hundreds more shader passes or something.

So what you are saying is that because the Xbox One is unable to handle both high resolution and effects, they made an artistic decision to reduce the resolution. While at the same time, the PS4 was capable of handling both high resolution and effects, so they made an artistic decision to keep the high resolution.

I'm afraid it's become obvious that, in terms of having pointless arguments on the internet, Mr Dunn is a couple of Alans short of a Turing.

I'm saying that if they thought resolution was that important, and they had to hit 1080p no matter what, they could have done that, no problem. The reason it doesn't run at 1080p on either console is because they felt resolution wasn't that important for their game.

There are no technical choices with technical things, only artistic choices!

Yes. Obviously there are business and technical considerations that go into making those choices. But games are works of art, like any other. The medium you release them on is as much an artistic decision as anything else.

And why do you think they decided to keep 1080p on the PS4?

Uh, BF4 runs 900p on PS4.

But the answer to your inferred question is because they wanted to. You said it yourself: they decided.

Excellent. So they decided the Xbox One is going to get 720p while the PS4 is going to get 900p, for no other reason than that they felt like it?

Yes. Obviously there are business and technical considerations that go into making those choices. But games are works of art, like any other. The medium you release them on is as much an artistic decision as anything else.

This is the most ridiculous statement I've seen in several days, and I frequent NeoGAF. Your madness is truly inspiring.

Certainly. They felt it would best deliver the results they were looking for.

There is an update to the chart posted in original articles which does cover your setup:

Gee, I think I am actually poor. Nice to have such equipment.

But I wonder, if I read the charts correctly, whether the resolutions are irrelevant if you sit far away. Is that correct? It makes me wonder whether, when I go to the movies and watch something on a big screen, I actually see a high resolution or something more standard (3D aside).

I only have a 720p TV, and I don't play regularly on the TV anymore, but even watching a movie sitting close you will see distortions from either the TV or the video. Not sure which one, really. This is from a Blu-ray movie and not streaming, to be precise.

Plenty of developers out there have said words to the effect of "I have wanted to make this game for a long time, but I couldn't until console x arrived." They chose not to make their game because the hardware wasn't ready or wasn't capable (eg. Dance Central). That's an artistic choice. Or they choose to release on PC because they have a mouse and keyboard, instead of releasing on console (eg. Diablo II). Artistic choice. Or they choose to run at 720p instead of 1080p because it makes their game look better (eg. Battlefield 4). Artistic choice.

"While the PS4 appears to be running more games at a higher resolution than the XboxOne, both systems are displaying their games solidly within the HD spec. Whether or not you can see this difference in resolution will depend on the size of your TV and your seating distance. Naturally, if your TV is 720p already, this is a moot point for you. Otherwise, check out this handy chart to learn what research has shown to be the optimal seating distance in relation to screen size and resolution."

(removed vaporware points like "they might optimize it better in the future")

They chose not to make their game because the hardware wasn't ready or wasn't capable (eg. Dance Central). That's an artistic choice. Or they choose to release on PC because they have a mouse and keyboard, instead of releasing on console (eg. Diablo II). Artistic choice. Or they choose to run at 720p instead of 1080p because it makes their game look better (eg. Battlefield 4). Artistic choice.

You do realize that your first two examples are THE VERY DEFINITION OF A TECHNICAL CHOICE, right?

I completely agree with your assessment of the resolutions, except in one case: the HUD overlay.

A fully rendered scene will look essentially the same, due to scaling and skilled anti-aliasing. The HUD overlay, though, tends to be a fixed number of pixels and does not scale.

A 72-pixel-square hotkey/button/status icon takes up 10 percent of a 720p screen's height. It takes only about 7 percent at 1080p. This gives the developer the option to make the icon artwork bigger (108px square) to provide more detail in the design, or to leave it alone, giving more free space that the HUD doesn't interfere with.

See for yourselves... Fire up any contemporary MMO that supports multiple hot bars, and see how much of the beautifully rendered scene changes with three bottom bars and two right bars at 720p versus 1080p.
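The arithmetic above, as a quick sketch (the 72px icon is just this comment's example):

```python
# Share of vertical screen space taken by a fixed-size HUD icon.
ICON_PX = 72  # the example icon size from the comment above

for name, height_px in (("720p", 720), ("1080p", 1080)):
    print(f"{name}: icon is {ICON_PX / height_px:.1%} of screen height")
# 720p: 10.0%, 1080p: ~6.7% -- so at 1080p the artist can either rescale
# the artwork (72 * 1080/720 = 108 px) or keep the extra scene visible.
```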

The distortions could be bad video encoding (not as uncommon as you might hope). Poor (or overly aggressive) video compressors tend to do badly on low-contrast scenes. In the context of movies, this usually means dark scenes with motion (like panning the view across a dark swamp). When there are a thousand shades of very-close-together dark gray, the compressor breaks them up into something like four shades of dark gray that try to approximate the finer gradient they replace. That usually ends up looking blocky and squared-off.
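A toy illustration of that banding effect (just a sketch of the quantization idea, not a real codec):

```python
import numpy as np

# A smooth, dark gradient: a thousand very-close-together shades of gray.
gradient = np.linspace(0.02, 0.10, 1000)

# An over-aggressive compressor effectively keeps only a few levels.
levels = 4
lo, hi = gradient.min(), gradient.max()
steps = np.round((gradient - lo) / (hi - lo) * (levels - 1))
banded = lo + steps / (levels - 1) * (hi - lo)

print(len(np.unique(gradient)), "shades in ->", len(np.unique(banded)), "out")
# 1000 shades in -> 4 out: the smooth ramp becomes visible bands/blocks.
```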

Alternatively, you could have errors, either in the video file or in the reading of the video file. Whenever a video decompressor can't read data, the errors usually manifest as little blocks of previous footage that get "stuck" (and look out of place) before being overwritten by subsequent, error-free video. Kind of like "sticky" computer static that lasts a second and then disappears. If you encounter too many data or read errors at once, the video will usually skip.