Op-ed: Why I'm not too worked up about the next-gen console resolution wars

I'm curious about the power consumption of the two systems. I couldn't make out much of a difference (other than colors, which actually seemed more vivid on the XB frames), so if the PS4 is wasting a ton of juice to squeeze out those extra pixels, it might even be a detriment.

Not that it matters. My mind is already made up on this console generation.

The power differences will be negligible. Certainly not big enough to base a purchase on.

It may not matter much now, but there is no way having better graphics performance is a negative, and it's one more positive for the PS4 compared to Xbox One - as if it needed any more to convince gamers.

Can you elaborate a bit more? Your comment sounds a bit like fanboy spewage.

From a technical perspective this is all pretty neat. I think it's interesting that Sony has gone the straightforward-and-simple route this generation, whereas last gen the PS3 was complex and difficult to optimize for, and the Xbox was the simple one.

On the other hand, as a gamer... I want an Xbox. I like the controller, I like the titles, and it's what I'm used to. Is that not how most people are making their purchasing decision? It doesn't seem like the differences are drastic enough to warrant jumping ship.

Imagine a Skyrim remake. Same basic graphics, but when you walk into Whiterun the first time there's a couple hundred NPCs walking around. Imagine the tavern is so full at night you have to push your way through the crowd. Vast herds of mammoths. Attacking a fort with 50 Stormcloaks. Carts and wagons constantly traveling the trade routes. A detailed economy running in the background allowing your own trading activities.

They have the resolution they need. Start populating the game worlds more.

That is exactly the sort of thing you can do with Xbox One's cloud processing feature. Calculating behaviors of hundreds or thousands of individual objects driven by detailed AI or physics, deep economies, etc.

It would be interesting to see an article that investigates what a $300-$400 upgrade to the typical gaming PC would yield in terms of graphical performance.

In terms of hardware it seems like the new consoles aren't such a great deal. Of course, game selection is what ultimately matters most, though at least the big multiplatform games seem to be mostly available on PC as well.

The bottom line is that the resolution of each game is largely an artistic decision made by the developer, not a limitation of the hardware.

On PC, you have your graphics sliders and resolution setting. As a PC player, you decide which combination of resolution and graphics settings gives you the best visuals and framerates.

On console it's the same, except it's the developer tweaking the settings, not the player.

So if it's 720p, it's because the developer decided that resolution netted the best combination of visuals and framerate for that hardware. They could run BF4 on Xbox One at 900p or even 1080p if they wanted, but they'd have to "move some of the graphics sliders left a bit". Or have a reduced framerate.

We know the PS4's GPU is better than the Xbox One's GPU. But the Xbox One has a better video scaler, so it does a better job upscaling lower-resolution rendering. That means developers can get away with running the game at a lower resolution and turning the graphics sliders up a little bit.

Poor fill-rate is an "artistic" decision? Not even slightly. The "artistic" side is constrained by the fact that, if you're going to try to go photorealistic, you need to use even more smoke and even shinier mirrors. 3D graphics is the art of pretending to show more than you actually are.

Stockholm isn't just a city in Sweden...

The artistic decision is whether to go with 1080p or 900p or 720p. You know what the fill-rate is going in. You know what the capabilities of your engine are. The decision is how best to allocate that fill-rate to achieve the most eye-pleasing result. In some cases, higher resolution does not yield the best result.

Again, they could probably run BF4 on Xbox One at 1080p. It probably wouldn't even look that bad. But they decided they wanted more effects, so they turned the resolution down.
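For anyone who wants rough numbers on what "moving the sliders" buys you, here's a back-of-the-envelope sketch in Python. It assumes shading cost scales linearly with pixels rendered per second, which is only approximately true in a real engine:

# Rough pixel-count arithmetic behind the resolution/effects tradeoff.
# Assumes per-frame shading cost scales linearly with pixel count,
# which is only approximately true in a real engine.
RESOLUTIONS = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
FPS = 60
base = 1280 * 720

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name:>5}: {pixels:>9,} px/frame, "
          f"{pixels * FPS / 1e6:6.1f} Mpx/s at {FPS} fps, "
          f"{pixels / base:.2f}x the shading work of 720p")

1080p is 2.25x the pixels of 720p, so dropping to 720p frees up more than half of the per-pixel budget for effects, which is exactly the tradeoff being described.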

No matter how you spin it, dropping resolution and trying to smear it all together with DoF/AA/BLOOOOOOOM isn't an artistic decision, it's a technical compromise. If you were making the point that photorealism != artistry, then fine, but I'm not sure that's what you're saying. As I mentioned above, more power available = less effort spent having to fake it and more room for your original ideas.

Put it this way: it doesn't matter how powerful the box is. The console could be ten times more powerful than the fastest 3x GTX Titan/Core i7/32GB monster rig. Running the game at 720p would still allow for more effects than running at 1080p, and developers would still have to make compromises. Dropping to 720p is one such compromise a developer could make, and on a console where you are seated ten feet away, that would make sense.

On a PC where you are seated one foot away, high resolution is a lot more important than on a console where you are 10 feet away.

I feel like this is an oversight. I don't care a whole lot about the resolution itself, more what it implies. Microsoft talked up balance and whatnot, but the resolution and framerate differences make it very clear the hardware is significantly different. The resolution may be the only difference now, but later that may mean better effects, antialiasing, etc. at the same resolution (i.e., when both do 720p or 1080p in the same game, one will have much more power per pixel). Let alone first-party games.

Yes, optimizing around the eSRAM may be a factor, but the ROPs, TMUs, and compute units all speak for themselves (rough numbers in the sketch below).

Also, the whole "720p to 1080p only makes a big difference if you are really close" argument is just complete rubbish. A LOT of people sit within five feet of their TVs, especially while gaming. Lower resolutions can really take you out of the game, especially in slower paced areas when you have time to really look at things.
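For what it's worth, the spec gap those ROPs and compute units imply is easy to ballpark. A sketch using the widely reported figures (18 CUs at 800 MHz and 32 ROPs for the PS4; 12 CUs at 853 MHz and 16 ROPs for the Xbox One), assuming the standard 64 shaders per GCN compute unit and 2 FLOPs per shader per clock:

# Ballpark theoretical GPU throughput from the widely reported specs.
# Assumes 64 shaders per GCN compute unit and 2 FLOPs/shader/clock (FMA).
def gpu_ballpark(name, cus, clock_ghz, rops):
    shaders = cus * 64
    tflops = shaders * 2 * clock_ghz / 1000.0
    gpix_s = rops * clock_ghz  # peak pixel fill rate, Gpixels/s
    print(f"{name}: {shaders} shaders, {tflops:.2f} TFLOPS, {gpix_s:.1f} Gpix/s fill")

gpu_ballpark("PS4     ", cus=18, clock_ghz=0.800, rops=32)
gpu_ballpark("Xbox One", cus=12, clock_ghz=0.853, rops=16)

On paper that works out to roughly a 40% compute advantage and nearly double the fill rate for the PS4, before eSRAM optimization or anything else enters the picture.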

Yes, but also sometimes no.

That graph has always been flawed because it doesn't consider the input. The difference between 720p and 1080p video is largely unnoticeable on a 1080p display, but the difference in gaming at those resolutions is absolutely enormous. Video is antialiased by nature, so you'll never see much pixel crawling, while vector-based video games follow a completely different set of rules.

All those pixel-thin wires and structures we see in the comparison videos show this quite clearly. There are far more pixel gaps in the Xbox videos than the PS4 ones. If this were live-action TV then there would be no pixel gaps.
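You can reproduce those wire gaps with a toy rasterizer. A sketch (the wire geometry and resolution here are made up for illustration): point-sampling a line thinner than a pixel misses it in many columns, while supersampled coverage catches it everywhere.

import numpy as np

W, H = 320, 180  # a stand-in low render resolution

def rasterize(ss):
    # ss x ss subsamples per pixel; the "wire" is a half-pixel-thick
    # diagonal line, thinner than one output pixel.
    ys, xs = np.mgrid[0:H * ss, 0:W * ss].astype(float)
    xs = (xs + 0.5) / ss  # subsample centers in pixel coordinates
    ys = (ys + 0.5) / ss
    on_wire = np.abs(ys - (0.35 * xs + 10.0)) < 0.25
    return on_wire.reshape(H, ss, W, ss).mean(axis=(1, 3))  # per-pixel coverage

for ss in (1, 4):
    cov = rasterize(ss)
    # A column shows a visible gap if the wire crosses it but no pixel lit up.
    gaps = sum(1 for x in range(W) if not cov[:, x].any())
    print(f"{ss}x{ss} sampling: {gaps} gap columns out of {W}")

That's the whole difference: filmed video arrives pre-filtered, but a renderer only knows what its samples happen to hit.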

I agree with your sentiment, Kyle, but what the current generation has taught me is that I really hate the shimmering you get from aliasing and anti-aliasing artifacts, which are notably more noticeable on 720p images than on 1080p images. 1080p still suffers from aliasing, but anti-aliasing is just so much more effective at that resolution!

While I know that the graph was intended to address television viewing distances and screen sizes, I can only assume that it would apply equally well to understanding what is "worth it" when considering optimal PC monitor resolutions. And what that tells me is that Ultra HD would, in fact, make a significant difference for the PC gaming experience (contrary to what I've heard that the difference would not be noticeable).

I've dragged a similar graph into numerous debates about resolution, as many have argued that 4K is only important on a really huge screen. It's not just for PC gamers, but anyone who has a home theater setup that comes close to the THX recommended viewing distance (screen size in inches / 0.84). Of course, the charts are usually for average visual acuity (20/20), so the experience varies a bit between people.
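If anyone wants to run that rule of thumb themselves, it's a one-liner (using the THX formula exactly as stated above):

# THX recommended viewing distance: diagonal (inches) / 0.84,
# per the rule of thumb quoted above.
for diagonal in (40, 50, 60, 70):
    dist_in = diagonal / 0.84
    print(f'{diagonal}" screen: sit within about {dist_in / 12:.1f} ft ({dist_in:.0f}")')

A 50" screen works out to roughly five feet, which is a lot closer than most living-room setups.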

I wouldn't dismiss those who appreciate the difference, but it's worth acknowledging that many people just don't care. Many people sit 12 feet back from a 40" TV and are just fine.

It's also a launch game; maybe there's more room to optimize on the Xbox than on the PS4, and in a year they'll be churning out games at the same resolution. Or maybe optimization gains will be parallel and the PS4 will keep its edge.

The tenacious need to justify your purchase this go around has reached new heights. I've never seen so much vitriol over something that, let's be honest here, doesn't /really/ matter.

I think the big stories in all of this are:
• both consoles are woefully mediocre
• both launch lineups are surprisingly meh (save, for me, Dead Rising and the Panzer Dragoon one)
• yes, I have both on preorder from Amazon
• yes, I almost want to cancel both and get a GTX 780 & a bigger SSD for my gaming PC

Having all 3, I expect this first year will be mostly PC for multiplatform titles; in the short term I've essentially paid $560 to play Dead Rising 3, and $400 for another Netflix box.

I really wish people would stop bringing out that same chart every time the topic of display resolution comes up. It's wrong, plain and simple. It's taking the 20/20 visual acuity standard (one arcminute of resolution), and assuming no one has anything better than that. Personally, I'm 20/13 in my right eye, and 20/15 in my left. There are some people who are as low as 20/10.

The visual acuity standard is measured using printed monochrome text, which by the way is printed at much higher quality than the limit you're supposed to be resolving at. That means a television operating directly at the resolving limit is going to cause aliasing effects, and the meaning of the test cannot be applied to televisions in the manner it is.

Televisions do not have full-color pixels. They have a bunch of geometrically separate red, green, and blue subpixels, introducing weird chromatic effects at the resolving limit. Again, the test cannot be applied to televisions in the manner it is.

The standard visual acuity test is just one measurement. While the standard test shows typical human acuity at roughly one arcminute, vernier tests show acuity as low as eight arcseconds, and stereo tests drop that all the way down to two arcseconds. At two arcseconds, maybe we will stop seeing visual artifacts caused by low-resolution displays, but that's a good three orders of magnitude away from where we are.
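To put concrete numbers on the chart's core calculation: it effectively asks whether one pixel subtends more than one arcminute (60 arcseconds) at your viewing distance. A sketch of that geometry (the 50-inch screen and 8-foot distance are arbitrary examples; the acuity thresholds are the ones cited above):

import math

def pixel_arcsec(diagonal_in, horiz_res, distance_ft, aspect=16 / 9):
    # Angle subtended by one pixel, in arcseconds.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pitch_in = width_in / horiz_res
    return math.degrees(math.atan2(pitch_in, distance_ft * 12)) * 3600

for name, horiz in (("720p", 1280), ("1080p", 1920), ("4K", 3840)):
    arcsec = pixel_arcsec(50, horiz, 8)
    verdict = "below" if arcsec < 60 else "above"
    print(f'50" {name} at 8 ft: {arcsec:5.1f} arcsec/pixel '
          f"({verdict} the 60-arcsec 20/20 threshold)")

All three land well above the ~8 arcsecond vernier and ~2 arcsecond stereo figures, which is exactly the parent's point: passing the 20/20 line doesn't mean artifacts become invisible.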

What drives me absolutely nuts in current-gen consoles is the extremely apparent aliasing on a large TV. If games are not well anti-aliased (and it seems like they aren't, judging by that belt buckle in the video), I am definitely going to be holding off.

While these weren't the final release candidate versions of the game being tested, the resolutions are likely to be consistent in the final games despite an earlier promise by DICE to target "equal performance" on both consoles.

Well, if the FPS is similar, then that might indeed be "equal performance", just not at the exact same thing. This whole thing is silly, though: the PS3 was in the same hardware position, but the market seemed to disagree that it was the best, while the PS2 was the worst by far but had the biggest share. Graphical fidelity is nice, but even when the differences were noticeable they didn't seem to dominate sales. So I would agree; I see nothing to get worked up over here, really.

Sure, but isn't it likely that if BF4 didn't have a console version and had remained PC only (like the older Battlefields) that it would have more features than it does now?

That seems likely, but just being an exclusive doesn't necessarily mean they would bother putting resources into fleshing it out any better, or have the funding to do so. You're talking about a game from a big publisher, and the only thing they care about right now is the money, so I wouldn't expect DICE to get anywhere near the same budget for a PC-only game.

Now, if they did, then sure; I'm just not entirely sure what features they would add. Battlefield always made CoD look like an arcade game, but it's always been a bit of one as well, just with a bigger sandbox.

To me, this speaks more to the life cycle of the two consoles. I expect that graphical performance will improve for both consoles as developers get better at working with them over the course of their lives, but if this scales relatively similarly across both platforms, it is still bad news in the long run for the Xbox One. If the Xbox One can't perform as well as the PS4, this means that it is likely that it will need to be replaced sooner.

Most likely this will become more obvious when 4K HD televisions that are 60"+ start becoming the norm in people's living rooms. Given how quickly current HD took over, I can see this happening sooner than expected. Having seen the dramatic difference that a 4K HD television provides over a normal HD television, I am sure this will matter to people eventually.


If anything, the Xbox One presentation overall was actually slightly sharper than the PS4 despite the reduction in pixel count. Microsoft claims their scaler on the One is better than that on the 360, and as someone who runs a 360 into a Sony Bravia at 720p because the (much newer) TV does a better job scaling to 1080p, I believe this.

That sharpness you see in the XBO version? That's actually from the reduced RGB range in the game, which will be fixed shortly. It may look punchier, but it's actually crushing blacks (dark grey becomes total black), so despite the extra pop it's really just removing detail.


Point taken, but you're going to hit issues with non-native LCD resolutions straight away with that approach, so your very expensive pixels are still just going to get smeared together.

Keep in mind, these consoles will output EVERYTHING at 1080p; it's just the internal rendering that gets scaled before output.
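Right, and conceptually that output scaling is just a resample of the finished frame. A toy bilinear upscale from a 720p-sized buffer to 1080p (real hardware scalers use fancier multi-tap filters, so this is illustrative only):

import numpy as np

def bilinear_upscale(img, out_h, out_w):
    # Naive bilinear resample of a 2D (grayscale) frame buffer.
    h, w = img.shape
    ys = (np.arange(out_h) + 0.5) * h / out_h - 0.5
    xs = (np.arange(out_w) + 0.5) * w / out_w - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    wy = np.clip(ys - y0, 0.0, 1.0)[:, None]  # vertical blend weights
    wx = np.clip(xs - x0, 0.0, 1.0)[None, :]  # horizontal blend weights
    tl = img[y0][:, x0]; tr = img[y0][:, x0 + 1]
    bl = img[y0 + 1][:, x0]; br = img[y0 + 1][:, x0 + 1]
    top = tl * (1 - wx) + tr * wx
    bot = bl * (1 - wx) + br * wx
    return top * (1 - wy) + bot * wy

frame_720p = np.random.rand(720, 1280)  # stand-in for a rendered frame
frame_1080p = bilinear_upscale(frame_720p, 1080, 1920)
print(frame_720p.shape, "->", frame_1080p.shape)  # (720, 1280) -> (1080, 1920)

The point stands either way: the TV always receives a native 1080p signal; the question is just how much real information is in it.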


I'll believe that cloud-processing stuff when I see it.

See also: SimCity.

We'll see.

But FWIW, Microsoft already has the infrastructure, developer tools, and support to make it work, and it's free to use for Xbox One devs.

I recently had an opportunity to trial a 4K TV, and I would have to disagree with the chart on that one. It seemed to me that at any distance, the image was so dense and so rich that I was honestly surprised. Perhaps some of the experience was due to the vivid color saturation the unit gave, but things were so danged real-looking, at any distance. I was watching 4K source material too, so I am sure that was a big part of it. It made me want to lay out the cash to get one, though, where I wouldn't have considered it before, as I had thought of it as a straight-up resolution increase. It's not.

Consciously, you notice the resolution the most when you are trying to read text, and this would provide a good first-level test of distance vs size and resolution, and I'm predicting you'd get a much different chart than the one everyone bandies about. But I think your unconscious mind notices resolution in images better than your conscious mind. You might have a hard time picking out specific differences, but at a certain level of detail, the unconscious mind just 'clicks' and you can start imagining that it is real. It is pretty mind boggling when this happens.

I also saw a 4K TV demo and the realism was just incredible, even from 20 feet away. They are actually getting affordable (you can get a 50" one for $1500).


I always see the argument that PCs are so gosh durnd expensive... it doesn't take 3x SLI and Core i7s or 32GB of RAM.

Oops... here's a gaming rig that costs $600 and will enable 1080p gaming with almost all current games. And the 2GB 7870 can now be had for about $10-15 more which makes that build even better.


That smearing is where a high-quality scaler comes in, and it's what allows the Xbox One to keep up despite having an inferior GPU.


Xbox One has a better video scaler? Really? That's news to me, since they both use the same Radeon technology from the same generation, only with more cores and ROPs in the case of the PS4.

The original DF article was a hack job full of mistruths, and what they said about the scaler is one of them.

The scaler is integrated into the Radeon GPU; it's the same for both systems. The PS4 is not using a software scaler, nor does the Xbox have some kind of magical secret-sauce scaler.

As a die-hard master-race member, I think the Xbox One video looks much better, and whether that's because of reduced processing, crushed blacks, or anything else doesn't really matter. They should ship it like that.

I think it will basically come down to bragging rights. Does it sound like a dumb thing to brag about? Yes, but given the maturity of a few vocal players of certain FPS games, any point will suffice.


I wasn't being elitist with the "very expensive pixels" remark; I meant "expensive" in terms of drawing passes and numbers of shader instructions, not the price of the system.

I totally agree, though; the price/performance of these consoles is ludicrous given the economies of scale that both MS and Sony have access to. Basically you're paying for 1) high-latency RAM and 2) better DRM!


One other possible implication is that the Xbone is achieving worse performance despite receiving more developer attention. It sounds like the PS4 version is done, but they admit they are still working on the Xbone one (trying to include ambient occlusion, that we know of). I don't think they were even targeting 1080p for the consoles; if they were, wouldn't they still be tweaking the PS4 version? It seems to me that they might have aimed for 720p on both consoles at a certain level of graphical whizz-bang (perhaps the "high" setting on the PC), then found that they could crank up the resolution on the PS4 with little to no optimization, but are still working on getting the Xbone to hit that performance target. This is conjecture, of course, based on limited information.


I'm disappointed this clearly self-contradictory comment was an editor's pick for this article. At the beginning of the comment, UnnDunn declares that resolution is not a limit of the hardware, while at the end he explains exactly how this "artistic decision" is driven by the limitations of the hardware. You have a point that a console version's resolution is somewhat an artistic decision, based on a tradeoff of framerate against graphics, but that tradeoff is forced entirely by the limitations of the hardware, the scope of what's being rendered, and the technology used in the rendering.

Ars editors, please be more skeptical of people who make a bold statement and then back it up with evidence that proves the exact opposite of that statement. (I would ask the same of anyone listening to any political rhetoric.)