A certain corner of the game-focused Internet has been busy counting pixels this week, scrutinizing statements, screenshots, and videos for evidence that either the PlayStation 4 or Xbox One is providing a clearly superior graphical experience at launch. After examining all the available evidence, it seems clear that the PlayStation 4 versions of launch games like Battlefield 4 and Call of Duty: Ghosts enjoy a slight graphical advantage over their Xbox One cousins. It also seems equally clear, to me, that the difference just isn't that big a deal—unless you plan on playing games while looking through a magnifying glass.

The brouhaha really got going on Tuesday, when Digital Foundry posted an analysis of the footage it captured from the PC, PS4, and Xbox One versions of Battlefield 4 during a recent review event. Its capture setup determined that the Xbox One version was running at 1280×720, compared to 1600×900 for the PS4 version, both at 60 frames per second. While the versions tested weren't final release candidates, the resolutions are likely to hold in the shipping games, despite an earlier promise by DICE to target "equal performance" on both consoles.

(Members of the PC master race will be happy to know that the Windows version of the game ran at 1920×1080 resolution on "Ultra" settings, besting both consoles handily. The console versions were most comparable to the PC game running at the PC's "High" graphics quality, Digital Foundry said.)

The next volley in the resolution wars came yesterday, when Infinity Ward's Mark Rubin confirmed over Twitter that the Xbox One version of Call of Duty: Ghosts will be running at "1080p upscaled from 720p," compared to "native 1080p" on the PlayStation 4. These tidbits seem to have a small but loud corner of the Internet convinced that Microsoft's system is overpriced and underpowered, incapable of keeping up with the PlayStation 4.

The reports certainly sound like a big deal, with Sony's system pushing 50 to 100+ percent more native pixels than Microsoft's on identical launch games. Try as I might, though, I can't get too worked up over what seems like an incredibly minor difference in practical graphical output.

Before we continue, you should probably watch the direct Battlefield 4 comparison video Digital Foundry posted (embedded above). Make sure you push it to full screen and full resolution to get an accurate comparison. It should be clear that you can make out differences in the two images if you're looking for them, especially if you get up close to your monitor and focus on the edges of certain objects.

Examining the video a foot or two away from a PC monitor doesn't really mimic the way console gamers play games, though. For that, you're going to have to back up from your monitor at least a couple of long paces. Watch the video again from this farther vantage point. Can you still make out the differences? Even if you can, are they as significant?

Whether a gain in output resolution is noticeable to the human eye depends on three things: the pixel count, the screen size, and, crucially, the distance from the screen. The value of an increase in raw pixels goes down as the screen size gets smaller and as you get farther from the display.

Digital Trends has calculated the distances and screen sizes where various resolutions actually matter. If your living room TV is 10 feet away from your seat, you need a TV a bit larger than 50 inches to notice the difference between 720p and 1080p. If you're 12 feet away, you need a screen larger than 60 inches.
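If you want to check the math behind those figures, it's straightforward: normal 20/20 vision resolves detail down to roughly one arcminute, so the question is simply whether an individual pixel subtends more than that from where you sit. Here's a rough Python sketch of that calculation (the one-arcminute threshold is the standard rule of thumb, and the 50-inch/10-foot scenario mirrors the figures above):

```python
import math

def pixel_size_arcmin(diagonal_in, width_px, height_px, distance_ft):
    """Angular size of a single pixel, in arcminutes, for a 16:9 panel."""
    aspect = width_px / height_px
    screen_width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    pixel_pitch_in = screen_width_in / width_px
    distance_in = distance_ft * 12
    angle_rad = 2 * math.atan(pixel_pitch_in / (2 * distance_in))
    return math.degrees(angle_rad) * 60

# 20/20 vision resolves roughly 1 arcminute; smaller pixels blur together.
for w, h in [(1280, 720), (1600, 900), (1920, 1080)]:
    print(f"{w}x{h} on a 50-inch TV at 10 feet: "
          f"{pixel_size_arcmin(50, w, h, 10):.2f} arcmin per pixel")
```

Run the numbers and 720p pixels come out at just about one arcminute from 10 feet on a 50-inch set, right at the limit of what a 20/20 eye can resolve, while 1080p pixels land comfortably below it.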

Unless you sit really close to your really large TV, the difference between 720p and 1080p just isn't that noticeable.

Viewing distances aside, we're reaching a point of somewhat diminishing returns when it comes to improving a gaming image just by throwing more pixels at it. Back in the '80s, the jump in resolution between the Atari 2600 and the NES was about the same pure pixel ratio as the jump from 720p to 1080p, but it provided a much more noticeable effect on image quality (even if you discount the NES' wider simultaneous color palette and larger character sprites). The jump from 720p to 1080p is much less noticeable, even up close, than the jump from 480p to 720p that made Wii games look like muddy, washed-out relics compared to their Xbox 360/PS3 brethren.
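For those who want to check the "same pure pixel ratio" claim, the arithmetic is simple enough (assuming the commonly cited 160×192 playfield for the Atari 2600 and the NES's 256×240 output):

```python
atari_2600 = 160 * 192     #  30,720 pixels (typical 2600 playfield)
nes        = 256 * 240     #  61,440 pixels -> 2.0x the 2600
hd_720p    = 1280 * 720    # 921,600 pixels
hd_1080p   = 1920 * 1080   # 2,073,600 pixels -> 2.25x 720p
print(nes / atari_2600, hd_1080p / hd_720p)   # 2.0 2.25
```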

It's hard to look at the Xbox One's technically "inferior" 720p output with the same kind of practical concern as those inter-console resolution comparisons of the past. Resolution aside, the games look practically identical, with similar textures, apparent polygon counts, frame rates, and particle effects (like smoke). The small aliasing difference due to the resolution pales in comparison to the similarities in the overall look and feel of both versions.

That's not to say the differences aren't there; it's just that they're not all that significant to my eye. As Digital Foundry put it in its analysis, "the Microsoft console manages to hold up despite the undeniable, quantifiably worse metrics in terms of both resolution and frame rate."

You should be wary of reading too many long-term performance concerns into any resolution differences in launch titles, as well. The evolution of graphical quality on a system has at least as much to do with optimization and the allocation of developer resources as it does with raw hardware specs (where the Xbox One and PS4 are quite similar). Furthermore, the look and feel of a game depends more heavily on art direction and craft than on raw numbers anyway. Games like Super Mario 64 and Shadow of the Colossus still hold up to this day, despite being in standard definition, thanks to the strong aesthetic sense behind them.

None of this is to ignore the actual differences in resolution between the PS4 and Xbox One versions of at least a couple high-profile, multiplatform launch games. If you're the kind of person who isn't happy unless his gaming rig generates the highest raw benchmark numbers, the PS4 seems to be your console of choice for the time being (though, really, a high-end PC still wins out on this score). If you're the kind of person who values actual gameplay, though, choose your next console based on the games. You can feel secure in the knowledge that, graphically, there doesn't seem to be much practical, noticeable difference in performance.

Kyle Orland
Kyle is the Senior Gaming Editor at Ars Technica, specializing in video game hardware and software. He has journalism and computer science degrees from the University of Maryland. He is based in the Washington, DC area. Email: kyle.orland@arstechnica.com // Twitter: @KyleOrl

353 Reader Comments

Kyle, you seem to be someone who usually goes out of his way to get the details right and makes sure to point out flaws that other people gloss over, so I'm not sure where this article came from. You seem to be trying really hard to ignore a pretty large performance difference between the two consoles.

You say that this being an early release means the importance isn't that great since things will be better optimized later on, but I don't get that logic. With both consoles having extremely similar hardware, I think it's safe to assume that both consoles will have similar optimization headroom later on in their life. If that's the case, the PS4 would still be getting much better performance in the late stages of the console's life.

Also, the whole "720p to 1080p only makes a big difference if you are really close" argument is just complete rubbish. A LOT of people sit within five feet of their TVs, especially while gaming. Lower resolutions can really take you out of the game, especially in slower paced areas when you have time to really look at things.

I'm not trying to ignore it... I point out multiple times in this article that it is real and is there. I'm just stating my personal opinion that, practically, it does not make a huge difference in my subjective appreciation of the two scenes, based on what I have seen.

Whether both consoles get "similar optimization" depends largely on which console is more popular, I think. The Xbox 360 got a lot more attention from a lot of developers because of its sales lead in the West, and therefore was able to stretch a lot farther on objectively worse hardware for quite a while when compared to the PS3 (Sony's hardware advantage has arguably won out in the highest end late-generation games, though)

Also, I consider 5 feet to be "really close" for gaming. A quick, informal Twitter survey I just did found almost everyone who answered was 8-10 feet or more from their TVs when playing. Not at all scientific or anything, but it suggests 5 feet isn't the norm. In any case, the fact that some people play close doesn't mean the "720p to 1080p only makes a big difference if you are really close" argument is invalid, just that it doesn't apply to those people.

1080p has been on the market for what, 8 or 9 years now? Why do we still need to have this discussion as if we're trying to justify which console is better? We should be asking, why aren't all of the games 1080p with no upscaling? I'm not going to get into the PC vs console thing, but why aren't console gamers saying, I want my game to look just as good as the PC counterpart that costs exactly the same? (And don't give me that "the PC costs so much" bs, there are plenty of cheap PC gaming rigs.)

The bottom line is that the resolution of each game is largely an artistic decision made by the developer, not a limitation of the hardware.

On PC, you have your graphics sliders and resolution setting. As a PC player, you decide which combination of resolution and graphics settings gives you the best visuals and framerates.

On console it's the same, except it's the developer tweaking the settings, not the player.

So if it's 720p, it's because the developer decided that resolution netted the best combination of visuals and framerate for that hardware. They could run BF4 on Xbox One at 900p or even 1080p if they wanted, but they'd have to "move some of the graphics sliders left a bit". Or have a reduced framerate.

We know, PS4's GPU is better than Xbox One's GPU. But Xbox One has a better video scaler, so it does a better job upscaling lower-resolution rendering. So developers can get away with running the game in lower resolution and turning the graphics sliders up a little bit.

I'm disappointed this clearly self-contradictory comment was an editor's pick for this article. At the beginning of this comment, UnnDunn declares that resolution is not a limitation of the hardware, while at the end of the comment he explains exactly how this artistic decision is driven precisely by the limitations of the hardware. You have a point that the console version's resolution is somewhat an artistic decision, based on a tradeoff of framerate against graphics, but this is driven entirely by the limitations of the hardware, the scope of what's being rendered, and the technology that is used in the rendering.

Ars editors, please be more skeptical of people who give a bold statement, and then back it up with evidence that proves the exact opposite of that bold statement. (I would ask the same of anyone listening to any political rhetoric.)

Really? Really? You're going to argue semantics?

You understand the point I was trying to make, yes? So argue it, don't rake me over the coals for making the wrong word choice.

One other possible implication is that the Xbone is achieving worse performance despite receiving more developer attention. It sounds like the PS4 version is done, but they admit they are still working on the Xbone one (trying to include ambient occlusion, that we know of).

SSAO totally eats fill-rate and system throughput; it's also usually implemented as a post-process "plug-in": all you really need is your depth buffer, and you can do SSAO with a single shader and one framebuffer (OK, two shaders and three framebuffers if you're going to be all fancy and do some filtering). It certainly doesn't look like they're using a particularly clever AO (they've done the usual thing of putting massive black haloes around everything, which takes a single line of shader code to fix).

Hence if they're saying that AO was turned off for development purposes, the reason they haven't got it turned on in the build they're showing is not that they need to dramatically alter their graphics pipeline, it's that they're desperately scraping around for enough fillrate that it doesn't turn their game into a slideshow.
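For readers wondering what "all you really need is your depth buffer" looks like in practice, here's a deliberately crude, CPU-side NumPy sketch of the depth-comparison idea the commenter is describing. It is not how DICE's renderer works, just the bare-bones version of the technique:

```python
import numpy as np

def crude_ssao(depth, radius=4, samples=16, bias=0.02, seed=0):
    """Toy screen-space ambient occlusion from a depth buffer alone.

    depth: 2D array of linear depths (smaller = closer to the camera).
    Returns an occlusion map in [0, 1], where 1 means fully occluded.
    """
    rng = np.random.default_rng(seed)
    occlusion = np.zeros(depth.shape, dtype=float)
    # Random screen-space offsets within a small radius around each pixel.
    offsets = rng.integers(-radius, radius + 1, size=(samples, 2))
    for dy, dx in offsets:
        neighbor = np.roll(depth, shift=(int(dy), int(dx)), axis=(0, 1))
        # A neighbor noticeably closer to the camera counts as an occluder.
        occlusion += (depth - neighbor > bias)
    return occlusion / samples
```

A real implementation runs the equivalent as a fragment shader, reconstructs view-space positions from the depth buffer, and blurs the result to hide sampling noise (and the "black halo" artifact mentioned above).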

1080p has been on the market for what, 8 or 9 years now? Why do we still need to have this discussion as if we're trying to justify which console is better? We should be asking, why aren't all of the games 1080p with no upscaling? I'm not going to get into the PC vs console thing, but why aren't console gamers saying, I want my game to look just as good as the PC counterpart that costs exactly the same? (And don't give me that "the PC costs so much" bs, there are plenty of cheap PC gaming rigs.)

Because when you're sitting 10 feet away looking at a 50" screen, 1080p doesn't matter as much. Simple. Maybe it would matter more if you had a 70+ inch screen or you sit 4' away from it, but I'm sure the developers did their market research and figured out that most of their target audience do not. So they decided to drop down the resolution so they could offer more eye-pleasing effects.

This article really misses the point of the issue with the Xbox One running games at less than half the resolution of the PS4.

First, let me address the major flaw with this article. It's basically trying to refute an analysis done by Digital Foundry using the videos they posted and saying "oh, it's not that bad, I can live with it". First of all, the videos they posted in no way represent what the actual final image will look like on your TV. What you're seeing in their videos is extremely compressed, to the point where there is artifacting that severely degrades the overall image quality of both presentations. Using that video as an actual reference for what the final image on your TV from those consoles will look like is silly. It's extremely compressed web video! That's like watching a 1080p video on YouTube and saying "Well, those 1080p Blu-ray discs must not look all that impressive because this YouTube video looks awful". Streaming, extremely compressed video of a game in no way represents what that final image will look like when the system is connected to your display in your home.

Now on to the original point I want to make. The fact that the games run at less than half the resolution on the Xbox One is significant. It does show that the PS4's GPU is, in fact, 50% faster. This doesn't just affect how it "looks" but also how games will run. When a game demands 60 frames per second, the PS4 will be able to keep that speed going much more easily and at much higher quality than the Xbox One. Again, for games that demand it, like racing games, that means the PS4 has less of a chance of dropping frames when every detail of the road and opponents counts and you're trying to make that final push for the win. It means in an FPS, when you're playing against other people, you'll not only be able to count on fast controller response time thanks to the higher frame rate, but you'll also be able to see finer details for making more accurate shots compared to the Xbox One. When you're gaming at 1080p versus 720p you have more than twice as many pixels on screen. This means you get to see more of the character models in the distance, and you can actually aim for those character models rather than having to rely on hit detection and luck like you would on a 720p display.

I have a 55" 1080p display with an IPS panel. At 10' I can see the difference between any sub 1080p resolution and native 1080p content. Larger displays are becoming cheaper and cheaper. Walmart has had 50" displays for under $500 this year. I can imagine Black Friday and Super Bowl sales will bring 55" and bigger displays even further down in price. People will have larger and larger displays, so the difference will be more and more noticeable as time goes on. It does NOT matter how good your scaler is, you can not magically make pixels appear that don't exist. Native higher resolutions will always be better.

Let's not forget the other implications of this difference in resolution. As I stated earlier, this means that the PS4 GPU is, in fact, 50% faster. The difference in bandwidth between the Xbox One and PS4, as well as the extra CUs on the PS4 GPU, means the PS4 will be able to handle higher resolution textures, more complex on-screen effects, etc. This difference in resolution is far more significant than people realize.

Also, I consider 5 feet to be "really close" for gaming. A quick, informal Twitter survey I just did found almost everyone who answered was 8-10 feet or more from their TVs when playing. Not at all scientific or anything, but it suggests 5 feet isn't the norm. In any case, the fact that some people play close doesn't mean the "720p to 1080p only makes a big difference if you are really close" argument is invalid, just that it doesn't apply to those people.

You're ignoring your international audience - people in the USA have HUGE houses by the standards of the average Londoner. I'd love to have 10 feet to put between myself and the television.

Come to think of it that might explain why the Kinect wasn't exactly a huge hit in Japan

As a die-hard master-race member, the Xbox One video looks much better, and if it's because it has reduced processing, crushed blacks, or anything else, it doesn't really matter. They should ship it like that.

I understand your tongue was firmly planted in cheek but for me, "it's a feature, not a bug" is not an adequate response to Xbox One black level output problem.

This article really misses the point of the issue with the Xbox One running games at less than half the resolution of the PS4.

First, let me address the major flaw with this article. It's basically trying to refute an analysis done by Digital Foundry using the videos they posted and saying "oh, it's not that bad, I can live with it". First of all, the videos they posted in no way represent what the actual final image will look like on your TV. What you're seeing in their videos is extremely compressed, to the point where there is artifacting that severely degrades the overall image quality of both presentations. Using that video as an actual reference for what the final image on your TV from those consoles will look like is silly. It's extremely compressed web video! That's like watching a 1080p video on YouTube and saying "Well, those 1080p Blu-ray discs must not look all that impressive because this YouTube video looks awful". Streaming, extremely compressed video of a game in no way represents what that final image will look like when the system is connected to your display in your home.

Now on to the original point I want to make. The fact that the games run at less than half the resolution on the Xbox One is significant. It does show that the PS4's GPU is, in fact, 50% faster. This doesn't just affect how it "looks" but also how games will run. When a game demands 60 frames per second, the PS4 will be able to keep that speed going much more easily and at much higher quality than the Xbox One. Again, for games that demand it, like racing games, that means the PS4 has less of a chance of dropping frames when every detail of the road and opponents counts and you're trying to make that final push for the win. It means in an FPS, when you're playing against other people, you'll not only be able to count on fast controller response time thanks to the higher frame rate, but you'll also be able to see finer details for making more accurate shots compared to the Xbox One. When you're gaming at 1080p versus 720p you have more than twice as many pixels on screen. This means you get to see more of the character models in the distance, and you can actually aim for those character models rather than having to rely on hit detection and luck like you would on a 720p display.

I have a 55" 1080p display with an IPS panel. At 10' I can see the difference between any sub 1080p resolution and native 1080p content. Larger displays are becoming cheaper and cheaper. Walmart has had 50" displays for under $500 this year. I can imagine Black Friday and Super Bowl sales will bring 55" and bigger displays even further down in price. People will have larger and larger displays, so the difference will be more and more noticeable as time goes on. It does NOT matter how good your scaler is, you can not magically make pixels appear that don't exist. Native higher resolutions will always be better.

Let's not forget the other implications of this difference in resolution. As I stated earlier, this means that the PS4 GPU is, in fact, 50% faster. The difference in bandwidth between the Xbox One and PS4, as well as the extra CUs on the PS4 GPU, means the PS4 will be able to handle higher resolution textures, more complex on-screen effects, etc. This difference in resolution is far more significant than people realize.

Where do you get this "less than half the resolution on Xbox One" stat? BF4 runs at 1600x900 on PS4 and 1280x720 on Xbox One. By no measure is the Xbox One version running at "less than half the resolution" of the PS4 version.

PS4 version is pushing roughly 50% more pixels. But we already knew PS4 had a more capable GPU.
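For reference, the raw pixel counts behind both of those figures, using the resolutions reported earlier in the article:

```python
xbox_one_bf4 = 1280 * 720    #   921,600 pixels
ps4_bf4      = 1600 * 900    # 1,440,000 pixels -> 1.5625x, i.e. ~56% more
ps4_cod      = 1920 * 1080   # 2,073,600 pixels -> 2.25x a 720p native render
print(ps4_bf4 / xbox_one_bf4, ps4_cod / xbox_one_bf4)   # 1.5625 2.25
```

So "less than half the resolution" fits Call of Duty: Ghosts' native render (before upscaling), while Battlefield 4's gap is closer to the 50-odd percent cited here.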

I'm getting both consoles this gen but they're mainly going to be exclusives boxes; my first love will always be PC. However it REALLY ANNOYS me that for COD games specifically, and a few other titles - the extra launch content and limited editions often aren't available for the PC format!

I'm guessing there's a reason for this but it mystifies me as to what it might be.

25 years later, and people are still arguing which console has more sprites, more colors and more parallax scrolling. Except now it requires detailed side-by-side comparisons, professional video captures and the scrutinizing of individual pixels on a 27-inch monitor.

I almost miss the days when the technical differences actually resulted in gameplay and playability differences (SF2, anyone?). At least then, there was a point to that discussion.

I have a 52" samsung 240hz LED mounted on a wall in an NYC apartment the size of a postage stamp (we bought it when we lived in a muuuuuuuuuuch bigger place), differences in resolution are noticeable to me.

That said, I rarely care too much. I tend to play games that are pretty stylized when I break out the console because if I want crazy graphics I'll just play it on my PC. The higher resolution does make a difference for another reason though, beefier video processing means a higher chance of first party titles that are properly optimized running at 60fps with much higher video resolutions and more detailed textures. The difference in resolution, not too big a deal, but the jump to a 60fps standard for "pretty" games is huge.

The "comparison" images you've posted in the article proves nothing.The only real test is viewing on a screen, and whether you're in the 40-50" range, or in the 65"+, the difference between say, 720p and 1080p is apparent.

Play with a magnifying glass... I game on a 106" and on a 40", the difference between 720p and 1080p is easily discernable on both screens.

Even on a 24" monitor the improvement in visual quality from a higher resolution is evident.

I really don't get how you can honestly believe what you've written in this article. Sit your ass down in front of a 40" and a 100", connect a PC, start up Battlefield 4, switch between 480p, 720p, and 1080p, and then tell me you needed a magnifying glass to spot the difference.

As a die-hard master-race member, the Xbox One video looks much better, and if it's because it has reduced processing, crushed blacks, or anything else, it doesn't really matter. They should ship it like that.

I understand your tongue was firmly planted in cheek but for me, "it's a feature, not a bug" is not an adequate response to Xbox One black level output problem.

Xbox 360 offers that option, but it's set to the "limited" setting by default (16-235). It's possible Xbox One has a similar setting, and DF forgot to change it to "expanded" (0-255).

I don't sit 10 feet away from my 46" 1080p screen. I sit about 5 feet away from it, max. I definitely notice the difference between 720p and 1080p, especially when FSAA isn't being completely utilized. Will I notice the difference between 720 and 900? For me, that depends entirely on the quality of FSAA.

What the fuck is wrong with your eyes that you sit five feet away from your 46" TV? Even the goddamn NES controllers had cables longer than that because that's just insane. Do you live in a closet or something?

E: In all seriousness, if you keep that up for long enough you're going to need to sit that close out of necessity rather than preference. Flatscreen panels pump out a ton of HEV light.

Currently I have my PC hooked up to my TV, even for gaming, and I sit about 5-6 feet away. I have a modest setup in which I bounce around between 720, 900, and 1080, and every time a game performs well at 1080p I say "hell yeah".

Unfortunately it sounds like even my modest PC made from years old components is gonna keep up with these consoles. Meaning I'll have to wait and see what good console exclusive games come out for PS4 or get a Wii U since the line-up is looking great for Xmas.

The bottom line is that the resolution of each game is largely an artistic decision made by the developer, not a limitation of the hardware.

On PC, you have your graphics sliders and resolution setting. As a PC player, you decide which combination of resolution and graphics settings gives you the best visuals and framerates.

On console it's the same, except it's the developer tweaking the settings, not the player.

So if it's 720p, it's because the developer decided that resolution netted the best combination of visuals and framerate for that hardware. They could run BF4 on Xbox One at 900p or even 1080p if they wanted, but they'd have to "move some of the graphics sliders left a bit". Or have a reduced framerate.

We know, PS4's GPU is better than Xbox One's GPU. But Xbox One has a better video scaler, so it does a better job upscaling lower-resolution rendering. So developers can get away with running the game in lower resolution and turning the graphics sliders up a little bit.

Poor fill-rate is an "artistic" decision? Not even slightly. The "artistic" side is constrained by the fact that, if you're going to try to go photorealistic, you need to use even more smoke and even shinier mirrors. 3D graphics is the art of pretending to show more than you actually are.

Stockholm isn't just a city in Sweden...

The artistic decision is whether to go with 1080p or 900p or 720p. You know what the fill-rate is going in. You know what the capabilities of your engine are. The decision is how best to allocate that fill-rate to achieve the most eye-pleasing result. In some cases, higher resolution does not yield the best result.

Again, they could probably run BF4 on Xbox One at 1080p. It probably wouldn't even look that bad. But they decided they wanted more effects, so they turned the resolution down.

No matter how you spin it dropping resolution and trying to smear it all together with DoF/AA/BLOOOOOOOM isn't an artistic decision, it's a technical compromise. If you were making the point that photorealism != artistry, then fine, but I'm not sure that's what you're saying. As I mentioned above, more power available = less effort spent having to fake it and more room for your original ideas.

Put it this way: it doesn't matter how powerful the box is. The console could be ten times more powerful than the fastest 3x GTX Titan/Core i7/32GB monster rig. Running the game at 720p would still allow for more effects than running at 1080p, and developers would still have to make compromises. Dropping to 720p is one such compromise a developer could make, and on a console where you are seated ten feet away, that would make sense.

On a PC where you are seated one foot away, high resolution is a lot more important than on a console where you are 10 feet away.

Are you saying there are more effects in the Xbox One version? Because, as another poster pointed out, DF said the effects between the platforms were the same.

Call of Duty: Ghosts will be running at "1080p upscaled from 720p," compared to "native 1080p" on the PlayStation 4.

Good article, but I basically forgot the rest of it once I read that sentence. Sorry.

The thing about screen sizes and distances is valid, but consoles have a long lifecycle, and I expect I'll be upgrading the gaming rig TV from 46" to a 60", 72" or 80" screen in 1-2 years. So, resolution doesn't really matter.... until it does.

"an earlier promise by DICE to target "equal performance" on both consoles."

Wouldn't "equal performance" be the 60fps?

The quality and differences would be how much DICE had to cut back to keep that framerate. For the PS4 it was cutting one step, from 1920 to 1600 pixels; for the XB1 it was cutting two steps, from 1920 to 1280.

Imagine a Skyrim remake. Same basic graphics, but when you walk into Whiterun the first time there are a couple hundred NPCs walking around. Imagine the tavern is so full at night you have to push your way through the crowd. Vast herds of mammoths. Attacking a fort with 50 Stormcloaks. Carts and wagons constantly traveling the trade routes. A detailed economy running in the background, allowing your own trading activities.

They have the resolution they need. Start populating the game worlds more.

That requires CPU power. Which is the main reason this area has been stuck for the last half decade or more because the current/last gen consoles were so CPU limited.

I am with Kyle on this one. Though there is a difference, it wasn't all that apparent initially. One thing I noticed is that some items did look different on the Xbone versus the PS4 when zoomed in close; in this instance it generally had to do with the greater contrast between light and dark. Overall the PS4 did look better when zoomed in and in still images, but for me the real issue is how lackluster my enthusiasm is for either console.

The one thing that has resulted from the next-generation release for me is a renewed interest, dare I say hope, for PC gaming. After what seems to have been 4 years or so of little in the way of AAA games for PC that I would even consider playing on the PC, considering many were terrible ports, there are quite a few games coming out with PC support that look really fun, examples being Titanfall, Destiny, Division, and Star Citizen, to name a few.

Then there is Steam. Even now you can rig Steam to boot up at launch and I am pretty optimistic about their OS and their controller systems. Especially given that their digital service absolutely trashes Xbox live or PSN in terms of games offered and sales, it looks like Microsoft and Sony screwed the pooch; at least for me anyways. I am planning on not only upgrading my current PC, but also using the money I would have used to get a console a year from now and build myself a little Steam powered console of my own.

"Before we continue, you should probably watch the direct Battlefield 4 comparison video Digital Foundry posted (embedded above). Make sure you push it to full screen and full resolution to get an accurate comparison. It should be clear that you can make out differences in the two images if you're looking for them, especially if you get up close to your monitor and focus on the edges of certain objects."

There is one thing I noticed that would be distinct regardless of how far away you are: the Xbox One's Image was noticeably darker than the PS4, and as a result, slightly sharper when close. (I viewed it twice: once on my TV, and again on my monitor.) To some people it may not be much, but to me it made a huge impact in image believability, as the PS4 seemed to blend slightly better at spots where you're not supposed to notice that it's a game, such as when the helicopter knocked down some of the large smokestacks.

To be fair, however, the job I'm waiting to be called back to involves noticing those sort of details, so I probably have a slight (dis)advantage as a result. Most people probably won't notice or care (the guy with the 133in screen being an envious, obvious exception), but considering the graphics chip came from the same manufacturer, it's one of those things I'm concerned with.

Imagine a Skyrim remake. Same basic graphics, but when you walk into Whiterun the first time there are a couple hundred NPCs walking around. Imagine the tavern is so full at night you have to push your way through the crowd. Vast herds of mammoths. Attacking a fort with 50 Stormcloaks. Carts and wagons constantly traveling the trade routes. A detailed economy running in the background, allowing your own trading activities.

They have the resolution they need. Start populating the game worlds more.

That requires CPU power. Which is the main reason this area has been stuck for the last half decade or more because the current/last gen consoles were so CPU limited.

Are you sure that GPU compute or something like OpenCL couldn't offload some of this?

The bottom line is that the resolution of each game is largely an artistic decision made by the developer, not a limitation of the hardware.

On PC, you have your graphics sliders and resolution setting. As a PC player, you decide which combination of resolution and graphics settings gives you the best visuals and framerates.

On console it's the same, except it's the developer tweaking the settings, not the player.

So if it's 720p, it's because the developer decided that resolution netted the best combination of visuals and framerate for that hardware. They could run BF4 on Xbox One at 900p or even 1080p if they wanted, but they'd have to "move some of the graphics sliders left a bit". Or have a reduced framerate.

We know, PS4's GPU is better than Xbox One's GPU. But Xbox One has a better video scaler, so it does a better job upscaling lower-resolution rendering. So developers can get away with running the game in lower resolution and turning the graphics sliders up a little bit.

Xbox One has a better video scaler? Really? That's news to me, since they both use the same Radeon technology from the same generation, only with more cores and ROPs in the case of the PS4.

The original DF article was a hack job full of mistruths, and what they said about the scaler is one of them.

The scaler is integrated into the Radeon GPU, it's the same for both systems. PS4 is not using a software scaler, nor does the Xbox have some kind of magical secret sauce scaler.

I've no idea how the PS4 handles scaling but Xbox One does very much have a dedicated scaler as confirmed here.

We've obviously little idea how they will compare until both systems arrive but the quality of the scaler will be a factor in the future because 4k TVs allow for more custom resolutions than current TVs.

Gran Turismo 5 ran at 1280x1080p, for example. A hypothetical GT7 could run at, say, 1280x2160 (or whatever) and the quality of the scaler could well make a difference at that point.

The bottom line is that the resolution of each game is largely an artistic decision made by the developer, not a limitation of the hardware.

On PC, you have your graphics sliders and resolution setting. As a PC player, you decide which combination of resolution and graphics settings gives you the best visuals and framerates.

On console it's the same, except it's the developer tweaking the settings, not the player.

So if it's 720p, it's because the developer decided that resolution netted the best combination of visuals and framerate for that hardware. They could run BF4 on Xbox One at 900p or even 1080p if they wanted, but they'd have to "move some of the graphics sliders left a bit". Or have a reduced framerate.

We know, PS4's GPU is better than Xbox One's GPU. But Xbox One has a better video scaler, so it does a better job upscaling lower-resolution rendering. So developers can get away with running the game in lower resolution and turning the graphics sliders up a little bit.

I'm disappointed this clearly self-contradictory comment was an editor's pick for this article. At the beginning of this comment, UnnDunn declares that resolution is not a limitation of the hardware, while at the end of the comment he explains exactly how this artistic decision is driven precisely by the limitations of the hardware. You have a point that the console version's resolution is somewhat an artistic decision, based on a tradeoff of framerate against graphics, but this is driven entirely by the limitations of the hardware, the scope of what's being rendered, and the technology that is used in the rendering.

Ars editors, please be more skeptical of people who give a bold statement, and then back it up with evidence that proves the exact opposite of that bold statement. (I would ask the same of anyone listening to any political rhetoric.)

The statement is true and insightful, even if it appears to go over the heads of some readers.

Also, I consider 5 feet to be "really close" for gaming. A quick, informal Twitter survey I just did found almost everyone who answered was 8-10 feet or more from their TVs when playing. Not at all scientific or anything, but it suggests 5 feet isn't the norm. In any case, the fact that some people play close doesn't mean the "720p to 1080p only makes a big difference if you are really close" argument is invalid, just that it doesn't apply to those people.

But not everybody has small TVs. I sit 10 feet back from an 11 foot screen, which means pixels matter. The PS4 can output 4K video & photos, just not games - which doesn't mean it never will, as it may be a later software update as optimizations make it more feasible.

As a die-hard master-race member, the Xbox One video looks much better, and if it's because it has reduced processing, crushed blacks, or anything else, it doesn't really matter. They should ship it like that.

I understand your tongue was firmly planted in cheek but for me, "it's a feature, not a bug" is not an adequate response to Xbox One black level output problem.

It's not tongue in cheek; I think it actually looks better that way.

If you prefer the image to look like that, you can adjust your monitor settings to crush blacks. You want to turn up the brightness on the TV and monitor so it blooms like crazy? Turn on de-judder on your TV so everything looks like video? You're perfectly free to calibrate your game, TV or monitor as you like. But inaccurate data shouldn't be baked in at the source so that it's impossible for the rest of us who prefer accurately calibrated images to get that.

they're desperately scraping around for enough fillrate that it doesn't turn their game into a slideshow.

When I saw the lack of AO on the XBOne version, I thought the exact same thing. If there weren't any problems with performance or development process, they'd have it enabled and running well right alongside the PS4.

Clearly, there are issues we don't know about and the devs won't fess up to it before launch so as to not anger Microsoft, which I'm sure such a statement regarding lack of oomph in the system would do.

The bottom line is that the resolution of each game is largely an artistic decision made by the developer, not a limitation of the hardware.

On PC, you have your graphics sliders and resolution setting. As a PC player, you decide which combination of resolution and graphics settings gives you the best visuals and framerates.

On console it's the same, except it's the developer tweaking the settings, not the player.

So if it's 720p, it's because the developer decided that resolution netted the best combination of visuals and framerate for that hardware. They could run BF4 on Xbox One at 900p or even 1080p if they wanted, but they'd have to "move some of the graphics sliders left a bit". Or have a reduced framerate.

We know, PS4's GPU is better than Xbox One's GPU. But Xbox One has a better video scaler, so it does a better job upscaling lower-resolution rendering. So developers can get away with running the game in lower resolution and turning the graphics sliders up a little bit.

Xbox One has a better video scaler? Really? That's news to me, since they both use the same Radeon technology from the same generation, only with more cores and ROPs in the case of the PS4.

The original DF article was a hack job full of mistruths, and what they said about the scaler is one of them.

The scaler is integrated into the Radeon GPU, it's the same for both systems. PS4 is not using a software scaler, nor does the Xbox have some kind of magical secret sauce scaler.

I've no idea how the PS4 handles scaling but Xbox One does very much have a dedicated scaler as confirmed here.

We've obviously little idea how they will compare until both systems arrive but the quality of the scaler will be a factor in the future because 4k TVs allow for more custom resolutions than current TVs.

Gran Turismo 5 ran at 1280x1080p, for example. A hypothetical GT7 could run at, say, 1280x2160 (or whatever) and the quality of the scaler could well make a difference at that point.

Once again, PS4 also has a dedicated scaler, they're both part of the APU in both systems. Both systems are customized, but ultimately both are using similar building blocks from AMD.

Gran Turismo's resolution has to do with the scaler in the RSX in the PS3. That was an entirely different GPU from Nvidia and had a limitation in that it only supported horizontal scaling, hence the 1280x1080 resolution.
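To illustrate the horizontal-only case being described, here's a minimal NumPy sketch of scaling a 1280×1080 render target out to a 1920×1080 output by interpolating along x only (an illustration of the principle, not how either console's scaler is actually implemented):

```python
import numpy as np

def upscale_horizontal(frame, out_width=1920):
    """Linearly interpolate along x only, e.g. a 1280x1080 render -> 1920x1080 output."""
    h, w, c = frame.shape
    x = np.linspace(0, w - 1, out_width)   # source position for each output column
    x0 = np.floor(x).astype(int)
    x1 = np.minimum(x0 + 1, w - 1)
    t = (x - x0)[None, :, None]            # blend factor, broadcast over rows/channels
    return (1 - t) * frame[:, x0, :] + t * frame[:, x1, :]

frame = np.random.rand(1080, 1280, 3)      # stand-in for a 1280x1080 render target
print(upscale_horizontal(frame).shape)     # (1080, 1920, 3)
```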

Imagine a Skyrim remake. Same basic graphics, but when you walk into Whiterun the first time there are a couple hundred NPCs walking around. Imagine the tavern is so full at night you have to push your way through the crowd. Vast herds of mammoths. Attacking a fort with 50 Stormcloaks. Carts and wagons constantly traveling the trade routes. A detailed economy running in the background, allowing your own trading activities.

They have the resolution they need. Start populating the game worlds more.

That requires CPU power. Which is the main reason this area has been stuck for the last half decade or more because the current/last gen consoles were so CPU limited.

Are you sure that GPU compute or something like OpenCL couldn't offload some of this?

Only if all the actors are doing the same actions. GPUs are good at doing the same operation to a lot of data. Unfortunately, that is no help when you have a lot of actors that each follow specific circumstances and specific rules. So for many pixels or particles a GPU is good; for many AIs, it is not much help. There were many attempts to get around this in the last generation, but you don't really see any console games with good AI or many independent actors in the same scene.
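To illustrate the distinction being drawn here, the sketch below contrasts a vectorizable particle update (the same arithmetic applied to every element, which is what GPUs excel at) with a per-NPC update whose branches depend on each actor's own state. The names and rules are made up for the example; the point is only that divergent, per-actor logic doesn't map cleanly onto wide parallel hardware:

```python
import numpy as np

# GPU-friendly: one uniform operation over a huge array of particles.
def update_particles(positions, velocities, dt):
    return positions + velocities * dt

# GPU-unfriendly: every NPC takes a different branch depending on its own state.
def update_npc(npc):
    if npc["health"] < 20:
        npc["goal"] = "flee"
    elif npc["sees_player"]:
        npc["goal"] = "attack"
    elif npc["hour"] >= 22:
        npc["goal"] = "go_to_tavern"   # hypothetical rule, in the spirit of the Skyrim example
    else:
        npc["goal"] = "wander"
    return npc
```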