Op-ed: Why I'm not too worked up about the next-gen console resolution wars

You can feel secure in the knowledge that, graphically, there doesn't seem to be much practical, noticeable difference in performance.

But being "good enough" is not the same thing as "no difference." The DF piece does NOT say there's no noticeable difference between the PS4 and Xbox One during single-player campaign gameplay; it says the exact opposite. Curiously, Kyle omits the beginning of the paragraph that pinpoints that difference:

Quote:

On the merits of what we've seen so far, Battlefield 4 is already set to be a formidable launch window effort from DICE. Our observations so far reveal a clear gap in fidelity between PC and PS4, and again to Xbox One, but sub-pixel break-up aside, based on what we've seen so far, the Microsoft console manages to hold up despite the undeniable, quantifiably worse metrics in terms of both resolution and frame-rate.

What Digital Foundry does say is that although the PS4 offers noticeably superior gameplay, not just in terms of image quality but also in recovering faster from sub-60fps dips in frame rate, the Xbox One gameplay is good as well. That's welcome but not surprising news for Xbox One players, but let's not confuse that with saying there's no gameplay difference between the two consoles. According to Digital Foundry, there is.

I don't sit 10 feet away from my 46" 1080p screen. I sit about 5 feet away from it, max. I definitely notice the difference between 720p and 1080p, especially when FSAA isn't being fully utilized. Will I notice the difference between 720 and 900? For me, that depends entirely on the quality of FSAA.
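For what it's worth, the geometry here is easy to check. A quick sketch (assuming a 16:9 panel viewed head-on; the 46"/5-foot figures are the ones from the comment above, and 60 pixels per degree is the usual benchmark for 20/20 acuity):

```python
import math

def pixels_per_degree(diag_inches, res_w, res_h, distance_feet):
    """Horizontal pixels per degree of visual angle for a 16:9 screen."""
    # Screen width from the diagonal, assuming a 16:9 aspect ratio.
    width_in = diag_inches * 16 / math.sqrt(16**2 + 9**2)
    distance_in = distance_feet * 12
    # Horizontal field of view occupied by the screen, in degrees.
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return res_w / fov_deg

# 46" screen at 5 feet: even 1080p lands at roughly 52 px/deg,
# below the ~60 px/deg that 20/20 vision can resolve -- so the
# step down to 900p or 720p is plausibly visible at that distance.
for h in (720, 900, 1080):
    print(h, round(pixels_per_degree(46, h * 16 // 9, h, 5), 1))
```

That supports the commenter's experience: at 5 feet, none of these resolutions is past the acuity limit yet, so differences between them remain perceptible.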

So the PS4 is running at the same frame rate with the same particles, etc., while pushing twice the number of pixels? Doesn't this suggest that the PS4 has twice the graphics capacity? Now, if all games are going to be 900p vs. 720p, then that difference won't matter. But what about PS4 games outputting 720p natively? Won't that allow twice the framerate of the XO? I think that would make a difference in gameplay.
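One nit on the arithmetic: 900p against 720p is about a 1.56x pixel ratio, not 2x; only native 1080p against 720p exceeds doubling. The raw counts:

```python
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    # Ratio of each mode's pixel count to 720p's 921,600 pixels.
    print(f"{name}: {count:,} pixels ({count / pixels['720p']:.4f}x 720p)")
# 900p works out to 1.5625x 720p; 1080p works out to 2.25x 720p.
```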

For me the disappointment is more in the fact that this discussion is even taking place. These next iterations of the Xbox and PlayStation were all about adding more power (seeing as very little else about how games are actually played has changed on either), so the bare minimum should be that every game, especially ones from established, capable devs, can output at 1080p and 60fps. Otherwise there is little pressing need for a new generation of these two consoles.

Much more concerning to me are the visibly crushed blacks on the One. This could be due to capture issues, but I do see this from time to time today on the 360 (most heart-achingly in Mass Effect 3, the presentation of which is almost unplayable on the Xbox). I never have this issue in PS3 games, and Sony generally just seems to get HDMI right, which is not a surprise given their long history of making TVs and Blu-ray players. Really hoping this is not a characteristic of the One itself.

From a Digital Foundry follow-up article to their BF4 analysis:

Quote:

So what's going on with Xbox One?

Xbox One was also acquired at full range 0-255 RGB with the dash set accordingly, in line with the other versions - exactly how we approach every Face-Off. We have now had informal confirmation from a very good source that current Xbox One hardware has issues with full-range RGB output that are not present on either PC or PS4. We asked DICE for comment and were told that they are "investigating the video output range and settings for Xbox One".

Why isn't anyone else's footage from the same event showing the crushed blacks?

EA provided Elgato Game Capture HD consumer-level devices at the Stockholm event. They're great little units but they are unsuitable for DF work because they cannot acquire 1080p at 60 frames per second and they do not provide lossless video. The reason they do not show crushed blacks on Xbox One is because they operate at limited range RGB (16-235) and the XO dashes were set accordingly. We are told that the current Xbox One appears to be fine operating in limited range RGB, so this is a good match for the unit. We also know that both PS4 and XO hardware at the EA event was set by default to limited range RGB, so other attendees were good to go straight off the bat while we had to make changes to suit our more specialised equipment.
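The full-vs-limited range mismatch DF describes is simple to model. A minimal sketch, assuming the standard 8-bit video levels (limited range puts black at code 16 and white at 235): when a limited-range pipeline expands its signal back out to full range, any full-range values at or below 16 collapse into a single black.

```python
def expand_limited_to_full(v):
    """Expand a limited-range (16-235) code value to full range (0-255),
    clipping anything outside the 16-235 window -- which is what a
    limited-range pipeline does to whatever signal it is actually fed."""
    return min(255, max(0, round((v - 16) * 255 / (235 - 16))))

# Feed the pipeline full-range (0-255) shadow detail instead:
shadows = [0, 4, 8, 12, 16, 20]
print([expand_limited_to_full(v) for v in shadows])  # [0, 0, 0, 0, 0, 5]
# Five distinct dark greys collapse into one black: crushed blacks.
```

This is why the Elgato footage (limited range in, limited range expected) looked fine while DF's full-range capture showed crushing.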

Not enough of a difference now, and if anything, it's likely that the Xb1 will catch up some once devs get used to the ESRAM. Not to mention neither of the engines (COD / Frostbite) is optimized for this generation at all. I suspect we'll see 1080p 60 be pretty standard once they get to grips with both systems, though the slight advantage we saw multiplats have on the x360 last gen will likely be reversed.

Now, between native 1080 and 720, I suspect I'd notice some of those; I'm just in the "worth it for 1080" bucket.

1) Both consoles are fill-rate-bound below what is now the completely standard living-room resolution, at the START of this generation. That's utterly pathetic, bodes very poorly for any attempt to implement VR on these consoles, and means that they will be totally outclassed by even ARM-based tablets in a couple of years.

2) The fact that there isn't any difference to your eyes, and that this seems less important as you get older, doesn't mean that you've become more mature; it means that your visual acuity is getting worse with age.

The bottom line is that the resolution of each game is largely an artistic decision made by the developer, not a limitation of the hardware.

On PC, you have your graphics sliders and resolution setting. As a PC player, you decide which combination of resolution and graphics settings gives you the best visuals and framerates.

On console it's the same, except it's the developer tweaking the settings, not the player.

So if it's 720p, it's because the developer decided that resolution netted the best combination of visuals and framerate for that hardware. They could run BF4 on Xbox One at 900p or even 1080p if they wanted, but they'd have to "move some of the graphics sliders left a bit". Or have a reduced framerate.

We know, PS4's GPU is better than Xbox One's GPU. But Xbox One has a better video scaler, so it does a better job upscaling lower-resolution rendering. So developers can get away with running the game in lower resolution and turning the graphics sliders up a little bit.
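On the scaler point: 720p onto a 1080p panel is a non-integer 1.5x stretch, which is exactly where scaler quality matters. A toy nearest-neighbour sketch shows why (real hardware scalers filter rather than duplicate, but the uneven mapping is the same):

```python
def nearest_upscale_row(row, out_w):
    """Nearest-neighbour upscale of one row of pixel values."""
    in_w = len(row)
    return [row[i * in_w // out_w] for i in range(out_w)]

row = [10, 20, 30, 40]
print(nearest_upscale_row(row, 6))  # [10, 10, 20, 30, 30, 40]
# At a 1.5x stretch, some source pixels become two output pixels wide
# and others only one -- uneven widths that read as shimmer and crawl
# unless the scaler filters properly.
```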

It may not matter much now, but there is no way having better graphics performance is a negative, and it's one more positive for the PS4 compared to Xbox One - as if it needed any more to convince gamers.

Higher resolutions require higher-resolution textures, which require a shit ton more work on the development side, which means more expensive games, fewer low-budget titles, longer development times, and all manner of problems. On the customer side, the system will almost certainly eat more power and run hotter, which has a (very) small possibility of raising the console failure rate (if you recall from the 360's RROD problems, heat and consoles don't get along well).

This has nothing to do with either console, honestly - I don't plan on getting either for at least a year after release and I am in fact leaning towards the PS4. But to say there's no possible drawbacks is just plain false.

Higher-res textures do not require more work; often the reference texture is higher-res than the final product. Low-budget titles don't have to use them if they don't have the money to, either. As for the power and heat issue, on the PC side the differences in CU counts on AMD GPUs don't impact things much. These will be within 10 watts of each other.

So the Xbox One version already has frame rate issues without HBAO. And we're supposed to believe they'll turn that on?

You can't make up for the lack of processing power. The ESRAM will only be able to smooth over memory bandwidth issues.

It's funny how the press is downplaying this resolution disparity, when going from 720p to 1080p is almost exactly the same increase in pixel count, percentage-wise, as going from 480p to 720p: roughly 125%.
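The percentages check out, assuming widescreen 480p (854x480):

```python
def pct_increase(old, new):
    """Percentage increase going from one pixel count to another."""
    return (new / old - 1) * 100

p480, p720, p1080 = 854 * 480, 1280 * 720, 1920 * 1080

print(round(pct_increase(p480, p720)))   # 125
print(round(pct_increase(p720, p1080)))  # 125
```

So the 720p-to-1080p step really is the same relative jump that was widely treated as generational last time around.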

And those silly optimal-viewing-distance graphs that keep being posted every time similar topics come up are a lot less relevant when we're talking about 3D graphics rather than 2D video, since the native render resolution affects image quality beyond the sheer number of pixels: aliasing, shimmering, and other visual artifacts.

It's especially ironic that a site like Ars, which obsesses over ppi disparities on phones with 5" displays in every one of its reviews, should declare this a non-issue.

Never mind the implications this has beyond graphics resolution: what it says about the relative power of the two kits, and how that can affect the kind of games each will be able to enable going forward.

Poor fill-rate is an "artistic" decision? Not even slightly. The "artistic" side is constrained by the fact that, if you're going to try to go photorealistic, you need to use even more smoke and even shinier mirrors. 3D graphics is the art of pretending to show more than you actually are.

Stockholm isn't just a city in Sweden...

The artistic decision is whether to go with 1080p or 900p or 720p. You know what the fill-rate is going in. You know what the capabilities of your engine are. The decision is how best to allocate that fill-rate to achieve the most eye-pleasing result. In some cases, higher resolution does not yield the best result.

Again, they could probably run BF4 on Xbox One at 1080p. It probably wouldn't even look that bad. But they decided they wanted more effects, so they turned the resolution down.

If anything, the Xbox One presentation overall was actually slightly sharper than the PS4 despite the reduction in pixel count. Microsoft claims their scaler on the One is better than that on the 360, and as someone who runs a 360 into a Sony Bravia at 720p because the (much newer) TV does a better job scaling to 1080p, I believe this.

That sharpness you see in the XBO version? That's actually from the reduced RGB range in the game, which will be fixed shortly. It may look poppier, but it's actually crushing blacks (dark grey becomes total black), so despite looking punchier it's really only removing detail.

Heh, console gamers. This article proves the superiority of PC gaming once again.

I don't think console gamers have ever disputed PC superiority in terms of graphics; we know consoles will always be graphically inferior due to their closed nature. So you have to wonder what it is about the console space that makes console gamers prefer and choose an inferior product over a superior one. There really has to be something the inferior product brings to the consumer that the superior one doesn't.

Seriously, who cares? If you are playing a console, you are sitting on a couch 4-8 feet away. You can hardly tell the diff in most cases. GTAV looks great on a 42" 1080p display at 6'. At 2' it looks like shit.

Will I notice the difference between 720 and 900? For me, that depends entirely on the quality of FSAA.

If I can notice it on my computer screen with barely a glance, then you'll notice the lower resolution and inferior AA on your 46" TV.

You're right, I more than likely would notice it. But if the anti-aliasing is good enough, I'll be a lot less distracted by it. Of course it would also become more obvious after playing something on my PC gaming rig which is connected to the same TV.

I echo the sentiment that pretty much all the consoles should be putting out 1080p with decent-to-good FSAA by now. Yes, PCs will always own the scene when it comes to graphics prowess, but I feel like the consoles should be a little further along than what I've been reading in articles like this. I use both and I love both for their respective benefits.

Imagine a Skyrim remake. Same basic graphics, but when you walk into Whiterun the first time there's a couple hundred NPCs walking around. Imagine the tavern is so full at night you have to push your way through the crowd. Vast herds of mammoths. Attacking a fort with 50 Stormcloaks. Carts and wagons constantly traveling the trade routes. A detailed economy running in the background allowing your own trading activities.

They have the resolution they need. Start populating the game worlds more.

I feel like this is an oversight. I don't care a whole lot about the resolution itself; it's more what it implies. Microsoft talked up balance and whatnot, but the resolution and framerate differences make it very clear the hardware is significantly different. The resolution may be the only difference now, but later that may mean better effects, antialiasing, etc. at the same resolution (i.e., when both do 720p or 1080p in the same game, one will have much more power per pixel). Let alone first-party games.

Yes, optimizing around the eSRAM may be a factor, but the ROPs, TMUs, and compute units all speak for themselves.

This right here.

Kyle, you seem to be someone who usually goes out of his way to get the details right and makes sure to point out flaws that other people gloss over, so I'm not sure where this article came from. You seem to be trying really hard to ignore a pretty large performance difference between the two consoles.

You say that this being an early release means the importance isn't that great since things will be better optimized later on, but I don't get that logic. With both consoles having extremely similar hardware, I think it's safe to assume that both consoles will have similar optimization headroom later on in their life. If that's the case, the PS4 would still be getting much better performance in the late stages of the console's life.

Also, the whole "720p to 1080p only makes a big difference if you are really close" argument is just complete rubbish. A LOT of people sit within five feet of their TVs, especially while gaming. Lower resolutions can really take you out of the game, especially in slower paced areas when you have time to really look at things.

Imagine a Skyrim remake. Same basic graphics, but when you walk into Whiterun the first time there's a couple hundred NPCs walking around. Imagine the tavern is so full at night you have to push your way through the crowd. Vast herds of mammoths. Attacking a fort with 50 Stormcloaks. Carts and wagons constantly traveling the trade routes. A detailed economy running in the background allowing your own trading activities.

They have the resolution they need. Start populating the game worlds more.

That is exactly the sort of thing you can do with Xbox One's cloud processing feature. Calculating behaviors of hundreds or thousands of individual objects driven by detailed AI or physics, deep economies, etc.

Hey, has it occurred to anyone else that the XBone's poor performance probably, by extension, means that AMD's Mantle is not all it's cracked up to be, given that it's been conjectured to basically be the XBone's driver stack?

As the metrics emerge on key next-gen launch titles, it's clear that Xbox One is under-performing against its rival - not just according to the spec differential, but actually beyond the difference in raw numbers. Our Battlefield 4 Face-Off preview reveals a 50 per cent resolution boost on PlayStation 4 with no appreciable compromise in effects or performance in single-player gameplay, while Infinity Ward's Mark Rubin confirmed rumours that Call of Duty: Ghosts runs at native 720p on Xbox One, with 1080p a lock for PS4. Assuming uniform features and performance, that's a massive blow for Microsoft.

While Digital Foundry has yet to see either next-gen version of Call of Duty, our experience with Battlefield 4 demonstrates that you can easily see the visual difference between them. The Xbox One version holds up well given the gulf in resolution, but it doesn't require a pixel counter to tell that the PS4 game is crisper and cleaner either. At last week's Battlefield 4 review event in Stockholm, we noted that the resolution change from one version to the next was obvious to many of the press in attendance, with some even suggesting on-site that the PS4 version was operating at native 1080p when its actual resolution was 1600x900. Battlefield 4 is a beautiful game generally, but if it has one Achilles' heel common to both next-gen platforms, it is the pixel-crawl and sub-pixel break-up derived from the post-AA technique. Xbox One has bigger pixels and fewer of them, so naturally the most obtrusive element of the presentation is more of an issue when displayed on the same screen.

The reality for Microsoft is that the raw spec differential it has battled against is not only borne out in what is arguably the most technologically advanced multi-platform game of the next-gen launch, but the gulf actually increases on a title that, on the face of it, isn't pushing boundaries to anything like the same degree. Mark Rubin has previously suggested that there is no new Infinity Ward engine for the cross-generational Ghosts - rather that the studio has continued to build upon the existing tech. The situation is interesting in that we have a piece of technology that almost always favoured Microsoft's current-generation hardware now performing in a vastly superior manner on the competing platform in the next-gen era. It's a stunning turnaround.

That is exactly the sort of thing you can do with Xbox One's cloud processing feature. Calculating behaviors of hundreds or thousands of AI-driven objects.

Here's hoping they do. Open worlds are my video game crack, and all I've ever wanted is a population explosion in that space. Even the post-apocalyptic worlds of Fallout 3 and New Vegas feel too empty.

No matter how you spin it, dropping resolution and trying to smear it all together with DoF/AA/BLOOOOOOOM isn't an artistic decision; it's a technical compromise. If you were making the point that photorealism != artistry, then fine, but I'm not sure that's what you're saying. As I mentioned above, more power available = less effort spent having to fake it and more room for your original ideas.