I honestly feel like a lot of games don’t really need a new generation. Look at Fortnite, which runs on everything from a phone up to a high-spec PC, or indie games like Gris, Below, The Messenger, etc. A lot of games just don’t need crazy power, and that’s why I think we’ll see a ton of stuff that simply runs on a “family” of PlayStation devices (running at 4K/60 on PS5).

Only things from first-party studios will be “exclusive” to the PS5, Horizon 2 or God of War 2 for example. I think things from third parties will continue to be cross-gen because of the massive install base.

I doubt this turns out to be true. Developers have more reason to go exclusive next-gen than cross-gen due to the significant CPU improvements that could end up affecting general game design in a lot of ways.

You'll see more AAA games that look significantly better probably a year into next gen.

That Cyberpunk demo was running on a 1080Ti. What are the current speculations for next-gen GPU performance? I think what you saw there is more or less what a cross-gen game is going to look like, so basically next-gen games through 2021.

I don't understand why devs so often show visual fidelity that is not possible on their current-gen hardware, specifically gameplay visuals like the screens above. The final product is almost a generation behind. This is embarrassing.

RDR2 is the new graphics benchmark. Absolutely stunning visually. I honestly don’t know how they achieved what they made, even on base PS4. Every time I play it I’m gobsmacked. I know R* had a HUGE budget and all R* teams worked on it for nearly a decade. It just shows what can be done if they have the resources.

The people only expecting current games at native 4k/60fps will be very, very pleasantly surprised.

Right now, everything you see, from the geometric complexity of the worlds to the lighting engine to all the physics simulations, is built around a base Xbox One console, at least for multiplatform games and Xbox exclusives. Even PS4 exclusives are constrained by the now ancient base PS4 tech.

Companies are going to go from having a 1.6GHz Jaguar CPU and a 1.3/1.8tflop GPU as a base to a 2.8-3.2GHz Ryzen CPU and a ~12tflop GPU as a base.

The visual differences outside of image quality (for those with an X) will be momentous.

Even supposed cross-gen games like Death Stranding and Cyberpunk will look outdated very quickly compared to next-gen exclusive titles.
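The raw multipliers implied by the baseline numbers above can be sanity-checked in a few lines. To be clear, the next-gen figures here are just the speculated specs from this thread, not confirmed hardware:

```python
# Rough generational multipliers using the thread's speculated baselines.
# Next-gen numbers are guesses from the discussion above, not confirmed specs.

current_gen = {"cpu_ghz": 1.6, "gpu_tflops": 1.3}   # base Xbox One / PS4 class Jaguar + GCN
next_gen    = {"cpu_ghz": 3.0, "gpu_tflops": 12.0}  # speculated Ryzen + ~12 tflop GPU

cpu_clock_ratio = next_gen["cpu_ghz"] / current_gen["cpu_ghz"]
gpu_flops_ratio = next_gen["gpu_tflops"] / current_gen["gpu_tflops"]

print(f"CPU clock: ~{cpu_clock_ratio:.1f}x (before IPC gains, which heavily favour Ryzen over Jaguar)")
print(f"GPU flops: ~{gpu_flops_ratio:.1f}x")
```

Note the clock ratio undersells the CPU side, since Ryzen does far more work per clock than Jaguar.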

If PS5 is a cut-down Navi 10 (very likely), it's gonna be around 10 tflops, and a lot of that will be wasted on 4K. The resolution chase sucks up so many GPU resources it's not funny. Unless devs start aiming for 1080p as a base, the difference won't be that huge. Add things like better shadows, AO, draw distance, etc., and those 10+ tflops go fast.
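The "resolution chase" point is mostly pixel arithmetic. A quick sketch of the raw pixel counts (shading cost doesn't scale perfectly linearly with pixel count, so treat this as a rough upper bound on the resolution tax):

```python
# Pixels per frame at common render resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

# Native 4K shades 4x the pixels of 1080p, so a ~10 tflop GPU pushed to
# native 4K has roughly the per-pixel budget of a ~2.5 tflop GPU at 1080p.
```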

10tflops is still an almost 10x leap in base specs for devs to target vs the 1.3tflops of this gen. I don't get the "wasted 4k" nonsense. Image quality is very important to the way a game looks so it's not a waste at all.

Certain games will also continue to use checkerboard rendering, sparse rendering, AI rendering, dynamic resolution and all types of anti aliasing solutions.

People are also getting way too hung up on tflop numbers. Next gen only games will be jaw dropping with a massive leap in every facet of rendering from lighting, geometry, materials, textures and effects.
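A hypothetical illustration of the dynamic-resolution idea mentioned above: a toy controller that nudges the render scale each frame so GPU frame time converges on a target. Real engines filter the timing signal and clamp step sizes far more carefully; this is just the shape of the technique:

```python
# Toy dynamic-resolution controller: adjust the render scale so GPU frame
# time converges on the target (e.g. 16.6 ms for 60 fps).
# Purely illustrative, not how any specific engine implements it.

def update_render_scale(scale, gpu_frame_ms, target_ms=16.6,
                        min_scale=0.5, max_scale=1.0, step=0.05):
    if gpu_frame_ms > target_ms * 1.05:      # over budget: drop resolution
        scale -= step
    elif gpu_frame_ms < target_ms * 0.90:    # comfortably under: raise it
        scale += step
    return max(min_scale, min(max_scale, scale))

scale = 1.0
for frame_ms in [20.1, 19.0, 17.5, 16.4, 15.0, 14.8]:  # simulated GPU timings
    scale = update_render_scale(scale, frame_ms)
print(f"settled render scale: {scale:.2f}")  # fraction of native resolution per axis
```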

They will have all of what you said, at 1080p/30fps.
A 2080 Ti struggles with current-gen games at 4K with higher settings; there just isn't the hardware available for both 4K and a true next-gen leap. PS4 and PS5 will both be using GCN architecture, just the PS5 will have one with around 5x (2.5x vs the Pro) more grunt and some neat bonuses like primitive shaders (if AMD doesn't cancel them again lol). The jump from the PS3's honestly quite mediocre and outdated RSX (even in 2006) to the custom HD 7850/70 the PS4 had was an enormous leap, not just in performance but architecture-wise. That leap is just not happening this time, especially not when compared to the PS4 Pro.

I don't know man. RDR2 already looks ridiculous and runs at 4K30 on 6TFLOPS. You add another 4TFLOPS and a better CPU, and stay at the same resolution and framerate, that seems like a lot of elbow room leftover. Plus all these games we're looking at right now were ultimately still optimized around the base PS4.
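The elbow-room claim is easy to put a number on, using the 6 and 10 tflop figures from that post (both speculative, and flops alone don't capture architectural gains):

```python
# Per-pixel GPU budget headroom if resolution and framerate stay fixed,
# using the 6 -> 10 tflop figures from the post above (speculative).
current_tflops, next_tflops = 6.0, 10.0
headroom = next_tflops / current_tflops
print(f"~{headroom:.2f}x per-pixel GPU budget at the same 4K/30")
```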

The problem with your line of thought is that you're using the wasteful/inefficient higher settings offered in the PC versions of some games to try and draw performance parallels to efficient solutions designed from the ground up to target a higher spec. It's funny that people continually repeat this nonsense every generation.

This generation isn't the same as the others. As I said, Navi is just another GCN refresh; there isn't even a next-gen architecture at play. I wouldn't say those effects are wasteful either, it's just that devs have to make a compromise somewhere. The low-quality volumetrics in RE2 and God of War are not efficient, just low quality.

I'm open to a next-gen Final Fantasy going in so many potential directions. Although, I am still waiting for Final Fantasy battles to completely merge with cutscenes in one seamless presentation. I've always loved the thought of essentially playing a Visual Works cutscene in realtime. Square Enix has been flirting with this idea for a while now.

With Nomura finding himself directing a Final Fantasy yet again, I would love to see pivotal cutscenes reworked as playable set pieces in the FF VII Remake, especially considering you actually had (very limited) input in the original.

They're going to look like the best looking current gen games like God Of War, Detroit, or Dragon Quest XI, but bumped up. And of course there will be games that go beyond that and also all kinds of smaller productions that look like they could have been done on current gen hardware, not counting retro indies and stuff like that obvs.

I’m equally interested in the extra CPU power making games more realistic-looking and atmospheric, such as increased density of NPCs in cities, and NPCs moving around the world doing stuff. Also the additional memory being used to banish indoor/outdoor loading screens.

Think this post turned into a subconscious request for a next gen elder scrolls ...

4K will become standard, but the consoles (and any mid-gen refresh) won't be powerful enough for 5K. I think we'll get more powerful CPUs, but I'm not sure if that'll allow for 4K and 60fps at the same time except in exclusive titles from Sony/Microsoft. Games will likely just become more detailed, with better animations/AI etc., and on a much larger scale in terms of map size/density. There will of course be some improved visuals, but the biggest graphics improvement recently is real-time raytracing, and that'll be a PC thing for the next decade or so because it won't be coming to next-gen consoles. Next-next-gen consoles at the earliest, which will be near the end of the '20s. Well, I suppose consoles could just use pre-rendered, baked-in raytracing; I think some games have already done that.

I'm always confused by these graphical downgrades. If the original images as in Watch_Dogs and The Witcher 3 were running on real hardware at some point, what would that hardware have been? 2x 980Tis in SLI? And if so, why was there any need to downgrade it at all for PC users? Was it simply to avoid platform wars between the PC and consoles?

Once again people are underestimating a new gen.
I remember people saying we wouldn't get Star Wars 1313 graphics this gen.

However the leap will be smaller because of diminishing returns.

But things like this in VR should be possible next gen.

Skip to 1hr43m
Epic's tech demos have been a pretty good indicator in the past, so I think their recent graphics demos are a good bet.
One thing that is interesting is that UE4 is able to produce far better visuals than what is possible on X1/PS4; then again, I suppose UE3 did as well.

For me, graphics are good enough on the mid gens.
I hope devs focus more on world interactivity and world + NPC realism. We have a long way to go, but these things will make a game just as immersive as more realistic graphics.