What’s weird is we already have a somewhat reliable idea of what 7nm Ryzen with Navi is, and to some degree what it would be capable of. So why guess so high, knowing you're already wrong, unless you intentionally lied in your analysis?

His entire career is defined by being wrong most of the time. If you make a wrong, outlandish claim, it gets a lot of views from people telling you what an idiot you are. This boosts ad traffic and keeps publications in business. It's how much of modern media works.

You guys are going to feel so silly when Pachter nails this prediction.

Care to put your money where your mouth is? I'm willing to bet not even 5% of games on the next-gen consoles will be able to perform at those settings. I say 5% since I could see a few demos or retro titles being tweaked to hit that.

Besides, betting against Pachter isn't exactly a risky proposition; he's wrong 75% of the time. He's so consistently wrong that you'd make statistically better predictions by assuming the opposite of whatever he says.

I think realistically the next-gen consoles will do 4K/60 fps in some titles and checkerboard rendering in others, with 60 fps as a minimum requirement, and hopefully 1440p @ 90 fps per eye for VR. That's what I predict.

Newer AMD video cards are not much faster than the Xbox One X, and the rate of improvement for video cards is now very slow, just like it is for CPUs.
So my guess is the Xbox Two will have twice the video horsepower of the One X, and its CPU will be a Zen 2, which will be 2-3x faster than the One X's.

What I am hoping for is 4K/60 Hz on most games. The console may be able to run higher refresh rates, however 99% of people use 60 Hz screens, so no developer will waste their time on 120 Hz frame rates.

1080p@60 has been out of reach, and you're hoping for 4K@60? Even if it gets the variable refresh rate someone else brought up, it doesn't matter if the game keeps dipping into "cinematic mode" for half the time you spend playing it.

So in other words, you are clueless. Again, textures have basically no impact on performance as long as you have enough VRAM, and your ignorance of the subject changes nothing. They don't even have to use potato textures now, and the next generation of consoles will have even more VRAM.

Microsoft already added 120 Hz adaptive sync support to the Xbox One, even though its games remain locked to 60 fps. 1080p/120 Hz (not sure about 144 Hz) and 4K/60 Hz are going to happen. As for 240 Hz, I would be skeptical of seeing native 1080p/240 Hz with no downsampling. 240 Hz is definitely aimed at VR, but I would love to see it in some fast-paced games.

Since the new consoles will be x86, full backward compatibility is going to happen very quickly. I expect many existing titles will get a big FPS boost if they are supported. Hopefully most PS4/Xbox One games locked at 30 fps will go to 60 on the newer consoles over time, as the hardware will support that.

The biggest hurdle will be adoption of 4K plus high refresh rates. While many 4K TVs support 1080p/120 Hz, most console gamers do not have that capability, so the 120 Hz adoption rate could be a sticking point.

We also need to be careful about what resolution these consoles claim to be rendering at. They are going to cheat their asses off to get 4K above 60 fps.

1080p@60 has been out of reach, and you're hoping for 4K@60? Even if it gets the variable refresh rate someone else brought up, it doesn't matter if the game keeps dipping into "cinematic mode" for half the time you spend playing it.

A lot of games designed for PS4 Pro and Xbox One X run at 1080p/60. The GPU in both systems is powerful enough to render 1080p/60, but both have weak CPUs that are unable to keep up. I think in the next generation, both the CPU and the GPU will be able to run most, if not all, games at 1440p/60 (maybe some will even push it up to 4K). The problem is that both MS and Sony give developers the liberty to do whatever they want, and most find it a lot easier to develop games locked at 30 fps because a) they value art over frame rate and b) with limited CPU power, it is hard to ensure a constant 60 fps.
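To put the 30-vs-60 fps tradeoff in concrete terms, the per-frame time budget is simple arithmetic. A minimal sketch (the frame rates are the ones discussed in this thread; nothing here is measured from real hardware):

```python
# Frame-time budget per target frame rate. A game only holds a locked
# frame rate if CPU simulation plus GPU rendering for every single
# frame fits inside this budget.

def frame_budget_ms(fps: int) -> float:
    """Milliseconds available to simulate and render one frame."""
    return 1000.0 / fps

for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):5.2f} ms per frame")
```

Going from 30 to 60 fps halves the budget to 16.67 ms, and 240 fps leaves barely 4 ms per frame, which is why a weak CPU is the usual blocker long before texture quality is.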

Given the work Microsoft put into backwards compatibility, making the One support both 360 and original Xbox games, I would say it is a given that it will be in the new Xbox. Almost have to say the same for 4K, as 4K is supported by both platforms now. So half the stuff is a no-shit given for the next generation. It will also support HDMI, use USB 3, and support controllers. It will also support TCP/IP v4 and v6.

So in other words you are clueless. Again textures have basically no impact on performance as long as you have enough vram and your ignorance on the subject changes nothing. They don't even have to use potato textures now and the next generation of consoles will have even more vram.

It takes time for high-res textures to get loaded into VRAM from disk, so VRAM bandwidth (data transfer rate) is also important on top of the VRAM quantity.
I've seen this first hand with quite a few games, especially DOOM (2016) with its Nightmare texture setting and Fallout 4 with the HD texture pack on a GTX 980 Ti.

With the consoles, the unified memory architecture helps immensely with this task, since data normally doesn't have to be copied twice, from disk to RAM and then to VRAM, but only from disk to unified RAM, so that is definitely a plus.
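As a rough illustration of why storage speed matters for texture streaming, here is a back-of-the-envelope sketch. The texture size is exact arithmetic for an uncompressed RGBA8 image; the drive bandwidths are assumed round numbers, not benchmarks:

```python
# Back-of-the-envelope texture streaming cost. An uncompressed
# 4096x4096 RGBA8 texture is 64 MiB before mipmaps; the drive
# bandwidths below are illustrative assumptions.

def texture_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one uncompressed texture in MiB."""
    return width * height * bytes_per_pixel / 2**20

def load_time_s(size_mib: float, bandwidth_mib_s: float) -> float:
    """Seconds to stream a texture of the given size from storage."""
    return size_mib / bandwidth_mib_s

size = texture_mib(4096, 4096)
print(f"4K texture: {size:.0f} MiB")
print(f"HDD (~100 MiB/s):      {load_time_s(size, 100):.2f} s")
print(f"SATA SSD (~500 MiB/s): {load_time_s(size, 500):.2f} s")
```

Real games use compressed texture formats that shrink these numbers several-fold, but the ratio between drives is the point: the slower the storage, the more visible the streaming hitches.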

The point is that he is clueless for thinking texture resolution is any kind of limitation on getting 240 fps. It is the very last thing that would ever matter if there is enough VRAM, so saying they would need to use potato-level textures to get 240 fps is beyond stupid.

You are right, it doesn't look like higher-resolution textures negatively impact performance.

What this all boils down to is that you won’t lose framerate performance in this game using the High Resolution Texture Pack. You might be affected by some smoothness issues if you have a 4GB video card however. Video cards from 6GB and upwards should not have any smoothness issues.

We also want to emphasize that hard drive performance can help with the high resolution texture pack, especially with lower amounts of VRAM. Your hard drive will simply be accessed more to load these large texture assets. An SSD is optimal. A spinning disk may experience longer pausing or hitching as it loads data more slowly.

If the number of textures were to increase (regardless of their resolution), that would be a different story, as the GPU would have to process more of them.
But if the number of textures stays the same, then yes, a resolution increase normally only puts more load on the disk access needed to get those textures into RAM/VRAM/unified RAM, not on overall GPU computation.
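The distinction above can be made concrete with a quick VRAM-budget sketch: doubling texture resolution quadruples memory use even though per-pixel sampling work stays roughly constant. The texture counts and uncompressed 32-bit format are illustrative assumptions (real games use block-compressed formats like BC7, which shrink these figures several-fold):

```python
# VRAM cost scales with texture count and resolution, while per-frame
# GPU sampling cost barely changes: a texture fetch is one lookup
# regardless of how large the source image is. Numbers are illustrative.

def vram_gib(num_textures: int, width: int, height: int,
             bytes_per_pixel: int = 4, mipmaps: bool = True) -> float:
    """Total VRAM in GiB for a set of same-sized uncompressed textures."""
    base = num_textures * width * height * bytes_per_pixel
    # A full mip chain adds roughly one third on top of the base level.
    total = base * 4 / 3 if mipmaps else base
    return total / 2**30

print(f"500 x 1024^2 textures: {vram_gib(500, 1024, 1024):5.2f} GiB")
print(f"500 x 2048^2 textures: {vram_gib(500, 2048, 2048):5.2f} GiB")
print(f"500 x 4096^2 textures: {vram_gib(500, 4096, 4096):5.2f} GiB")
```

This is why the argument in the thread hinges on VRAM quantity: once the whole set fits in memory, resolution is nearly free at runtime, but a 4x step in resolution can blow the budget entirely.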

240 FPS, at 4K even? I'll believe it when I see it. This is a bit like the recent AMD rumors: going to have to see more, because so far it sounds like hype (too good to be true).

1080p 240 fps I can maybe buy, 4K however....

Like AMD... Hmm... You're making yourself look like a fanboy, so maybe post the link to those rumors so you won't be judged for spreading fake news?
The concept is fake. Current PC GPUs can't claim that. Stop the BS.

The point is that he is clueless for thinking texture resolution is any kind of limitation on getting 240 fps. It is the very last thing that would ever matter if there is enough VRAM, so saying they would need to use potato-level textures to get 240 fps is beyond stupid.

You are right, it doesn't look like higher-resolution textures negatively impact performance.

Not exactly, but a lot of people get confused by the difference between a skin and a texture. In the bad old days, a 2-dimensional image used as a wrap for a 3D object was a 'skin'*, and if it was directly proportional to the 3D object it was called a 'map'. A texture was an additional wrap that identified highs and lows on a skin and, depending on the direction of the light source, gave the 2D skin the appearance of a 3D surface. Nowadays there are a lot of different textures that can be applied to a skin or map.

*Note: We think of skins being form-fitted to a particular object, but this wasn't always the case - in the 90's, developers would often make 'skins' or 'wraps' as large, square graphics, and then 'dip' the 3D object into the graphic. This way they could use one graphic for many different purposes (if you played Everquest you saw a lot of this - a grass skin that was used as an animal skin or the leaves of trees, stone that was used for beaches, buildings or cliff walls, wood that was used for bark, doors, docks, and walls, etc.) In 2018 we think of 'skins' as being form-fitting colorizations for guns or clothing, and that kind of makes sense, since they are a 'skin' over an object. But in the old days, this would be correctly referred to as a 'map', since the edges of the graphic corresponded to points on the 3D model.

At some point, a skin that had a texture file attached started to be referred to as a 'texture'. And I can guarantee you, if you increase the resolution of a texture that has a lot of bump, light and opacity mapping (among other things) you'll slow your fps.

And I can guarantee you that I can compare performance between low and very high textures in any game out there, and the difference will be almost meaningless, if there is a difference at all. So again, saying we would need potato textures is idiotic, as it has nothing to do with being able to hit high framerates or not.

It is obvious that Mr. Pachter (Sony wins again) got it wrong. I mean, 240 Hz is not supported in any standard that I know of, unless there is some form of DisplayPort that allows it, and certainly not at 4K. That might mean he heard the number from someone or somewhere. Maybe it is a VR-related comment, where they would have support for 120 Hz per eye, and even then certainly not at 4K.

I can see 4K/60 fps and/or 1080p/240 Hz, with both options provided for an a la carte consumer-preference approach. As for 4K/240 Hz, this analyst is dreaming, or really meant to say what I mentioned, and is clearly not tech savvy.
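For what it's worth, the "240 Hz isn't in any standard at 4K" point can be sanity-checked with raw bandwidth arithmetic. The pixel math is exact; the ~20% blanking overhead and the link capacities quoted in the comments are approximations:

```python
# Uncompressed video link bandwidth, back of the envelope.
# Assumptions (not from the thread): 24-bit RGB and ~20% blanking
# overhead. For reference: HDMI 2.0 carries ~18 Gbps,
# DisplayPort 1.4 ~32.4 Gbps, HDMI 2.1 ~48 Gbps.

def gbps(width: int, height: int, hz: int,
         bits_per_pixel: int = 24, blanking: float = 1.2) -> float:
    """Approximate link bandwidth needed for an uncompressed signal."""
    return width * height * hz * bits_per_pixel * blanking / 1e9

for w, h, hz in ((1920, 1080, 240), (3840, 2160, 60), (3840, 2160, 240)):
    print(f"{w}x{h} @ {hz} Hz: ~{gbps(w, h, hz):.1f} Gbps")
```

Note that 1080p/240 and 4K/60 need the same raw pixel rate and fit comfortably in existing links, while uncompressed 4K/240 would exceed even HDMI 2.1 without stream compression such as DSC.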
