Platinum Member

OK, found out why. They used different benches: Guru3D used the game's built-in benchmarking tool, which should more accurately reflect both the demanding and the light parts of the game, while TPU benchmarked actual gameplay and apparently only ran through the lighter sections.

Golden Member

I wish sites would post a video of their test sequence if they choose to roll their own, otherwise it's hard to put any credence in the results.

It also looks like ray tracing can only be set to High or Ultra, so when they test High they are really testing what I think most would call Medium.

Really good-looking game though, and RTX used for GI is, I think, its best use case right now. I know someone in a "tear down" video pointed out some spots where the GI looked a bit noisy, but you'd probably never see it while playing unless you stopped to look for it. Always liked the Metro games; might have to pick this one up too.

It's a one-year exclusive on Epic, then it will be back on Steam. The backlash was pretty heavy, which is good; I think a lot of other game makers will think twice when offered the same deal.

As for the game, well, the lighting is just better. It needs a seriously powerful GPU to manage it (with RTX), but you get next-gen lighting if you have it. Reminds me of the olden days when new games came out and actually made GPUs struggle.

Pretty sure it will be back on Steam at some point, as they recently mentioned, just not for some time. I think an Epic spokesperson said this in the last couple of days, as noted in the angry Metro thread over in the PC Games forum.

Diamond Member

I'm starting to feel like the enthusiast PC gaming nerd is a relic, slowly dying out. What I mean is, people seem (at least from my perspective) more accepting of this garbage blur than back when card reviewers did image quality comparisons.

With the introduction of on-the-fly scaling (thanks, consoles) there is more support for "as long as it's 4K" versus "it has to be native 4K". I got into a discussion with my oldest nephew, who is into gaming, and he didn't seem to care when I brought up the blurring. "But if your FPS is higher, why does it matter?" I'm like, "Because it's blurry, it's a blurry mess," and of course he comes back with, "Do you stop to look at the scenery, or are you playing a game?" Outside of a "stop and smell the roses" kind of reply, I had nothing else to say.

If RTX+DLSS gives people more visual punch at the expense of blur, I get the feeling they'll take it. CGI in movies has gotten to the point where you can't even see where a frame starts and ends, it's all one blurred mess. And the kids love it!

Platinum Member

Yeah, the blur from DLSS is a no-go for me. I buy top-end cards for top performance and image quality. RTX BF5 performance has improved a lot since launch, though, and 4K RTX Ultra without DLSS on a 2080 Ti is very playable now for single player, IMO.

Golden Member

I'm starting to feel like the enthusiast PC gaming nerd is a relic, slowly dying out. What I mean is, people seem (at least from my perspective) more accepting of this garbage blur than back when card reviewers did image quality comparisons.

With the introduction of on-the-fly scaling (thanks, consoles) there is more support for "as long as it's 4K" versus "it has to be native 4K". I got into a discussion with my oldest nephew, who is into gaming, and he didn't seem to care when I brought up the blurring. "But if your FPS is higher, why does it matter?" I'm like, "Because it's blurry, it's a blurry mess," and of course he comes back with, "Do you stop to look at the scenery, or are you playing a game?" Outside of a "stop and smell the roses" kind of reply, I had nothing else to say.

If RTX+DLSS gives people more visual punch at the expense of blur, I get the feeling they'll take it. CGI in movies has gotten to the point where you can't even see where a frame starts and ends, it's all one blurred mess. And the kids love it!

Most people I know who game don't care about framerate, resolution, and all that good stuff. I can't imagine sinking nearly 1,000 hours into a game locked at 30 fps at a weird downscaled resolution when I could play it at unlocked fps at native resolution.

I have the same issue with Gears of War 4 on PC, though. TAA is globally enabled to boost performance; it's just a nice smear of vaseline over the image.


Diamond Member

I got into a discussion with my oldest nephew, who is into gaming, and he didn't seem to care when I brought up the blurring. "But if your FPS is higher, why does it matter?" I'm like, "Because it's blurry, it's a blurry mess," and of course he comes back with, "Do you stop to look at the scenery, or are you playing a game?" Outside of a "stop and smell the roses" kind of reply, I had nothing else to say.

If RTX+DLSS gives people more visual punch at the expense of blur, I get the feeling they'll take it.

I don't get that feeling at all, and it's based on the same discussion you had with your nephew: he cares about FPS, about how the environment feels in motion, not about whether he can stop and enjoy rose-petal reflections in puddles.

There's an interesting paradox here: DLSS and RT were introduced as complementary features in RTX, with DLSS purposely aimed at increasing performance in order to compensate for the cost of RT features (reflections, illumination, etc.). But for that goal to be achieved, the dichotomy is inescapable: DLSS must sacrifice image quality for performance, so that RT can do the exact opposite.
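The trade described above can be put in rough frame-budget terms. All the costs below are illustrative round numbers I've made up to show the mechanism, not measurements from any game:

```python
# Illustrative frame-time budget at a 60 fps target (~16.7 ms/frame).
# All per-feature costs are hypothetical, chosen only to show the trade.
BUDGET_MS = 1000.0 / 60

raster_native = 14.0   # assumed raster cost at native resolution
rt_cost       = 8.0    # assumed added cost of RT effects
dlss_scale    = 0.5    # DLSS shades roughly half the pixels...
dlss_overhead = 1.5    # ...plus a fixed upscaling cost per frame

# Native rendering plus RT blows the budget:
native_plus_rt = raster_native + rt_cost                           # 22.0 ms

# DLSS frees up enough raster time to pay for RT:
with_dlss = raster_native * dlss_scale + rt_cost + dlss_overhead   # 16.5 ms

print(native_plus_rt <= BUDGET_MS, with_dlss <= BUDGET_MS)
```

The point of the sketch is simply that the performance DLSS buys back (at an image-quality cost) is what gets spent on RT (an image-quality gain), which is the dichotomy described above.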

What I suspect you would like to see is DLSS working as an "independent" feature, and that is DLSS 2X: running the game engine at native resolution and having the AI improve on that. Maybe we'll see that in the future.

Member

According to GN (Gamers Nexus), there is a significant performance difference between the built-in benchmark and actual gameplay.

Honestly, I don't think the performance is that bad. Ultra + RTX High = 50 fps average at 4K on the 2080 Ti. That's actually pretty impressive, and given the large performance delta between Ultra and High, you could probably run High + RTX High at 4K/60.
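Some quick frame-time arithmetic, using the 50 fps figure quoted above, shows how small the gap to 4K/60 actually is:

```python
def frame_time_ms(fps):
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

current = frame_time_ms(50)   # 20.0 ms/frame at Ultra + RTX High
target  = frame_time_ms(60)   # ~16.7 ms/frame needed for 4K/60

# Dropping Ultra -> High would need to shave this much off each frame:
needed_saving  = current - target   # ~3.3 ms
needed_speedup = 60 / 50            # i.e. 20% more throughput

print(round(needed_saving, 2), needed_speedup)
```

A 20% uplift from dropping one preset level is plausible given the "large delta" between Ultra and High mentioned above, though only a benchmark would confirm it.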

And it looks like the RTX 2060 is capable of 1080p/60 with RTX on. That's actually pretty good.

Diamond Member

Most people I know who game don't care about framerate, resolution, and all that good stuff. I can't imagine sinking nearly 1,000 hours into a game locked at 30 fps at a weird downscaled resolution when I could play it at unlocked fps at native resolution.

I have the same issue with Gears of War 4 on PC, though. TAA is globally enabled to boost performance; it's just a nice smear of vaseline over the image.

My post probably came off lacking some details, but this is basically what I mean. I get that image quality is subjective, but I see more "gamers" quick to throw native resolution out the window for quasi-4K that introduces a myriad of distortions: blurriness, rendering issues, etc.

Oh yeah, that's always been a given. I guess what makes it weird now is that our GPU vendors are pushing for this stuff. Historically they've pushed for higher frame rates and higher resolutions; now we see NV dipping its toes into upscaling. Someone had some great comparison shots (granted, still images, so take that for what it's worth) of basically... wait, let me just link them.

I don't get that feeling at all, and it's based on the same discussion you had with your nephew: he cares about FPS, about how the environment feels in motion, not about whether he can stop and enjoy rose-petal reflections in puddles.

If that were true he wouldn't load up mods that essentially cripple his frame rate. He asked me why I cared, since I'm the one that's always saying "I just bought a new video card, my frame rates improved," so he thinks I only care about frame rate. When I showed him my settings, he asked why mine look so sharp. I said I don't use SMAA or TAA; I use SSAA whenever I can, which is why I often talk about GPU performance, i.e., frame rates. He games at under 30 fps because he uses texture mods that basically destroy his VRAM. I've tried to explain it, but "it runs better than my Xbox"... well, of course it does, but still. Oh well, let them be.

EDIT: I also set up their PC for TV gaming, so they run at 1080p@60 and their GPU requirements aren't as high. When they come over, I switch to the living room TV, which is also 1080p@60, so I've got tons of processing power to slather some SSAA on the games they'd most likely want to play (Minecraft for the younger ones, COD for the older ones).

Senior member

Probably a stupid question, but how is enabling DLSS any different from adjusting the resolution slider in a game's graphics options? To me, it looks like you're losing a fair amount of sharpness just to gain performance, except you could do the exact same thing by decreasing the internal resolution a bit, get a similar performance boost, and not lose as much image quality.
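The comparison being made can be sketched with simple pixel-count arithmetic: both DLSS and a resolution slider reduce the number of internally shaded pixels, and shading cost scales roughly with that count. The 1440p internal resolution for 4K DLSS is an assumption here; the actual ratio varies by title and quality mode.

```python
def internal_pixels(out_w, out_h, render_scale):
    """Approximate number of pixels shaded per frame at a given render scale."""
    return round(out_w * render_scale) * round(out_h * render_scale)

native = internal_pixels(3840, 2160, 1.0)          # native 4K: 8,294,400 px
slider = internal_pixels(3840, 2160, 0.5)          # 50% slider: 2,073,600 px
dlss   = internal_pixels(3840, 2160, 1440 / 2160)  # assumed 1440p -> 4K upscale

# Both approaches shade far fewer pixels than native; the difference is
# only in how the image is brought back up to the output resolution
# (simple scaling filter vs. DLSS's trained upscaler).
print(native, slider, dlss)
```

So in raw pixel-count terms the two options are doing the same thing; the debate in the thread is whether DLSS's reconstruction recovers enough sharpness to justify it over a plain resolution drop.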

Ray tracing is a different matter, since it looks good in certain areas of the game but doesn't seem to make much of a difference in others. It sounds to me like most people would be happy playing the game with RTX off. To me, it's just a nice bonus, not a "must have" feature needed to enjoy the game.