3DMark benchmarks have been putting PCs through their paces for over 20 years, and last month 3DMark’s latest test went live. Called Port Royal, the new benchmark moves users through a ray-traced environment, which features real-time ray-traced reflections and shadows, to test the performance of GeForce RTX GPUs and their RT Cores, the tech that makes real-time ray tracing a reality.

Now, DLSS has been added to Port Royal, enabling GeForce RTX users to see the performance and image quality benefits of this game-changing technology.

In our testing, DLSS boosts Port Royal performance by up to 50%, increasing framerates across our complete range of GeForce RTX GPUs.

Predictive AI enhancement for a static benchmark is meaningless. Couldn't care less about this. Until it's in a game with ray tracing, it's just a glorified AA solution.

Isn't it supposed to be just a glorified AA solution? In fact, that is what it would be best at IMO: give me an image rendered at 4K and use the neural net to do good AA without needing to supersample. 😛

Isn't it the opposite of supersampling? In supersampling the screen is rendered at a higher resolution and then downscaled with a simple scaler,

while DLSS on the other hand renders at a lower resolution and then upscales with a smart scaler.
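The contrast in the comment above can be sketched in a few lines. This is a toy illustration only, using a 1-D row of pixel values: the pair-averaging downscale stands in for SSAA's box filter, and a nearest-neighbour duplicate stands in for upscaling (DLSS itself uses a trained network, not this).

```python
def supersample(render_at_2x):
    """SSAA-style: render at 2x the target size, then average pairs down."""
    return [(render_at_2x[2 * i] + render_at_2x[2 * i + 1]) / 2
            for i in range(len(render_at_2x) // 2)]

def naive_upscale(render_at_half):
    """Upscaling-style: render at half size, then duplicate each pixel.
    (A 'smart scaler' would predict detail here instead of duplicating.)"""
    return [p for p in render_at_half for _ in range(2)]

# Target output width: 4 pixels.
high_res = [0, 10, 20, 30, 40, 50, 60, 70]   # 8 samples shaded (expensive)
low_res = [5, 25]                            # 2 samples shaded (cheap)

print(supersample(high_res))   # [5.0, 25.0, 45.0, 65.0] - more work, filtered down
print(naive_upscale(low_res))  # [5, 5, 25, 25] - less work, detail guessed
```

The asymmetry is the whole point: supersampling pays for extra shaded samples and throws information away; upscaling shades fewer samples and has to invent the missing information.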

Quote:
Originally Posted by ZealotKi11er

Doesn't it actually look better with DLSS from that video?

Probably because the benchmark renders predetermined frames, so the trained AI already knows what to adjust to make them look better.

ULmark and their DLSS demos are a load of rubbish. The TAA I've seen them implement is one of the worst ever, and who uses TAA in the first place when there is MSAA, SMAA, TSSAA 8TX, etc.?
Compare 1440p vs. 1440p+DLSS and 4K vs. 4K+DLSS, not this fake-4K upscaling where it's comparing native 4K vs. 1440p+DLSS. Of course 1440p+DLSS is faster than rendering native 4K.

Nvidia is probably still waiting to enable DLSS 2X, which is the native-resolution DLSS mode without upscaling. They likely know very well that it's slower, or not really beneficial over traditional AA methods.
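The back-of-envelope arithmetic behind the complaint above is straightforward: rendering at 1440p and upscaling shades far fewer pixels than native 4K, so a speedup is expected no matter what the scaler is. The pixel counts below are exact; treating framerate as proportional to pixels shaded is only a rough assumption.

```python
# Pixels shaded per frame at each resolution.
native_4k = 3840 * 2160      # 8,294,400 pixels
render_1440p = 2560 * 1440   # 3,686,400 pixels (DLSS "4K" internal render)

ratio = native_4k / render_1440p
print(f"Native 4K shades {ratio:.2f}x the pixels of a 1440p render")  # 2.25x
```

That 2.25x gap in shading work is why a 1440p+DLSS vs. native-4K comparison flatters DLSS; an apples-to-apples test would hold the internal render resolution constant.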

Isn't it the opposite of supersampling? In supersampling the screen is rendered at a higher resolution and then downscaled with a simple scaler,

while DLSS on the other hand renders at a lower resolution and then upscales with a smart scaler.

DLSS doesn't have to use a lower resolution; Nvidia could simply train a neural network to do great AA. When we first heard about it, this is what it sounded like. Lower resolution is faster, but I bet a neural network could offer great AA quality faster than 4xSSAA while working with the full resolution.
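The cost argument in the comment above comes down to sample counts: 4xSSAA shades four samples per output pixel, while a full-resolution AA network would run on a frame shaded once. This sketch only counts shaded samples; the network's own inference cost is deliberately left out, since the claim is just that its input scales with 1x the pixels, not 4x.

```python
width, height = 3840, 2160
native_pixels = width * height

ssaa_samples = native_pixels * 4    # 4xSSAA: four shaded samples per pixel
nn_aa_inputs = native_pixels * 1    # network AA: one shaded sample per pixel

print(ssaa_samples // nn_aa_inputs)  # 4 -> SSAA shades 4x as many samples
```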

Quote:
Originally Posted by epic1337

probably because the render benchmark had predetermined frames, where the trained AI is already aware of what to adjust to make it better.

Yes, but there is only one AI and a lot of frames, so it still has to work well on all the frames with only one set of weights. Nvidia is probably not using a separate neural net for each frame.

The variety of frames is still really low compared to a game, due to the fixed nature of the benchmark, but it is more of a best case for DLSS than outright cheating.
