Nvidia Shares RTX 2080 Test Results: 35 - 125% Faster Than GTX 1080

Over 50% under certain conditions; most likely less than that across the board without the use of AI-powered anti-aliasing.

We’re not expecting to average 50%-higher frame rates across our benchmark suite. However, enthusiasts who previously speculated that Turing wouldn’t be much faster than Pascal, due to its relatively modest CUDA core counts, weren’t taking the underlying architecture into account. There’s more going on under the hood than the specification sheet suggests.

This is what I mentioned earlier. Now the 2080 Ti has almost 50% more Turing CUDA cores than the 2080, holy shit, that thing is going to be a beast.

Hardware support for USB Type-C™ and VirtualLink™(1), a new open industry standard being developed to meet the power, display and bandwidth demands of next-generation VR headsets through a single USB-C™ connector.

New and enhanced technologies to improve the performance of VR applications, including Variable Rate Shading, Multi-View Rendering and VRWorks Audio.

Not me, I'm a day-1 purchaser; got $5k set aside for a new rig and CV2, lol. I'll most likely wait for the 1180 Ti before upgrading my GPU, though. Hopefully my 1080 can push it enough for the few months' wait.

Remember kids, August 21st to 25th is Gamescom, where Nvidia has teased "amazing surprises". The rumour is the new GPUs, with gamer-centric features like RTX (real-time ray-tracing technology), will be revealed. Possible inclusion of the VirtualLink port dedicated to VR may also be announced.

And a month later, in September (26th and 27th), is Oculus Connect, where we're bound to see a new hardware reveal (possible pre-order?) and game announcements.

I predict Nvidia has pushed the GPU announcement to Gamescom instead of their own keynote because one month later Oculus will announce the CV2. (I'm starting to think Santa Cruz is not coming yet; they'll let Go breathe and focus on CV2 this year, with SC next year. That's what I'm feeling at the moment.)

I would love this, but tbh I doubt it. Oculus have high-end PC covered and entry-level mobile; I suspect high-end mobile is something they will want to cover before replacing their PC HMD. I hope I am wrong! In principle I am a day-1 CV2 buyer, but it does depend on price and technology. If it uses VirtualLink, and if the 1180 has that port, that will be a point in its favour, as I can free up a USB port... Though I'm not sure what that will mean for extension cables. I do not want to go back to a 3m cable; 5m is the minimum I need.


That's a good point. VirtualLink is using USB-C with a USB 3.1 channel (and four DisplayPort lanes). The USB-C standard only supports 2m cables for USB 3.1 Gen 1 and 1m cables for Gen 2, so hopefully VirtualLink changes the standard far more than it sounded like.

We will no doubt see those three features on the new GeForce RTX cards. From the press release:

Hardware support for USB Type-C™ and VirtualLink™(1), a new open industry standard being developed to meet the power, display and bandwidth demands of next-generation VR headsets through a single USB-C™ connector.

New and enhanced technologies to improve the performance of VR applications, including Variable Rate Shading, Multi-View Rendering and VRWorks Audio.

RTX real-time ray tracing (rumoured to be in the new GeForce cards too) will take movie VFX and games to a new level of photorealism; all major game engines (Unreal/Unity) will have support, as will the latest CGI rendering engines.

AI and machine-learning computational magic called DLSS (Deep Learning Super Sampling) that takes a lower-res image and upsamples it to create a super high-quality image (a sketch of the plain, non-AI upscaling it replaces follows below).

In VR, screen-space effects like reflections, ambient occlusion and global illumination basically fall apart (the trickery is glaring in stereo), so having dedicated real-time ray-tracing cores will usher in a new level of realism and graphics fidelity in VR with the new series.
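DLSS itself is a trained network, so I can't sketch that, but for contrast here's the dumb baseline it's competing with: plain bilinear upscaling. A minimal C++ sketch, grayscale only, with illustrative names (nothing here is from any Nvidia API):

```cpp
// Plain bilinear upscale of a grayscale image stored as a flat float array.
// This is the non-AI baseline; DLSS-style reconstruction replaces this blend
// with a learned model that can hallucinate detail instead of blurring.
#include <algorithm>
#include <cmath>
#include <vector>

std::vector<float> upscaleBilinear(const std::vector<float>& src,
                                   int srcW, int srcH, int dstW, int dstH)
{
    std::vector<float> dst(static_cast<size_t>(dstW) * dstH);
    for (int y = 0; y < dstH; ++y) {
        for (int x = 0; x < dstW; ++x) {
            // Map the destination pixel centre back into source coordinates.
            float sx = (x + 0.5f) * srcW / dstW - 0.5f;
            float sy = (y + 0.5f) * srcH / dstH - 0.5f;
            int x0 = std::clamp(static_cast<int>(std::floor(sx)), 0, srcW - 1);
            int y0 = std::clamp(static_cast<int>(std::floor(sy)), 0, srcH - 1);
            int x1 = std::min(x0 + 1, srcW - 1);
            int y1 = std::min(y0 + 1, srcH - 1);
            float fx = std::clamp(sx - x0, 0.0f, 1.0f);
            float fy = std::clamp(sy - y0, 0.0f, 1.0f);
            // Blend the four neighbouring texels.
            float top = src[y0 * srcW + x0] * (1 - fx) + src[y0 * srcW + x1] * fx;
            float bot = src[y1 * srcW + x0] * (1 - fx) + src[y1 * srcW + x1] * fx;
            dst[static_cast<size_t>(y) * dstW + x] = top * (1 - fy) + bot * fy;
        }
    }
    return dst;
}
```

Every upscaled pixel is just a weighted average of four rendered ones, which is exactly why naive upscaling looks soft and why an AI pass that can infer missing detail is such a big deal.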

The problem with that is that these bells and whistles will only be available for Nvidia cards. As a developer I'm not going to waste my time having these features in my game if AMD cards can't benefit from them.

"This you have to understand. There's only one way to hurt a man who's lost everything. Give him back something broken."


True for pancake games (AMD is working on their own version; this will be HairWorks all over again). In terms of Oculus Home VR games, the latest report on GPUs from Oculus owners is worth a look.

I'm excited for new video cards, but I hope that the ray tracing stuff makes it into open cross-vendor standards. Having a game tied to one vendor, and only a small set of newest cards from that vendor, is not tenable.


Fair point... To be honest, if Nvidia supports DX12 properly it will be a positive step (the 10 series doesn't).

Whilst it is not to the same degree as Turing, I believe AMD have had some form of hardware ray-tracing support for a while, but it wasn't used for exactly the reason you state.

Plus there were others before that, going back to the Saarland University RPU that was used for all those raytraced quake 3 videos that came out 16 years ago.

Good to see more ray tracing though. I've written numerous ray tracers, and I use a ray tracer I wrote as an optimisation exercise for my students (since it's the easiest system to learn multithreading on, plus they can get into mathematical optimisation, spatial partitioning, GPGPU, network-distributed rendering, etc.).

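For the curious, the skeleton I hand out looks roughly like this: one sphere, crude depth shading, and the framebuffer carved into horizontal bands across std::thread. A minimal sketch of the idea, not the actual exercise; all names are illustrative:

```cpp
// Minimal multithreaded ray tracer: one sphere, pinhole camera, one ray per
// pixel, scanline bands distributed across hardware threads.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <thread>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Ray-sphere intersection: returns hit distance, or -1 on a miss.
// Assumes dir is normalised (so the quadratic's 'a' term is 1).
static float hitSphere(Vec3 centre, float r, Vec3 orig, Vec3 dir)
{
    Vec3 oc = sub(orig, centre);
    float b = dot(oc, dir);
    float c = dot(oc, oc) - r * r;
    float disc = b * b - c;
    return disc < 0 ? -1.0f : -b - std::sqrt(disc);
}

int main()
{
    const int W = 640, H = 480;
    std::vector<float> image(static_cast<size_t>(W) * H, 0.0f);
    const Vec3 sphereC = {0, 0, -3};

    auto renderRows = [&](int y0, int y1) {
        for (int y = y0; y < y1; ++y)
            for (int x = 0; x < W; ++x) {
                // Pinhole camera: one ray through each pixel centre.
                float u = (2.0f * x / W - 1.0f) * W / H;
                float v = 1.0f - 2.0f * y / H;
                float len = std::sqrt(u * u + v * v + 1.0f);
                Vec3 dir = {u / len, v / len, -1.0f / len};
                float t = hitSphere(sphereC, 1.0f, {0, 0, 0}, dir);
                image[static_cast<size_t>(y) * W + x] =
                    t > 0 ? 1.0f / t : 0.0f;   // crude depth shade
            }
    };

    // Each thread owns a disjoint band of rows, so no synchronisation needed.
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned i = 0; i < n; ++i)
        pool.emplace_back(renderRows, H * i / n, H * (i + 1) / n);
    for (auto& t : pool) t.join();

    std::printf("rendered %dx%d on %u threads\n", W, H, n);
    return 0;
}
```

The nice teaching property is that every pixel is independent, so parallelising it is embarrassingly easy, and every optimisation topic (spatial partitioning, GPGPU, distribution) slots into the same loop.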

One of the interesting things with pure ray tracing is that you can generate rays that match the panels and lenses of a VR headset.

Normally apps render at a certain resolution, then the Oculus SDK distorts that based on the inverse of the lens distortion and sends it to the panels. So what is rendered isn't the same as what is shown; there isn't a 1:1 texel-to-pixel relationship across the whole view. To make the centre of the view 1:1, you end up rendering too much for the sides. If you make the sides 1:1, the centre is low-res.

But generating rays based on the lens distortion directly could help a lot with that, because the distortion step is removed. Mixing that with foveated rendering shouldn't be too hard either.
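Very roughly, the idea looks like this. A minimal sketch assuming a simple radial-polynomial lens model; the coefficients k1/k2 and the function name are made up for illustration (real HMDs ship per-device distortion profiles):

```cpp
// Map a panel pixel (u, v in [-1, 1], lens centre at the origin) straight to
// a view-space ray direction through the lens. The polynomial stands in for
// the real distortion profile the SDK would otherwise invert in a warp pass;
// bending the ray at generation time gives every panel pixel exactly one ray.
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 lensMatchedRayDir(float u, float v, float k1, float k2)
{
    float r2 = u * u + v * v;
    float scale = 1.0f + k1 * r2 + k2 * r2 * r2;   // radial distortion term
    float du = u * scale, dv = v * scale;
    float len = std::sqrt(du * du + dv * dv + 1.0f);
    return { du / len, dv / len, -1.0f / len };    // camera looks down -Z
}
```

Foveated rendering then just means firing fewer rays where r2 is large, instead of one per pixel.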

Although that wouldn't work with hybrid rendering with scanline rasterising, since that still has to go through a distortion step.

There are clues in this video; everyone has been finding them on Twitter.

You're fast - I didn't read your posts before now and can see I made similar posts in another thread yesterday, sigh. Well, at least the message gets around

I can still remember all the hype for the GeForce 3 in 2001 (I bought one back then for something like $800, the Asus Deluxe version). Back then there were amazing videos showing early Doom 3 rendered in real time, and we were all amazed. But when Doom 3 finally arrived, the GeForce 3 wasn't really interesting anymore. In short, I'm not into a lot of tech talk about new features that may or may not become important for new VR games and apps; I just want to see the raw benchmarks in current games.

Show-me-the-numbers!

PS. Fun thing: you can see the amazing real-time GeForce 3-rendered Doom images here, from an article I wrote in 2001. I got the images from VoodooExtreme. Those were the days.

Thus I'm getting numb to all the amazing things Nvidia might promise new cards can do. I'll just be very happy if the RTX 2080 is at least 50% faster in 4K gaming than my GTX 1080, and if Nvidia can deliver 180-200 W power consumption.