LegendVeteranSubscriber

From an initial impression, the effect of having only 6GB of VRAM is much smaller than I thought it would be, but it still weighs heavily on the minimum framerates in Wolfenstein II at least.
This card needs a frametime comparison. I'd wait for techreport's review before final judgment.
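For anyone who wants to run their own numbers while waiting for proper reviews, the usual frametime metrics are easy to compute from any per-frame capture log (the sample values below are made up for illustration; tools like PresentMon dump milliseconds per frame):

```python
# Sketch: turn a list of per-frame times (ms) into the metrics reviewers
# quote. The sample values are hypothetical, chosen to show two spikes.
frametimes_ms = [16.7, 16.9, 17.1, 16.8, 33.4, 16.7, 17.0, 16.6, 45.2, 16.8]

def percentile(data, pct):
    """Nearest-rank percentile of a list of frametimes."""
    ordered = sorted(data)
    rank = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[rank]

avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
p99 = percentile(frametimes_ms, 99)  # the worst ~1% of frames
print(f"average: {avg_fps:.1f} fps")
print(f"99th percentile frametime: {p99:.1f} ms "
      f"({1000.0 / p99:.1f} fps equivalent)")
```

The point of the exercise: the average here looks like a healthy ~47 fps, while the 99th-percentile frametime exposes stutter the average hides completely — which is exactly why a VRAM-constrained card needs frametime plots, not FPS bars.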

Legend

How do prices of 1070 versus 2060 compare? Positioning 2060 as a 1080p raytracing GPU seems reasonable to me so far.

Click to expand...

Looks like right now its MSRP is 30 USD less than the 1070's, and in general it's much slower than the 1070 at 1080p as well. I haven't seen any benchmarks of it in RT-enabled titles though, so I'm skeptical about how well it'll do in those games.

LegendVeteranSubscriber

So RTX 2060 is faster than GTX 1080 already? Yesterday, when I checked the reviews, average performance was at the level of the GTX 1070 Ti. I'd say the RTX 2060 is faster in high-FPS situations and slower in low-FPS situations: significantly faster than the GTX 1070, but in no way better than the GTX 1080. As for the GTX 1070 Ti, its 8GB of memory and more stable performance seem to make it the better solution.

Click to expand...

The RTX 2060 seems to be equal to or slightly faster than the GTX 1080 in compute-heavy games, as long as they're not bottlenecked by fillrate or VRAM amount.
One such example is Wolfenstein II below 4K, all the more so because it's probably taking advantage of the 2x FP16 throughput of the Volta-derived ALUs.

So until we get more solid info on how 6GB of VRAM affects frametimes (such data is still scarce), I think it's safe to say the RTX 2060 stands head and shoulders above the GTX 1070 Ti.
As games tend to become more compute-centric and eventually make more use of FP16 pixel shaders, the RTX 2060 could be a safer long-term bet than the GTX 1080.
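A rough sketch of the FP32/FP16 arithmetic behind that bet (core counts and boost clocks below are approximate public specs, and the doubling assumes the game actually ships FP16 shader paths; Pascal effectively runs FP16 shader math at FP32 rate, so it gets no such boost):

```python
# Back-of-the-envelope: why double-rate FP16 shifts compute-bound
# comparisons. Core counts and clocks are approximate boost-clock specs.
def tflops(cores, clock_ghz, ops_per_core_per_clock=2):  # FMA = 2 ops
    return cores * clock_ghz * ops_per_core_per_clock / 1000.0

gtx1080_fp32 = tflops(2560, 1.733)   # ~8.9 TFLOPS FP32
rtx2060_fp32 = tflops(1920, 1.680)   # ~6.5 TFLOPS FP32
rtx2060_fp16 = 2 * rtx2060_fp32      # Turing runs FP16 at double rate

print(f"GTX 1080 FP32: {gtx1080_fp32:.1f} TFLOPS")
print(f"RTX 2060 FP32: {rtx2060_fp32:.1f} TFLOPS")
print(f"RTX 2060 FP16: {rtx2060_fp16:.1f} TFLOPS")
```

So in pure FP32 the 1080 holds a clear lead, but for any shader work a developer moves to FP16 the 2060's effective throughput leapfrogs it — which matches the Wolfenstein II results.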

LegendRegularSubscriber

@Picao84 will you please find someone else to fixate on? I'm not even slightly interested in this back&forth of yours.

The RTX 2060 seems to be equal to or slightly faster than the GTX 1080 in compute-heavy games, as long as they're not bottlenecked by fillrate or VRAM amount.
One such example is Wolfenstein II below 4K, all the more so because it's probably taking advantage of the 2x FP16 throughput of the Volta-derived ALUs.

So until we get more solid info on how 6GB of VRAM affects frametimes (such data is still scarce), I think it's safe to say the RTX 2060 stands head and shoulders above the GTX 1070 Ti.
As games tend to become more compute-centric and eventually make more use of FP16 pixel shaders, the RTX 2060 could be a safer long-term bet than the GTX 1080.

Click to expand...

VRAM would be indicative of the target resolution then?
Is 6GB enough for a proper 4K experience? Just looking at the industry at large, 6GB seems like it's cutting it thin. The X1X has 12GB, with at least 9GB dedicated to non-OS tasks. Most of the other higher-end cards have at least 8GB.

LegendVeteranSubscriber

VRAM would be indicative of the target resolution then?
Is 6GB enough for a proper 4K experience? Just looking at the industry at large, 6GB seems like it's cutting it thin.

Click to expand...

I don't know.
If you look at 4K average framerates on the RTX 2060, save for an odd duck like RE7 it looks like 6GB of VRAM is indeed enough for now. But we still need to look at frame times, because if the VRAM fills up the GPU has to wait for data from system RAM over the PCIe bus.

I suspect nvidia may be doing colossal driver work on a game-by-game basis, getting the driver to select what is placed in VRAM at 330GB/s and what gets slowly streamed over the PCIe bus at 15GB/s. Like what AMD reportedly did with Fiji, before developing HBCC to take that weight off the driver dev team.

Then again, word is that nvidia already did some of that work on the GTX 970 to avoid putting latency-sensitive data on those last slow 512MB of VRAM.
So maybe the work isn't colossal after all, and they have some automated tools that they only need to tweak for each game.
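A rough back-of-the-envelope on why spilling over PCIe shows up in frame times (the texture size is a made-up example; the bandwidth figures are the ones above, ~330GB/s GDDR6 vs ~15GB/s PCIe 3.0 x16):

```python
# Rough cost of fetching a resource from VRAM vs. over PCIe.
# Bandwidth figures: ~330 GB/s GDDR6 vs ~15 GB/s PCIe 3.0 x16.
def fetch_ms(size_mb, bandwidth_gbs):
    """Time in milliseconds to transfer size_mb at the given GB/s."""
    return size_mb / 1024.0 / bandwidth_gbs * 1000.0

texture_mb = 64  # hypothetical large texture that got evicted from VRAM
vram_ms = fetch_ms(texture_mb, 330.0)
pcie_ms = fetch_ms(texture_mb, 15.0)
print(f"from VRAM: {vram_ms:.2f} ms, over PCIe: {pcie_ms:.2f} ms")
```

A single ~4 ms stall is a quarter of a 60 fps frame budget (16.7 ms), which is exactly the kind of spike a frametime plot would expose while averages stay fine.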

The X1X has 12GB, with at least 9GB dedicated to non-OS tasks. Most of the other higher-end cards have at least 8GB.

Click to expand...

Yet not everything in those 9GB needs 320GB/s of bandwidth. Perhaps the XboneX would be much better served with a fast 4GB at 512GB/s plus 16GB at 15GB/s full duplex (if it didn't need to do BC with XBOne games, that is).

LegendRegularSubscriber

I don't know.
If you look at 4K average framerates on the RTX 2060, save for an odd duck like RE7 it looks like 6GB of VRAM is indeed enough for now. But we still need to look at frame times, because if the VRAM fills up the GPU has to wait for data from system RAM over the PCIe bus.

I suspect nvidia may be doing colossal driver work on a game-by-game basis, getting the driver to select what is placed in VRAM at 330GB/s and what gets slowly streamed over the PCIe bus at 15GB/s. Like what AMD reportedly did with Fiji, before developing HBCC to take that weight off the driver dev team.

Then again, word is that nvidia already did some of that work on the GTX 970 to avoid putting latency-sensitive data on those last slow 512MB of VRAM.
So maybe the work isn't colossal after all, and they have some automated tools that they only need to tweak for each game.

Yet not everything in those 9GB needs 320GB/s of bandwidth. Perhaps the XboneX would be much better served with a fast 4GB at 512GB/s plus 16GB at 15GB/s full duplex (if it didn't need to do BC with XBOne games, that is).

Click to expand...

I feel as though we've had this discussion somewhere on B3D before, about the amount of VRAM GPUs actually need. I suspect certain virtual texturing/streaming schemes can get away with less RAM, while other approaches need more. But how much does resolution itself actually matter here? I agree that 4K frame buffers can be big, but 6GB should cover them. So are we really talking about 4K at ultra settings, or about high-resolution textures, which is where VRAM amounts really matter?
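A quick sanity check on the frame-buffer side of that question, with assumed (hypothetical) render-target formats for a typical deferred renderer:

```python
# Sanity check: how much of 6 GB do the 4K render targets themselves eat?
# The target formats below are illustrative assumptions, not any
# particular engine's actual setup.
width, height = 3840, 2160
pixels = width * height

def target_mb(bytes_per_pixel, count=1):
    """Size in MiB of `count` full-screen targets at the given depth."""
    return pixels * bytes_per_pixel * count / (1024 ** 2)

backbuffer = target_mb(4, count=3)  # RGBA8, triple buffered
gbuffer    = target_mb(16)          # e.g. four RGBA8/RG16F attachments
depth      = target_mb(4)           # D24S8 or D32
hdr        = target_mb(8)           # RGBA16F lighting buffer

total_mb = backbuffer + gbuffer + depth + hdr
print(f"~{total_mb:.0f} MB of render targets at 4K")
```

Even with fairly generous assumptions this lands around 300 MB, well under 1 GB — so it's texture and geometry residency, not the framebuffer itself, that pressures a 6GB card at 4K ultra.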

RegularNewcomer

Yes, in Wolfenstein it's actually faster; it's a modern title optimized for modern hardware. That lead likely won't shrink with future drivers and titles. I would never opt for a 1080 over an RTX 2060 now, and especially not in a year or two.

Things might balloon in the future if devs go for more volume texture storage or as the 2020 consoles raise the target bar anyway.

Click to expand...

This. A 2060 can easily handle 4K with its 6GB; I don't think the One X is using much more than that for VRAM. It could actually be around that, 6GB.
But it's certain the next consoles are going to sport 16GB if not more, so 11GB of VRAM will be the minimum for PC games then. Almost double the 2060's. The 6GB 2060 has a limited future for 4K/highest settings, then. I don't think an 8 to 11GB 2060 is going to happen.

I would never opt for a 1080 over an RTX 2060 now, and especially not in a year or two.

Click to expand...

I would prefer even the GTX 1070 Ti. Maybe even the GTX 1070, because those don't suffer from microstuttering like the RTX 2060 does. If the issue is caused by insufficient VRAM capacity (already at launch), then it will only get worse over time.

LegendVeteranSubscriber

The 2060 would be benchmarked at a wide range of resolutions and quality settings, but time constraints mean websites limit their attention to the targets they think represent their audience.

Click to expand...

Time constraints intentionally forced upon reviewers by Nvidia, who know exactly what they're doing: the cards arrive just before CES, when the embargoes lift, so reviewers get a day to test and record content, knowing they have to get the review out and then attend CES.

Regular

Regarding memory constraints, it might be fun to see what a GTX 980 Ti can do as part of the stable of cards compared against the RTX 2060. Especially fun if they compare overclock to overclock, as that series really gets a boost from an OC.

It too has "just" 6 GB of RAM, and it ran very close to the GTX 1070 in the benchmarks I saw back when that card was popular. IIRC, there was scant mention of it bumping into any kind of memory ceiling; it was limited by the workload itself.

P.S. I think that the RTX could bump into a memory ceiling in some titles, but I think it likely that in those scenarios it will only be running at an iffy level of frames per second. So having to sacrifice being able to run at 4k wouldn't be much of a sacrifice to most people. They could dial the settings to get an average of 60 fps, and then I think they'd no longer be getting stuttering from pushing their memory buffer too hard.

That should be the case, otherwise cards with 8 GB of ram would also be hitting their buffer too hard when running at Ultra, at 4k.

It's not like the RTX 2060 has nearly the same oomph of a RTX 2080, albeit with cut down memory. My RTX 2080 struggles with some titles at 4k, and so I drop the resolution to lower (custom) ones. For the RTX 2060 the amount of ram looks to be in balance with its performance. Giving it, say, 12 GB of ram wouldn't make any difference in any titles (albeit an odd exception here or there) to gamers who want a steady fps.

If the memory were free, sure, it would be nice to have for casual gaming at 4k of titles that are drop dead gorgeous with their textures, and what not, but which can be enjoyed at significantly less than 60 fps.

And now that I think about it, FreeSync has suddenly made having only 6 GB of RAM more of an issue. Gaming at around 40 fps can now be silky smooth to many, and they'd be pissed off by micro-stuttering. Lol, I've refuted myself, but I'll post this anyway.

I stand by how it would be neat to see the benchmark results, including frame times, of a GTX 980 Ti.

Legend

Even at 1080p, Rainbow Six Siege requires more than 6GB of VRAM for the optional ultra textures pack, and that's an oldish game. Future high-res texture packs will most likely require more than 6GB. That's the main knock against the card in my mind.

Edit: side note. I wish more games were good at telling me when I'm exceeding VRAM.
