
Hoping that 7nm cards will be good. I expect AMD to release them late next year or perhaps the year after, since TSMC is already spinning up 7nm fabs; I doubt Nvidia will do anything with 7nm in that timespan, though.

In addition to the massive core count increases and price per core decreases, instructions per clock is also improving.

Also, I predicted 16 cores would land at $500. But 16 cores at $450, 12 cores at $300? Hot damn.

The two models with the "G" would be the ones with integrated graphics. Previously only some quad-core Ryzen desktop (and mobile) CPUs had integrated graphics.

The most exciting thing about this lineup is that quad-core isn't even a consideration. Ryzen mobile will probably have quad-core parts and may even be limited to 4 cores, but on the desktop front it's just gone. Big contrast from the years of being stuck at quad-core on the Intel side (and the old 4-module, 8-"core" AMD Bulldozer CPUs were effectively quad-cores).

While it seems unusual for clock speeds to rise as you move up the lineup, apparently the 6- and 8-core CPUs use a single 8-core chiplet, whereas the 12- and 16-core parts use two 8-core chiplets. The silicon area containing CPU cores is therefore physically larger, which could improve thermals. The remaining differences can be explained by TDP or by binning/price differentiation.
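The chiplet math above is just ceiling division: each chiplet holds up to 8 cores, so SKUs up to 8 cores need one and the 12- and 16-core SKUs need two. A toy sketch (the function name is my own, not anything from AMD):

```python
from math import ceil

CORES_PER_CHIPLET = 8  # each Zen 2 CPU chiplet holds up to 8 cores

def chiplets_for(core_count):
    """Minimum number of 8-core chiplets needed for a given SKU."""
    return ceil(core_count / CORES_PER_CHIPLET)

for cores in (6, 8, 12, 16):
    print(cores, "cores ->", chiplets_for(cores), "chiplet(s)")
# 6 and 8 cores -> 1 chiplet; 12 and 16 cores -> 2 chiplets
```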

The RTX 2000 cards were prohibitively expensive at launch, so this is a way of getting Turing out cheaper. And nobody really needs ray tracing cores in 2018: it will take 2+ years before games regularly feature raytracing, and later generations of ray-tracing GPUs might actually deliver a decent framerate. So now you have a card you might actually want to buy.

It may have been a good thing for AMD to not be first to add raytracing cores. It is overpriced hype for now, and they can respond within the next 2 years. But AMD will have to follow suit and pursue real-time raytracing.


Nothing about 16-core Ryzen yet, just a tease of 8-core. However, AMD has announced the Radeon VII, a GPU with (supposedly) around the same price and performance as Nvidia's RTX 2080 ($700). I'm not sure that is the best move.

I encourage you to try this benchmark on your machines. I was really surprised by the results. On a GTX 1060, which is really not fit for raytracing, I got 32 FPS on Ultra and 45 FPS on Very High at 1080p.

AMD's new 24-core and 32-core Threadrippers destroy just about everything. Not only that, but a 64-core, 128-thread version is confirmed for early next year (and a 48-core is probable).

It will be interesting to see if AAA games start to use all these cores. The next-gen PlayStation 5 and Xbox will have 8 legit Zen 2 cores. The 12-core 3900X and 16-core 3950X are "mainstream" chips, so it wouldn't be a surprise to see more games use up to 16 cores (but not as a requirement, since streamers are likely to use some of those cores in the background). And if your game has parallelized enough logic to use 12 or 16 cores, why not 24, 32, 48, or 64? Threadripper 3 fixes up the architecture so that the "Game Mode" used on the previous chips shouldn't be needed anymore. The new sTRX4 socket was necessary to support those high core counts without starving some of the chiplets of bandwidth.

Usage of 8 cores is going to be standard thanks to the new consoles. They might even have additional ARM cores for background tasks. So any newer (resource-intensive) game should be able to utilize 8 cores. I've heard that Ashes of the Singularity can already use 12 cores. 16+, who knows? I want to see games that can scale to 32, 64, or even 128 x86 cores, when appropriate.
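The scaling argument in the posts above boils down to this: if per-entity game logic (AI, physics, and so on) is written as independent tasks fed to a worker pool, the same code runs on 8 cores or 64 without modification. A toy sketch using Python's multiprocessing pool (the entity update is a trivial stand-in, not real game code):

```python
import os
from multiprocessing import Pool

def update_entity(entity_id):
    # stand-in for per-entity game logic (AI step, physics tick, etc.)
    return entity_id * 2  # trivial placeholder work

if __name__ == "__main__":
    entities = range(1000)
    # the pool sizes itself to the machine: the same code uses
    # 8 workers on a console-class CPU or 64 on a big Threadripper
    with Pool(processes=os.cpu_count()) as pool:
        results = pool.map(update_entity, entities)
    print(len(results))  # all 1000 entities updated
```

The design point is that nothing in the game loop hard-codes a core count; only truly serial logic (or cross-entity dependencies) caps how far this scales.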