TweakTown has put forth an article wherein they claim to have received info from industry insiders regarding the upcoming Vega 56's performance. Remember that Vega 56 is the slightly cut-down version of the flagship Vega 64, with 56 next-generation compute units (NGCUs) instead of Vega 64's, well, 64. This means that while the Vega 64 has the full complement of 4,096 Stream processors, 256 TMUs, 64 ROPs, and a 2048-bit wide 8 GB HBM2 memory pool offering 484 GB/s of bandwidth, Vega 56 makes do with 3,584 Stream processors, 224 TMUs, 64 ROPs, the same 8 GB of HBM2 memory, and slightly lower memory bandwidth at 410 GB/s.

The Vega 56 has been announced to retail for about $399, or $499 with one of AMD's new (famous or infamous, depending on your mileage) Radeon Packs. The RX Vega 56 card was running on a system configured with an Intel Core i7-7700K @ 4.2 GHz, 16 GB of DDR4-3000 RAM, and Windows 10, at 2560 x 1440 resolution.

If these numbers ring true, NVIDIA's GTX 1070, whose average pricing stands at around $460, will have a much-reduced value proposition compared to the RX Vega 56. The AMD contender (which did arrive a year after NVIDIA's Pascal-based cards) delivers around 20% better performance (at least in the admittedly sparse games line-up) while costing around 15% less in greenbacks. Couple that with a lower cost of entry for a FreeSync monitor, and the possibility for users to get even more value out of a particular Radeon Pack they're eyeing, and this could potentially be a killer deal. However, I'd recommend you wait for independent, confirmed benchmarks and reviews in controlled environments. I dare say you won't need to look much further than your favorite tech site on the internet for that, when the time comes.
Source:
TweakTown
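As a rough sanity check on the value claim above, here is the back-of-the-envelope arithmetic. The $399/$460 prices and the ~20% performance delta are the article's rough figures, not measured data:

```python
# Back-of-the-envelope value math for the claim above. The prices and
# the ~20% performance delta are the article's rough figures, not
# measured benchmark data.

vega56_price = 399        # USD, announced standalone price
gtx1070_price = 460       # USD, average street price cited
gtx1070_perf = 100.0      # normalize GTX 1070 performance to 100
vega56_perf = 120.0       # ~20% faster, per the leaked numbers

# How much more the 1070 costs relative to Vega 56
price_premium = gtx1070_price / vega56_price - 1

# Performance per dollar for each card
vega56_value = vega56_perf / vega56_price
gtx1070_value = gtx1070_perf / gtx1070_price

print(f"GTX 1070 price premium over Vega 56: {price_premium:.0%}")                    # 15%
print(f"Perf-per-dollar ratio (Vega 56 vs 1070): {vega56_value / gtx1070_value:.2f}") # 1.38
```

On these assumed numbers, the Vega 56 would deliver roughly 38% more performance per dollar, which is where the "killer deal" framing comes from; at the 1070's $349 MSRP the gap narrows considerably.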

Just posted this in the VEGA discussion thread a few seconds before this one :D

Sure, it's still a 210 W TDP card, but people don't realize that's the maximum. If you fire up Radeon Chill, you'll drop consumption dramatically; it usually halves it. And if you use Enhanced Sync, the framerate will be locked to the screen refresh rate, which means consumption will drop again. Maybe not as significantly as with Chill, but still worth mentioning, considering how everyone scares buyers with the max TDP numbers...

RejZoR said: Just posted this in the VEGA discussion thread a few seconds before this one :D

Sure, it's still a 210 W TDP card, but people don't realize that's the maximum. If you fire up Radeon Chill, you'll drop consumption dramatically; it usually halves it. And if you use Enhanced Sync, the framerate will be locked to the screen refresh rate, which means consumption will drop again. Maybe not as significantly as with Chill, but still worth mentioning, considering how everyone scares buyers with the max TDP numbers...

Battlefield and Doom numbers are pretty significant.

Enhanced Sync will use as much TDP as it can, always.
Chill is not a perfect solution and won't really drop consumption that much for active gaming. Plus, it still works off a whitelist of games.

The numbers sound about right, and should fit the 'trading blows with the 1070' narrative well enough. All the listed games have a tendency to run noticeably better on AMD hardware.

Raevenlord said: If these numbers ring true, this means NVIDIA's GTX 1070, whose medium pricing stands at around $460, will have a much reduced value proposition compared to the RX Vega 56.

Do you even proofread? The word you're looking for is "median".

And you're also completely ignoring that GTX 1070's MSRP is $349, i.e. $50 less than Vega 56, which is extremely fair considering the (supposed) relative performance of these cards - in fact I'd say the 1070 still wins on price/performance if these Vega 56 numbers are truthful. So calling Vega 56 a "GTX 1070 killer" is laughable.

The price of GTX 1070 cards has only been pushed up because of the cryptomining BS. Vega 56 is unlikely to offer a better hashrate-per-watt than GTX 1070, which means GTX 1070 prices will stay high and they will continue to be bought in volume by miners, whereas Vega 56 will be bought in much smaller quantities by gamers. So NVIDIA still wins in terms of the pure numbers game, and therefore in revenue.

You can't blame NVIDIA for making a great card that is in such high demand, regardless of the reason, that it commands a nearly 25% price premium over its MSRP.

londiste said: Enhanced Sync will use as much TDP as it can, always.
Chill is not a perfect solution and won't really drop consumption that much for active gaming. Plus, it still works off a whitelist of games.

The numbers sound about right, and should fit the 'trading blows with the 1070' narrative well enough. All the listed games have a tendency to run noticeably better on AMD hardware.

Actually, that's not true, if Enhanced Sync works anything like Fast V-Sync. I thought it meant unlimited framerate, but it doesn't: at 144 Hz, it stops at 144 fps. It's just dropping the frames that are out of sync and beyond the refresh rate.

Assimilator said: Do you even proofread? The word you're looking for is "median".

And you're also completely ignoring that GTX 1070's MSRP is $349, i.e. $50 less than Vega 56, which is extremely fair considering the (supposed) relative performance of these cards - in fact I'd say the 1070 still wins on price/performance if these Vega 56 numbers are truthful. So calling Vega 56 a "GTX 1070 killer" is laughable.

The price of GTX 1070 cards has only been pushed up because of the cryptomining BS. Vega 56 is unlikely to offer a better hashrate-per-watt than GTX 1070, which means GTX 1070 prices will stay high and they will continue to be bought in volume by miners, whereas Vega 56 will be bought in much smaller quantities by gamers. So NVIDIA still wins in terms of the pure numbers game, and therefore in revenue.

You can't blame NVIDIA for making a great card that is in such high demand, regardless of the reason, that it commands a nearly 25% price premium over its MSRP.

RejZoR said: Actually, that's not true, if Enhanced Sync works anything like Fast V-Sync. I thought it meant unlimited framerate, but it doesn't: at 144 Hz, it stops at 144 fps. It's just dropping the frames that are out of sync and beyond the refresh rate.

The premise should be exactly the same as Fast Sync: it shows frames at the screen's refresh rate but keeps rendering frames as fast as the card possibly can.
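If that description holds, the frame accounting works out like this toy sketch. The refresh rate and render rate are made-up illustrative numbers, not measurements of AMD's actual implementation:

```python
# Toy model of Fast Sync / Enhanced Sync frame delivery as described
# above: the GPU renders flat-out, the display scans out one frame per
# refresh, and everything in between is dropped. All numbers are
# illustrative assumptions.

refresh_hz = 144     # monitor refresh rate
render_fps = 300     # GPU renders as fast as it can (uncapped)
duration_s = 1.0     # simulate one second

rendered = round(render_fps * duration_s)    # frames the GPU produced
displayed = round(refresh_hz * duration_s)   # frames the screen showed
dropped = rendered - displayed               # out-of-sync frames discarded

print(f"rendered {rendered}, displayed {displayed}, dropped {dropped}")
# rendered 300, displayed 144, dropped 156
```

The point of contention in the thread: GPU load (and therefore power draw) tracks the rendered count, not the displayed count, so capping the visible framerate alone wouldn't reduce consumption.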

As I have a GTX 1070 (more like $526 for me than $460, when I bought it), I suspected that a Vega 56 would be a slight upgrade, though. As I can sell my 1070 close to the initial price (thanks, NVIDIA), I can probably plan on a Vega 64...

One just has to check TPU's reviews and other trusted reviews to see that these numbers for the 1070 are all over the place. Until a proper review of Vega 56 is out, these are meaningless. What we do know is that this Vega is competing with the 1070 at an MSRP of $400, so it had better be better.

RejZoR said: Actually, that's not true, if Enhanced Sync works anything like Fast V-Sync. I thought it meant unlimited framerate, but it doesn't: at 144 Hz, it stops at 144 fps. It's just dropping the frames that are out of sync and beyond the refresh rate.

The frame rate will be capped to the monitor's maximum refresh rate, but the GPU will still work as hard as it can, similar to V-Sync being turned off. At least that's how I remember it.

renz496 said: That game is clearly favoring AMD hardware more, by a large margin too, compared to the respective NVIDIA card. Just look at how competitive the RX 580 is with the reference GTX 980 Ti at all resolutions:

GreiverBlade said: Oh, that's great, and if true, just as I expected...

As I have a GTX 1070 (more like $526 for me than $460, when I bought it), I suspected that a Vega 56 would be a slight upgrade, though. As I can sell my 1070 close to the initial price (thanks, NVIDIA), I can probably plan on a Vega 64...

OK, it eats some more, but... well, that's not that important to me.

Too bad the Vega pack will probably not be available where I live :laugh:

I think that when this is released, the price of the 1070 will decrease as well.

The 1070's price has been inflated by miners flocking to the 1060 and 1070 once there were no longer any AMD Polaris GPUs on the market for them to buy. But at one point (I think it was a week or two before prices started going up), a 1070 could be had for as low as $330 at Newegg. I was very surprised to see that back then. If the mining craze had not happened, maybe we could see a lot of deals on the 1070 below the $400 mark right now.

uuuaaaaaa said: The RX 580 is competitive with the Fury X too; heck, it beats it at 1080p.

Nah, that's more because of a Fury X design flaw, I think. Just look how fast it was at 4K, even ahead of the 1070. To be honest, it is because of stuff like this that I have difficulty recommending people buy the Fury X over the RX 480 if they have no intention of playing above 1080p. It is no joke when you see a 1060 beating the Fury X at 1080p:

renz496 said: Nah, that's more because of a Fury X design flaw, I think. Just look how fast it was at 4K, even ahead of the 1070. To be honest, it is because of stuff like this that I have difficulty recommending people buy the Fury X over the RX 480 if they have no intention of playing above 1080p. It is no joke when you see a 1060 beating the Fury X at 1080p:

We were discussing COD: IW, not Anno 2205. Since the RX 580 beats both the 980 Ti and the Fury X at 1080p, wouldn't that make the 980 Ti flawed too? At 4K the Fury X easily beats the 980 Ti and the 1070 in COD: IW, though, and the RX 580 is still ahead of the 980 Ti. These kinds of discussions are almost pointless; in the end it comes down to software and driver optimization. Look at DOOM with Vulkan on AMD cards, especially the Fury X and how close it is to the GTX 1080.

uuuaaaaaa said: We were discussing COD: IW, not Anno 2205. Since the RX 580 beats both the 980 Ti and the Fury X at 1080p, wouldn't that make the 980 Ti flawed too? At 4K the Fury X easily beats the 980 Ti and the 1070 in COD: IW, though, and the RX 580 is still ahead of the 980 Ti. These kinds of discussions are almost pointless; in the end it comes down to software and driver optimization. Look at DOOM with Vulkan on AMD cards, especially the Fury X and how close it is to the GTX 1080.

I was pointing out that such behavior also exists in other games for the Fury X, since you said the RX 580 beats the Fury X in COD: IW at 1080p. The RX 580 beating the 980 Ti at all resolutions is because the game in question favors AMD's architecture more; but realistically, if you look at raw performance alone, the RX 580 should not beat the Fury X even at 1080p. If you look at the Anno bench at 4K, the Fury X's performance is closer to the 980 Ti's, but at 1080p the 980 Ti is significantly ahead, to the point that even a 1060 is capable of beating the Fury X at that resolution. That result is from the RX 580 review, and the Fury X has been on the market since 2015. Anno 2205 was released by the end of 2015, so I doubt AMD still had no proper optimization for the game in 2017. If anything, that should point to something not being right with Fiji that holds back its performance at lower resolutions.

OK, how in bloody hell does that even work? If AMD has a small market share, why would anyone bother tailoring their engines to favor AMD? Just pointing out the obvious. You know, maybe AMD is just good at it? Why can't that be a possibility? And why does that explanation only apply when NVIDIA is the one that's good at it?