Thanks for the effort, and this review was worth the wait. I'll be surprised if there aren't some driver developments that close the gap in some of the games, though AMD may not spend the time to optimize for these specifically. In most games, the difference is so minimal that, with FreeSync monitors, there would be no gameplay difference. Nice solid turnout.

This makes it a viable gaming card, but I would love to see some good productivity numbers, because that is where I think it will shine at this price point. Good productivity in a card that can game well will fit many people's needs.

Also, I was having similar black screen problems on my TV and I discovered it was due to some sort of interference.

I have since separated the cables for my TV from the rest (using twist ties) and that fixed it.

I tried two HDMI cables on the TV, both of which work fine on my NAS and Blu-ray player. The second, shorter cable (1m vs. 2m) did not show the issue in a short gaming session, but as the problem does not occur every time, it's hard to tell if it is really fixed. On the 26" monitor I am using a 3m DVI cable with an adapter on the HDMI port of the graphics card. That cable without the adapter never had issues with the AMD HD 7950 that was installed in the PC before. I also used the cable with the adapter in some gaming sessions on a laptop with an Nvidia graphics card, and that worked as well.

I'm almost certain this is card related. I should also say I have undervolted the card, so maybe that could cause it, but I also had the card running at stock once when it happened.

Was really hoping that the HBM2, and 16GB of it, would have made more of a difference... maybe it did and helped it keep up? Makes it appear that memory bandwidth isn't as important to performance as we might think.

I think it did help. That is why the card is competing with a 2080/1080 Ti, versus the Vega 64, which is only 1080-level.

Au contraire, it's thanks to HBM2 that there's a performance increase at all. But even then, bandwidth can only get you so far.

Interestingly, some games that were choppy before with Crossfire are smooth now (like Prey or Far Cry 5). This wasn't necessarily a performance issue; I was getting 60+ fps in either case, but on the VII the experience is smoother.

Maybe microstutter, or some sort of conflict with FreeSync? I don't know, but games are running much better on a single VII than they did with Crossfire.

I think I'm done with multi-GPU.

That's not surprising. mGPU has, in my experience, an inherent issue with microstutter; some games are worse, some better. I'm in fact seeing smoother performance in some games, e.g. AC Odyssey, than on my 1080 Ti. My theory is that the HBM2 is loading textures much quicker than the 1080 Ti's GDDR5X (maybe this is why the VII is showing higher minimum frames in some games?).

"I think I'm done with mGPU" is what I've said every gen since my 6800 GTs, but I keep going back for the hurt. This gen, support is absolutely not there, so I'm done 100%.

What sort of undervolt and overclock are you getting? I've settled on two stable profiles in Wattman at this point:

For the Radeon VII to arrive later, be slower, and cost the same as a 2080 is, quite frankly, disappointing.

That said, it is a viable alternative in the 4K GPU arena.

The 16GB of video memory looks like an absolute waste for gamers, though.

AMD should have shipped this with 12GB of VRAM, priced it at $599, and it would have been a much better choice for gamers.

That's not how HBM works, lol. You can't just say "oh, I'm going to pull 4GB of RAM off this" and make a completely new GPU. Your options are 4GB, 8GB, 16GB, 32GB, etc.; there's no way to do anything in between without making a completely new HBM module, which would be pointless. Just be glad the card exists; if Nvidia had priced their RTX series properly, we'd still be waiting for Navi. Either way, it's clearly working, because we're starting to see some RTX 2080s discounted below MSRP.

It amazes me how members of this site do not seem to understand how HBM works... If AMD were to use 12GB, they would need to be able to buy four stacks of 3GB HBM, and neither Samsung nor Hynix offers that option. So AMD had the choice of going with two 4GB stacks (like they did with Vega 56/64) and 512GB/sec of memory bandwidth (which is not enough to feed the CUs in the current GCN design), or the 4x4GB stacks they used (which gives them the needed 1TB/sec of bandwidth).
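The stack math above can be sketched quickly. This is a rough sanity check, assuming HBM2's 1024-bit interface per stack and a 2.0 Gbit/s per-pin rate; `hbm2_bandwidth_gbps` is just an illustrative helper, not anything official:

```python
# Rough HBM2 bandwidth sanity check.
# Assumptions: 1024-bit interface per stack, 2.0 Gbit/s per pin.
def hbm2_bandwidth_gbps(stacks, pin_gbits=2.0, bus_bits_per_stack=1024):
    """Total bandwidth in GB/s for a given number of HBM2 stacks."""
    return stacks * bus_bits_per_stack * pin_gbits / 8

print(hbm2_bandwidth_gbps(2))  # 512.0 GB/s, the Vega 56/64-class figure above
print(hbm2_bandwidth_gbps(4))  # 1024.0 GB/s, i.e. the ~1TB/sec of the VII
```

Bandwidth scales with stack count, which is why the capacity and bandwidth decisions are tied together: you can't drop a stack without also dropping a quarter of the bandwidth.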

I have done extensive testing on the effects of HBM scaling on performance with both a Vega 56 and a Vega 64 that would boost to over 1750MHz (under water), and they NEED the bandwidth. Even 583GB/sec, thanks to a 205MHz OC on my HBM, was still holding the GPU back. AMD needed the bandwidth.
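As a rough sketch of where those bandwidth numbers come from, assuming Vega 64's commonly quoted 2048-bit bus and double-data-rate signaling (note the 583GB/sec figure above works out to an HBM clock of roughly 1140MHz):

```python
# Bandwidth from HBM clock on a 2048-bit bus (Vega 64 class), DDR signaling.
def vega_bandwidth_gbs(hbm_clock_mhz, bus_bits=2048):
    # transfers/s = clock * 2 (DDR); bytes per transfer = bus_bits / 8
    return hbm_clock_mhz * 2 * (bus_bits // 8) / 1000

print(vega_bandwidth_gbs(945))   # ~483.8 GB/s at the stock 945MHz clock
print(vega_bandwidth_gbs(1140))  # ~583.7 GB/s, about the OC figure quoted above
```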

The interesting option (if it exists and could be sourced cheaply enough) would be four 2GB stacks of HBM2 (if they exist in volume). That would give the card the bandwidth and be somewhat cheaper, but 8GB is not enough VRAM, IMO. Nvidia is already getting flak for only giving the 2080 8GB of RAM, and some games are punishing the card at high resolutions.

In a hypothetical world where 3GB stacks of HBM existed, yes, that would have been an interesting option. I've been comparing my VII to my 1080 Ti for the past week, and as it is, I'm leaning toward the VII as the better option. Its average FPS is about the same, but the minimums seem higher/smoother to me on the VII. The VII has also been, for what it's worth, more fun to tweak.

As for RTX (2080) vs. more VRAM (VII), I feel games that require more than 8GB at high resolutions are an absolute eventuality (a few are already out there), whereas RTX support is still an unknown (games announced to support it in the next year number... 2-3?). I've been burned a few times buying a new card with limited VRAM, e.g. the 4870 512MB and the 780 Ti 3GB; I've never lost out waiting to see if the world supports a new feature: that normally takes years to pick up steam (and very often peters out).

All things considered, the choice between 2080 and VII (and 1080 ti, if you can find one) balances on a gnat's ass. It's all about pricing and extras.

Yet again I won't be buying an AMD card. Haven't had one since the 7970, I think. Getting 2+ year old performance at 2 year old pricing with only a touch of extra RAM doesn't do it for me.

Neither does an RTX 2080 Ti at $1,300 for a decent 3-fan solution: $600 extra compared to what I paid for my 1080 Ti, for a 28% performance increase. Guess I am sitting this round out until I see some real value.

Good review, but I don't understand why it even got a Silver. It should be more like a Bronze, but I guess that is so subjective it doesn't matter.

As of right now, this card is 750 euros in Europe and the RTX 2080 starts around 670.

Right now in Denmark, the cheapest 2080 with a 2-fan cooler costs a little less than these new AMD cards; the most expensive 2080 (still air cooled) costs an additional 570 USD, so almost 2x the cheapest 2080 and something like +70% over the new AMD card.
So yeah, with a slightly cheaper (by a few USD) 2080 you also get a better cooler (I assume), but only half the RAM.

I am looking forward to seeing what the partners can do with custom coolers, and maybe amped-up frequencies.

Honestly, unless you're into more content creation/compute workloads, the extra RAM is not of any interest whatsoever. For gaming at least, the 2080 is perfectly well equipped.

Fair enough though - I should have said 'Germany' and not 'Europe'!

I'm waiting out this generation, even though my GTX 1080 recently died... Still have plenty of games to finish on the consoles anyway, and I'll come back to PC gaming when there's a decent, fairly-priced, spiritual successor to the 1080/Ti.

I agree on the RAM. I am a little into working with video, not really enough for 16GB to be an absolute must-have, but if I end up with this card I would not mind it.
I do more video than gaming, as the games I like seem to not be made anymore, and I would like for video to pick up even more.
But in the hope of Navi I am not picking up a new GFX card before the end of July or early August, which will be a nice birthday present for myself, as I have the 53rd of those coming at the end of August.

Meanwhile I am compiling a shopping list for a new case and more cooling components; this looks to be at least as expensive as one of these GFX cards.
Think I will pick up the case soon so I can get going on design/brainstorming and fabrication, which should be fairly cheap and give me some time to work with my hands.

The card is around 11 inches length, the same size as the reference Vega 64.

Around 11 inches...what's the tolerance on "around"?

My Sapphire Vega 64 is 10.63" or 270mm.
I've seen one source say the VII is 305mm long.
Three others say 279mm.
Yet another claims 267mm, and states that as the length of a Vega 64 as well.
I did not find any info on AMD.com.
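For what it's worth, converting those reported lengths to inches is a quick check (1 in = 25.4 mm):

```python
# Convert the reported card lengths to inches (1 inch = 25.4 mm).
def mm_to_in(mm):
    return mm / 25.4

for mm in (305, 279, 270, 267):
    print(f"{mm} mm = {mm_to_in(mm):.2f} in")
```

Of the figures floating around, only 279mm (10.98 in) is actually consistent with "around 11 inches"; 305mm would be a full 12 inches.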

Hi Kyle,
thanks for the review. One question about your problems on the X399 mainboard.
I have a Radeon VII running on an X399 Taichi board, and from time to time I get signal loss. When it occurs, it repeats periodically until I restart the PC.
It seems unrelated to the video card's temperature or usage. Is that the problem you experienced?

What happens is that the screen goes black, sometimes briefly showing a grey screen with color noise.
I have recorded that behavior here:

I am on Adrenalin 19.2.2 but also had it on 19.2.1; the problem occurs on two different monitors that were connected via HDMI.

An answer would be really appreciated, because I am not sure if this is a defective card or the X399 instability issue mentioned in the driver notes.

I think I solved my problem, writing this in case someone else has these issues.

First, I think my RAM OC (2400MHz stock, 2933MHz OC on Samsung B-die, dual rank, quad channel) was affecting the PCI-E slot somehow.
I went back to stock settings, and the problems on my 26" 1080p monitor connected via DVI seem to be gone for now.
For my 4K TV (4K HDR 60Hz signal) I switched to a slightly shorter HDMI cable (1m vs. 1.5m), which seems to solve the issues I had there.

Also, these problems seemed to relate somehow to HDCP, because disabling HDCP also seemed to fix the issues.