We’ve seen the architecture. We’ve seen the teasers. We’ve seen the Frontier. And we’ve seen the specifications. Now the end game for AMD’s Radeon RX Vega release is finally upon us: the actual launch of the hardware. Today is AMD’s moment to shine, as for the first time in over a year, they are back in the high-end video card market. And whether their drip-feed marketing strategy has ultimately succeeded in building up consumer hype or burnt everyone out prematurely, I think it’s safe to say that everyone is eager to see what AMD can do with their best foot forward on the GPU front.

Launching today is the AMD Radeon RX Vega 64, or just Vega 64 for short. Based on a fully enabled Vega 10 GPU, the Vega 64 will come in two physical variants: air cooled and liquid cooled. The air cooled card is your traditional blower-based design, and depending on the specific SKU, is either available in AMD’s traditional RX-style shroud, or a brushed-aluminum shroud for the aptly named Limited Edition.

Meanwhile the Vega 64 Liquid Cooled card is larger, more powerful, and more power hungry, utilizing a Radeon R9 Fury X-style external radiator as part of a closed loop liquid cooling setup in order to maximize cooling performance, and in turn clockspeeds. You actually won’t see AMD playing this card up too much – AMD considers the air cooled Vega 64 to be their baseline – but for gamers who seek the best Vega possible, AMD has put together quite a stunner.

Also having its embargo lifted today, but not launching until August 28th, is the cut-down AMD Radeon RX Vega 56. This card features lower clockspeeds and fewer enabled CUs – 56 out of 64, appropriately enough – but it also features lower power consumption and a lower price to match. Interestingly enough, going into today’s release of the Vega 64, it’s the Vega 56 that AMD has put the bulk of their marketing muscle behind.

Between these SKUs, AMD is looking to take on NVIDIA’s longstanding gaming champions, the GeForce GTX 1080 and the GeForce GTX 1070. In both performance and pricing, AMD expects to be able to fight NVIDIA’s cards to a draw, if not pull out a victory for Team Red. This means we’ll see the $500 Vega 64 set against the GTX 1080, while the $400 Vega 56 goes up against the GTX 1070. At the same time however, the dark specter of cryptocurrency mining hangs over the gaming video card market, threatening to disrupt pricing, availability, and the best-laid plans of vendors and consumers alike. Suffice it to say, this is a launch like no other in a time like no other.

Overall, it has been an interesting year and a half, to say the least. With a finite capacity to design chips, AMD’s decision to focus on the mid-range market with the Polaris series meant that the company effectively ceded the high-end video card market to NVIDIA once the latter’s GeForce GTX 1080 and GTX 1070 launched. As a result, for the past 15 months NVIDIA has had free run of the high-end market. Meanwhile, by releasing Polaris ahead of NVIDIA’s mid-range answer, AMD initially got the jump on NVIDIA in that segment, and their market share has recovered some. However, it’s a constant fight against a dominant NVIDIA, one made all the harder by AMD being essentially invisible to the few high-end buyers and the many window shoppers alike. That is a problem that ends today with the launch of the Vega 64.

I’d like to say that today’s launch is AMD landing a decisive blow in the video card marketplace, but the truth of the matter is that while AMD PR puts on their best face, there are signs that behind the scenes things are more chaotic than anyone would care for. Vega video cards were originally supposed to be out in the first half of this year, and while AMD technically made that with the launch of the Vega Frontier Edition cards, it’s just that: a technicality. It was certainly not the launch that anyone was expecting at the start of 2017, especially since some of Vega’s new architectural functionality wasn’t even enabled at the time.

More recently, AMD’s handling of product promotion and product sampling has been erratic. We’ve only had the Vega 64 since Thursday, giving us less than 4 days to completely evaluate it. Adding to the chaos, on Thursday evening AMD informed us that we’d receive the Vega 56 on Friday, and encouraged us to focus on that card instead. The reasoning behind this is complex – I don’t think AMD knew whether it could have Vega 56 samples ready, for a start – but it ultimately boils down to AMD wanting to put their best foot forward. And right now, the company believes that the Vega 56 will fare better against the GTX 1070 than the Vega 64 will against the GTX 1080.

Regardless, it means that we’ve only had a very limited amount of time to evaluate the performance and architectural aspects of AMD’s new cards, and even less time to write about them. Never mind chasing down interesting odds & ends. So while this is a full review of the Vega 64 and Vega 56, there’s some further investigating left to do once we recover from this blitz of a weekend and get our bearings back.

So without further ado, let’s dig into AMD’s return to the high-end market with their Vega architecture, the Vega 10 GPU, and the Vega 64 & Vega 56 video cards.

Various caches and internal buffers; on-die memory is normally SRAM because it's several times faster than DRAM. (DRAM is several times denser, since it only uses 1 transistor per bit versus the 6 for SRAM, which is why it's used for main memory – where total capacity is more important, and where the data bus is the main latency source anyway.) I'd be curious what the breakdown is, since only 4MB of it is in the L2 cache.

A lot of it is going to be in the low-level L1 caches and stuff local to the shaders – there are a lot of shaders, so it adds up fast. GCN 5 does have double the L2 cache, at least according to this article: 4MB vs 2MB. AMD says there is a total of over 45MB of SRAM on there, which is pretty impressive for a GPU!
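For a rough sense of how that 45MB can add up, here's a back-of-the-envelope sketch. The per-CU figures (vector register file, LDS, and vector L1 sizes) are assumptions drawn from public GCN documentation, not an official AMD breakdown, so treat the numbers as illustrative:

```python
# Rough estimate of where Vega 10's on-die SRAM might live.
# Per-CU sizes below are assumptions based on GCN-generation
# documentation; AMD has not published an official breakdown.

KB = 1024
MB = 1024 * KB

num_cus = 64             # fully enabled Vega 10

# Assumed per-CU storage (GCN-style):
vgpr_per_cu = 256 * KB   # 4 SIMDs x 64 KB vector register file
lds_per_cu  = 64 * KB    # local data share
l1_per_cu   = 16 * KB    # vector L1 cache

cu_local = (vgpr_per_cu + lds_per_cu + l1_per_cu) * num_cus

l2_cache = 4 * MB        # per the article, doubled vs Fiji's 2 MB

accounted = cu_local + l2_cache
print(f"CU-local SRAM: {cu_local / MB:.0f} MB")
print(f"Plus L2:       {accounted / MB:.0f} MB of the ~45 MB total")
```

Under these assumptions, the register files alone dwarf the L2, and the remaining ~20MB would sit in instruction/scalar caches, ROP and geometry buffers, and other fixed-function storage.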

Another Fury X-style fail. You'd have to be a hardcore AMD fan to buy this over a GTX 1080, and that's not even taking into consideration the horrid power use compared to the 1080. Isn't performance-per-watt what AMD fans tell us is so important when comparing Ryzen to i7 CPUs? Amazingly, they are silent here.