AMD unveils next-gen Radeon VII, launches February 7 for $699

"The Radeon VII boasts 60 compute units and 3,840 stream processors running at up to 1.8GHz. It also has a generous frame buffer—16GB of high bandwidth memory (HBM2) delivering 1TB/s of memory bandwidth. It's not really a surprise that AMD is sticking with HBM2 instead of GDDR6, as that's been the case with every Vega model so far.

AMD's reference design follows Nvidia in ditching the blower format, going for a triple fan design instead. That's probably for the best, as the former blowers could get incredibly loud at higher fan speeds. Plus, AMD apparently still needs to cool a 295W chip.

AMD says its 7nm architecture delivers 25 percent faster performance than the previous model, while consuming the same amount of power."
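From the quoted specs you can work out the card's peak FP32 throughput. This is a back-of-the-envelope sketch, not a benchmark: it uses the standard convention that each stream processor retires one fused multiply-add (2 FLOPs) per clock, and the 1.8GHz boost clock from the article.

```python
# Peak FP32 throughput from the quoted specs. Each stream processor
# retires one fused multiply-add (FMA) = 2 FLOPs per clock cycle.
stream_processors = 3840
boost_clock_ghz = 1.8
flops_per_sp_per_clock = 2  # one FMA per clock

peak_tflops = stream_processors * boost_clock_ghz * flops_per_sp_per_clock / 1000
print(f"Peak FP32: {peak_tflops:.1f} TFLOPS")  # -> Peak FP32: 13.8 TFLOPS
```

Real-world gaming performance depends on far more than peak TFLOPS, which is why the thread below waits for independent benchmarks.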

We'll see when more trustworthy benchmarks arrive. 300W on a 7nm process would be disappointing in my book - that's much worse performance per watt than the 2080 Ti. But the drivers are young, and if nothing else, I hope AMD can help push Nvidia's RTX prices down.

What they really need to do is make the thing cheaper. Either that, or offer a better-performing card at the same price. Oh well, with a bit of luck it might make Nvidia drop the price of their RTX cards.

"This you have to understand. There's only one way to hurt a man who's lost everything. Give him back something broken."

Once Nvidia does 7nm, RTX will definitely be the go-to card thanks to the 2nd gen (7nm plus a gen-2 update could mean a doubling of performance). That's when I'll make the move from my 1080 Ti. Just hope the price isn't double too... LOL!!!

The next generation of RTX GPUs will be cheaper, because by then the new tech in the current 20xx GPUs will be cheaper to manufacture. We only saw a big leap in price for the 20xx GPUs because new tech is expensive.

Worth remembering is also that, performance-wise (in non-RTX games), the 2080 is similar to the 1080 Ti. So the span from March 10, 2017 to February 7, 2019 is the time AMD needed to catch up with the 1080 Ti. Unless Radeon VII can impress in certain important aspects, presenting a card with 1080 Ti performance on Feb 7 that uses even more power than a 1080 Ti may not put a lot of wind in their sails. The reference 2080 uses 215W on a 12nm fabrication process while the 1080 Ti uses 250W on 16nm - if Radeon VII really needs 295W on a 7nm process, that doesn't seem like a very power-efficient architecture (but then again, competing against Nvidia may be incredibly hard). At least it's nice to see that AMD has made a new card with high-end performance that may provide some competition.
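The efficiency gap described above can be sketched numerically. This assumes, as the post does, that all three cards land at roughly the same performance level in traditional games - which real benchmarks may or may not bear out - and uses the reference-board TDPs quoted in the post.

```python
# Back-of-the-envelope perf-per-watt comparison, ASSUMING the Radeon VII,
# RTX 2080 and GTX 1080 Ti all deliver roughly the same performance in
# traditional (non-RTX) games. TDPs are the figures quoted in the post.
cards = {
    "GTX 1080 Ti (16nm)": 250,  # watts
    "RTX 2080 (12nm)": 215,
    "Radeon VII (7nm)": 295,
}

relative_perf = 1.0  # identical assumed performance for all three
baseline = relative_perf / cards["RTX 2080 (12nm)"]  # 2080 = 100%

for name, tdp in cards.items():
    ratio = (relative_perf / tdp) / baseline
    print(f"{name}: {ratio:.0%} of the RTX 2080's perf/W")
```

Under that (generous) assumption the Radeon VII lands at roughly 73% of the 2080's efficiency despite the newer process node, which is the point the post is making.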

I am glad to see I am not the only one disappointed (but not surprised) by the new Radeon VII.

Its performance is probably going to be on a par with the RTX 2080 in traditional games - beating it in some, losing in others, and tying in most.

Price is looking to be on a par with the RTX 2080 as well.

BUT...... no RTX, no DLSS, and it still uses a fccuk ton of power and therefore needs a chunky PSU......

The only thing it has over the RTX 2080 is the 16GB of VRAM... which may be a big deal, or may not be.....

Overall, all biases aside, I honestly can't see a reason to buy this over an RTX 2080 for gaming... the ace up its sleeve may have been FreeSync, but now that Nvidia is looking to support Adaptive Sync (and therefore FreeSync), that advantage may be gone too.

I know more performance was never going to be on the cards; AMD seem to have gone as all-in on performance as they could..... and that is why the power requirements are looking to still be so high.

But for gaming I can't help but feel they would have been better going for GDDR6 and offering two versions, 8GB and 16GB. From what I have read about HBM2 and its prices, had they done that, a 16GB GDDR6 card could have been $100 cheaper, and an 8GB one maybe even $150-$200 less.

Still..... it IS a step up from Vega 64 and may help rein in RTX 2080 prices... but I am not that hopeful.

An 8GB GDDR6 Radeon VII with similar RTX 2080 performance for $499 / £450-£499, or 16GB for $549 / £500-£550, is a card I could see making Nvidia pause!

Different technology......... Nvidia have been investing a lot more money into developing their GPU tech over the last few years and are simply more advanced than AMD. AMD have less cash to invest and could not directly compete on two fronts.... so they chose to work mostly on their CPUs and take on Intel..... and it seems to have worked. The Ryzen chips are great (their APUs are damn fine too, which has won them the contracts for both the PS5 and the new Xbox).

The rumour now, however, is that they DO have a lot more cash to invest in GPU R&D, and that this new Radeon is a stopgap - much like the 590 - to keep something on the table and stay even remotely relevant, because it takes time to go from R&D to having a viable competing card.

Also, they are not a total washout. They still have advantages, just not really for gaming. And these parts are salvaged from their really high-end chips, so they had to do something with them.

We also have to remember that the high-end gaming GPU is pretty niche. AMD provide chips for both MS's and Sony's current-gen consoles, and as I said, the new Ryzen 7 APUs look fab..... so it's not like they are not competing elsewhere in gaming... just not at the very high end.

The RX 580 with a 2-game bundle for £200, or the RX 590 with a 3-game bundle for £250, is pretty good for mid-range gamers who just want to game at 1080p.

It was not always this way..... at the time, the Nvidia GTX 480, or "Fermi" as it was known, had comparatively high power usage and ran really hot. It was not till the architecture matured in its refreshed form that it really took off.

Honestly, though, they could've gone with an 8GB version to match their Vega brothers. HBM2 is costly, and I'm sure at least $75 of that price tag was for that extra 8GB. It just doesn't make sense to have 16GB of VRAM on this card for gaming alone. Granted - I'm sure they designed this card for more uses than gaming, so I understand what they did and why - but as with most products, it needs to make sense for the market that is looking to buy it.

Right now the Radeon VII doesn't make sense for gaming alone, and could've been priced a bit better if they had made two versions of the card for sale and said so in their keynote. One for $599 and one at $699 would've made a bigger splash.

At this point I think calling it a gaming card alone seems a bit silly. Instead, I think they should've said it was more of a workstation card designed with high-memory-usage tasks in mind. That way it makes more sense, and it's VERY good value versus other workstation cards such as the Titan, which comes in at $1k+.

From there they could've gone into more detailed plans to release new gaming GPUs later in the year, designed for price-to-performance without the RTX stuff in mind. Unfortunately, AMD is behind in GPU land right now. Even with the jump to 7nm, their current architecture is just NOT good enough anymore and doesn't scale well. They need to revamp their cards, and that comes with big costs as well. AMD was smart to focus on one side of the business over the other, and it paid off in this case, but it's going to hurt them big on the other side. I hope the Ryzen 3 release will push them forward with even more cash, so their next line-up of GPUs will make sense. I'm actually looking forward to building an AMD computer and doubling my performance over my current i7 3770.

RTX features aside - if you are in the market for a new GPU, I would look at the Nvidia 2060 or 2070 (or anything in the 10-series), or the 580 and 590 from AMD. That way you get the best bang for the buck.

@bigmike20vt They couldn't. GDDR6 is actually too slow for this card, so they can't go back. Plus, any memory change would take time and money to refit for their needs. Not to say it's not possible - no - I think it has to do with the fact that they need to stay relevant in the market, so changes right now would've been bad. I'd say you are correct, though, that they know HBM2 is just too costly, and as a result their Navi line-up will feature GDDR6 instead later in the year. The only downside is that the 20-series cards will also be coming down in price by then - with Nvidia following shortly after AMD to 7nm as well.

If anything, it looks like we could see close to a 30-60% performance increase from both AMD and Nvidia for their 7nm line-ups. That's really good for VR, because it would allow, for example, 4k-by-4k screens with eye tracking no problem for gen 3 of VR. In flat gaming, 144Hz will become the new standard refresh rate. 4k gaming, on the other hand, will also see an increase of 15-30%.

Sadly it will equate to about the same in GBP as in dollars - 20% VAT will eat up the £/$ exchange rate. Still an expensive and power-hungry card - I think I'll wait for the next generation (or hope Nvidia drop their base price on the 2080 Ti).

So all this really says is that everyone wanting to play PC games had better expect it to become the norm to pay as much as a fucking entire game console just to get the GPU, period. Going forward, sub-$300 cards will probably become a thing of the past. I can't afford these damn things anymore. I would certainly buy one if it fell off a truck - I don't care if they straight up told me it was stolen; if it's $250 or less, I'd buy it. These things are simply too god damn expensive now.

The next generation of RTX GPUs will be cheaper, because by then the new tech in the current 20xx GPUs will be cheaper to manufacture. We only saw a big leap in price for the 20xx GPUs because new tech is expensive.

I think Nvidia will opt out of giving a discounted price to the consumer and will opt in to increasing their profit margins. Mark my words....

So on another forum I have just been told that what I suggested above is a non-starter.

Apparently, because these are essentially failed Vega 20 chips, things like the memory are already designed around 16GB, so half of the HBM2 can't simply be taken out.

And for the same reason - and because it would take significant (expensive) retooling anyway to put GDDR6 on it - it is not viable to change the memory type either, not for a card which is going to have a very short shelf life as a stopgap.

Edit: and I just noticed @Mradr also pretty much said some of the same here too.

So all this really says is everyone wanting to play PC games had better expect it to become the norm to just pay as much as a fucking entire game console to just get the GPU period. Going forward, Sub $300 cards will probably become a thing of the past. I can't afford these damn things anymore. I will certainly buy these if they fall off a truck, I don't care if they straight up told me it's stolen, if it's $250 or less, I'd buy it. These things are simply to god damn expensive now.

I wouldn't say that. Right now, everything is just messed up in the technology world. Delays in the next shrink caused a bunch of problems, and on top of that you had the mining craze and the memory shortages. I say pricing will go back to normal for both Nvidia and AMD by the end of this year. Well - price-wise, things will have to slowly increase as innovation becomes harder to do, too. We took advantage of the fact that things kept getting smaller and faster - now we're hitting speed bumps. Either way, prices will fall back down, because memory demand should fall as most people are not upgrading phones every year anymore. Plus, it looks like the mining craze will not recover either (it will still be around, but not as big as it once was). 7nm yields will also improve over time.

I say don't worry about it right now. I'm still on a Fury X card and it has done me well for a while now. I can play most games with reasonable settings - so long as that is what you are looking for, that is enough for the most part.

As @Mradr touches on, the upside is that upgrades are really not needed so often. My mate still games on a GTX 780, and that includes VR on an Oculus Rift.

The only reason I updated my GTX 980 was because I got a new 4k TV...... 4k is a really big ask..... for 1080p and the Rift, my GTX 980 would have seen out the entire Oculus Rift generation. Now we have ray tracing as well.

Sure, PCs did not cost as much to upgrade back in the day****... but back then you could play the newest games at full detail for 6-12 months if you were lucky, and then you had to be looking for the next upgrade.

Next generation, I hope we will be looking at 4k 60fps WITH ray tracing... and if we get that, even if it costs £1000, I could live with it, because chances are I won't need to upgrade for 5 years after that. And that is properly at the sharp end.

Again, the majority of gamers are happy at 1080p... with this in mind, hopefully the next generation of £300 GPUs will be more than capable of that for many years.

My CPU......... that is an i7 5820k... it overclocks well, but it is 3.5 years old now and I fully expect it to still be fine for the next 3.5 years (I may need another 16GB of RAM, that is all).

So I would add that, as well as all the stuff Mradr says, we are also stepping up to a huge new resolution (4k) - and TBH I do not see the need to go higher than that any time soon, if ever, for gaming on a screen - and implementing the holy grail technology that many gamers and developers have been fantasising over for years: real-time ray tracing.... the development to make that tech costs proper money that has to come from somewhere...

But again, now it's here and essentially working, I do not think it will change much for a few years - it will just get faster and cheaper to produce.

**** I am not even sure how true that is. My IBM 486 DX3 75 with... 4MB I think of RAM, a 420MB HDD, 2x CD-ROM and Sound Blaster 16, with a 15in monitor, cost over £1400.... that would have been in 1994.

Taking inflation (real-world cost of living) into account, that must be £3000-£4000 in today's money - and truth be told, even back then it wasn't that good; there were far better PCs out there.
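A quick compound-inflation sketch shows the order of magnitude. The flat 3% average annual rate here is an assumption for illustration, not an official UK index figure; the true adjusted value depends on which inflation index you use.

```python
# Compound-inflation sketch: what £1400 from 1994 is "worth" in 2019
# money, assuming a flat average annual inflation rate. The 3% rate is
# an ASSUMPTION for illustration, not an official UK index value.
price_1994 = 1400.0
years = 2019 - 1994          # 25 years
avg_inflation = 0.03         # assumed average annual rate

price_today = price_1994 * (1 + avg_inflation) ** years
print(f"£{price_1994:.0f} in 1994 ≈ £{price_today:.0f} in 2019 at 3%/yr")
```

At 3% per year that works out to roughly £2900, so the £3000-£4000 figure above corresponds to assuming an average rate a little above 3%.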

Yeah, right now RT needs to show up in software. The other problem: AMD also needs to pick up RT if it's going to be worth its salt. Nvidia alone supporting it will not be enough for software devs to really add it to their games. This also means Nvidia needs to be more open with their RT software if they want both the consoles and AMD to pick it up. I know they are in it for the money - but if you want to sell a feature, you need to sell it to your competition first, so you look better at the end of the day with the higher quality of said feature.

Right now, 4k at 60FPS with RT is already a thing, though. The problem is that people have higher-refresh monitors in the 144Hz range, and even at 1080p they're unable to hit that target FPS with RT on. It's not so much a resolution problem as it is that the software way of doing RT isn't mature enough. That should come with time. The fact that we saw such a big performance leap from BF5's last update shows there is still a ton of room for improvement. Actually, someone over on reddit said that they're not even using the RT cores to their max performance. That makes me wonder if something else is going on.

I'm on a GTX 1060 6GB and I want better performance. Now I should wait another year? I already waited a couple of years, and the 1080 Ti I was going to get is still a rip-off. I was going to get a 1070 Ti, which is also a rip-off considering its age - that damn card shouldn't cost $500 or more. Now I'm considering the RTX 2070 because it's only $50 to $80 more than a 1070 Ti, but WTF happened here? Top-end cards used to be in the $300 range at one point; now their pricing is absolutely insane.

More memory, more cores, more everything. If we kept to the same level of hardware, sure, they could get the price down by a lot - but would you really want 512MB of memory in the age of 4k? To do 4k you need to increase the amount of memory, and that means more cost. Simple supply and demand. Even taking scaling into account, you'd still be in the same place: it takes a number of memory chips to hit those memory levels.

Then there is inflation going on too. That same $300 card from before is the same $400-500 card we see today.

Right now, the 20-series cards are the only ones I feel are totally outrageous - but I have a feeling it's because they are going to be short-lived, with new cards coming out by the end of the year. Unless you really need that performance, I say wait. If you do, then the 20-series is your next jump up - but that is the nature of the x060 line-up. If anything, stay in the 10-series line by getting a used 1080 Ti from eBay or something. That way you get the best bang for your buck. I've seen 1080s going for around $300 there, and 1080 Tis around $400-500.

Honestly, if you think this is bad, wait till 8k becomes a thing and all cards start at $1k+ because of all the requirements on memory and bandwidth. Granted, HBM will shine here and save the day just a bit, but prices will be well higher than they are in the age of 4k. The 4060 (making up numbers here) will be 40% higher than cards are now - meaning we could see prices start at $649.99 for that card alone, including some inflation. Then again, that'd be like a 100+% performance boost too.

When buying a GPU, try to aim for a high-end one; game developers always aim for the low-to-mid-tier cards as a priority. That means if you invest in a higher-end GPU, you won't really have to upgrade for 4 or so years each cycle, and you can always play your games at the highest settings. My 1080 is 3 years old, and I can still play even the newest titles with everything on ULTRA at 1080p at 80fps or more. I will upgrade next year.

If you skip a Domino's pizza every 2 weeks, you will save up enough dosh for a high-end card every 4 years, with change.
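The pizza maths checks out, under an assumed price per pizza (the £15 figure is an assumption; adjust to your local Domino's menu):

```python
# Sketch of the pizza-savings arithmetic. The £15 per-pizza price is an
# ASSUMPTION for illustration; adjust to taste.
pizza_price = 15.0            # assumed cost of one large Domino's pizza
pizzas_per_year = 52 / 2      # one skipped every two weeks
years = 4

saved = pizza_price * pizzas_per_year * years
print(f"Skipping one pizza a fortnight saves £{saved:.0f} over {years} years")
```

At £15 a pizza that's £1560 over 4 years - enough for most high-end cards discussed in this thread, with change.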

I'm the same way - I upgrade when things start to double or triple in performance over my last card/CPU. That way I get the best bang for my buck over that time period. For my CPU, that means we finally just doubled its performance with the 9000-series chips - but the Ryzen 3 chips are said to offer the same performance with more cores for the same price. That is well worth the wait if true.

The next generation of RTX GPUs will be cheaper, because by then the new tech in the current 20xx GPUs will be cheaper to manufacture. We only saw a big leap in price for the 20xx GPUs because new tech is expensive.

I think Nvidia will opt out of giving a discounted price to the consumer and will opt in to increasing their profit margins. Mark my words....

I don't think so - Nvidia's share price has tanked since they released the 20-series cards at such a high price, because they haven't sold as many as they expected. They'll bring the prices down for the 21-series.

Same here - if you want 45 fps to turn into 90 fps, a 100% upgrade is the only way to be sure.
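The "100% upgrade" framing is just frame-time arithmetic: frame rate scales roughly with GPU throughput, so doubling fps means halving the time budget per frame.

```python
# Why going from 45 fps to 90 fps needs a "100%" (i.e. 2x) upgrade:
# doubling the frame rate halves the per-frame time budget.
def frame_time_ms(fps):
    """Milliseconds available to render one frame at a given fps."""
    return 1000.0 / fps

before, after = 45, 90
speedup = after / before  # 2.0 -> a 100% throughput increase

print(f"{before} fps = {frame_time_ms(before):.1f} ms/frame")
print(f"{after} fps = {frame_time_ms(after):.1f} ms/frame "
      f"(needs {speedup:.0%} of the old card's throughput)")
```

This is a rough model - in practice CPU limits and frame-time variance mean a 2x-throughput card doesn't always deliver exactly 2x the fps.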

I more or less quit gaming for some years - then I upgraded from GTX 680 to 1080 - and for fun bought some fancy face-glasses delivered from Holland, and suddenly I'm in here. I wonder what the next big upgrade will bring

I stuck with my PowerColor 2GB 4870 for YEARS because it was a real beast of a card and continued to punch above its weight for so long. I also used to do a fair bit of gaming on consoles, so I didn't game on my PC much during the last few years before I built my VR machine.

I'm probably going to stick with my 1080 for a while too, I think. I'm pretty sure the CV2 will be able to run with one, and the Pimax 5K+ as well, when they eventually sort out their Brainwarp.

I'm sticking with a GeForce 1080 until the GeForce 2080 Ti and 4k televisions are a lot cheaper. I game with maxed-out settings on a Sony Bravia at 1080p and my games look great. They might look better in 4k, but what you don't see you don't miss - meaning I haven't seen what games look like in 4k or HDR.

Yup, I'm due to get myself a new telly. I normally get a new one every 5 years or so, but my spending this year is going towards a Yaw VR motion simulator, probably in November.

But that would leave me getting a new telly in 2020, and I'm planning on getting myself either a CV2 or a Pimax 5K+ that year.

I need to win the Euromillions. Stat.

I'm sticking with a GeForce 1080 until the GeForce 2080 Ti and 4k televisions are a lot cheaper. I game with maxed-out settings on a Sony Bravia at 1080p and my games look great. They might look better in 4k, but what you don't see you don't miss - meaning I haven't seen what games look like in 4k or HDR.

Ain't that the truth. I've been gaming in 4k since the 980 Ti (with reduced settings on that one, that is) and just can't do 1080p anymore. Although I'm able to do VR at its present res... because it's so much more immersive... once you see it, there's no turning back... unless you just can't drop from 144fps to 60... LOL!!!