AMD Radeon RX Vega 64 8GB Review
Conclusion
Let's get the messy stuff out of the way first. Radeon RX Vega 64 is late. It's hot. It's aimed at the competition's third-fastest product (which is 15 months old, uses a lot less power, and is quieter). And a lot of the architecture's new features are future-looking, rather than beneficial today.
AMD chose a $500 price point to match the GTX 1080, then gave Nvidia enough time to make sure cryptocurrency-inflated prices on its cards were reined in ahead of Vega 64's launch. So now you have a whole handful of third-party GTX 1080s in stock at $500 online.
Yes, AMD does surprise us with performance that typically exceeds our expectations. Based on the company's earlier hints, we were anticipating Radeon RX Vega 64 to tie GTX 1080, at best. However, AMD enjoys an advantage in Doom, The Division, and Dawn of War III. It roughly matches GeForce GTX 1080 in Ashes of the Singularity, Hitman, Metro, and Rise of the Tomb Raider. And it only really loses in Ghost Recon and The Witcher 3. The card is exceptional for 2560x1440 and respectable at Ultra HD, where you'll need to make quality compromises in certain games for smooth frame rates.
Of course, AMD had to flog its Vega 10 GPU to get there. Gaming power consumption in excess of 280W is particularly painful when GeForce GTX 1080 is 100W lower. Even the much faster GeForce GTX 1080 Ti barely passes the 210W mark, based on our measurements. Obviously this isn't an ideal situation, especially when we factor in the temperature and noise measurements. So Vega 64 includes two BIOSes with three power profiles each, allowing enthusiasts to dial in the right balance between performance and acoustics. We plan to test the various outcomes of these settings, but suspect that enthusiasts paying top dollar for high-end graphics won't readily give up frame rates in exchange for a quieter fan. After all, certain GeForce GTX 1080 partner cards already address Vega's shortcomings and have been sitting on shelves for months.
We're hopeful that miners won't snatch up what stock of Radeon RX Vega 64 is made available at launch, so gamers at least have the opportunity to choose between AMD and Nvidia. The Ethereum hash rates we measured using Claymore's GPU Miner v9.8 were higher than Radeon R9 Fury X, but not so compelling that we'd expect a rush on these $500 cards. For its part, AMD tries to stack the deck in favor of enthusiasts with its Radeon Packs. But to get the $300 in discounts, you have to also purchase an $850 Samsung CF791 monitor, a Ryzen 7 1800X, and an X370-based motherboard. It'd be great to see AMD expand the list of options, and we do recognize the company's attempt to keep miners from pricing out everyone else.
There remain plenty of questions to answer about Radeon RX Vega 64. How might we see the driver-based High-Bandwidth Cache Controller option used on the desktop moving forward? Where does the Draw Stream Binning Rasterizer affect performance most? How long will it take game developers to embrace the idea of primitive shaders, and what real-world impact might they have on geometry throughput? The same goes for Rapid Packed Math; we've already seen demos of 16-bit data types improving frame rates without affecting quality. But when will gamers realize a return on buying into this technology?
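To make the Rapid Packed Math question a little more concrete: the underlying idea is packing two 16-bit values into one 32-bit register and operating on both halves with a single instruction. The sketch below is only an illustration of that general concept, written against CUDA's generic FP16 intrinsics rather than AMD's hardware or API, and it assumes a GPU with native half-precision support; it is not Vega-specific code.

```
// Minimal sketch of packed 16-bit math: two FP16 fused multiply-adds per
// instruction on a __half2 vector. Illustrates the general "Rapid Packed
// Math" idea using CUDA intrinsics (assumption: GPU with native FP16).
// Build (assumption): nvcc -arch=sm_60 rpm_sketch.cu -o rpm_sketch
#include <cstdio>
#include <cuda_fp16.h>

__global__ void fma_packed(const __half2* a, const __half2* b, __half2* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = __hfma2(a[i], b[i], c[i]);    // a*b + c on both FP16 lanes at once
}

int main()
{
    const int n = 1024;                      // n __half2 elements = 2*n FP16 values
    __half2 *a, *b, *c;
    cudaMallocManaged(&a, n * sizeof(__half2));
    cudaMallocManaged(&b, n * sizeof(__half2));
    cudaMallocManaged(&c, n * sizeof(__half2));

    for (int i = 0; i < n; ++i) {
        a[i] = __floats2half2_rn(1.5f, 2.5f);  // pack two floats into one 32-bit register
        b[i] = __floats2half2_rn(2.0f, 2.0f);
        c[i] = __floats2half2_rn(0.5f, 0.5f);
    }

    fma_packed<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    float2 r = __half22float2(c[0]);
    printf("lane0 = %.2f, lane1 = %.2f\n", r.x, r.y);  // expect 3.50 and 5.50

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

In principle, that's how a shader or compute kernel doubles its arithmetic rate on half-precision data; whether game engines actually ship FP16 code paths that exercise it is exactly the open question above.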
You see, there's a lot of interesting stuff happening under the hood of Vega. Some of it, like the inherent pricing advantage of FreeSync-capable monitors versus G-Sync, is accessible today. A lot isn't, though. And the card's power, thermal, and acoustic weaknesses are certainly palpable in daily use. AMD couldn't leave any performance on the table if it wanted a shot at GTX 1080, and that much is apparent in its many features and settings designed to claw some of that power draw back. We are glad to now have a choice between GeForce and Radeon at the $500 price point. But we don't see the outcome as a definitive win in any one metric.
What prospective buyers of Radeon RX Vega 64 cards may be hoping for are big gains over time, similar to what we saw from Radeon R9 290X and Radeon R9 Fury X. AMD's driver teams have a knack for extracting additional performance from new hardware designs, and this generation is doubly promising for all of the untapped potential tied to Rapid Packed Math, primitive shaders, the Draw Stream Binning Rasterizer functionality, and Vega 10's HBCC. Potential isn't enough to earn our endorsement, but it's clearly part of AMD's Vega story.
Radeon RX Vega 56 is up next on the bench, and AMD seems far more confident in that card's ability to compete against GeForce GTX 1070 in a price/performance face-off. We're looking to add more power analysis, comparisons in VR, and a week-long look at RX Vega availability. Stay tuned!
MORE: Best Graphics Cards
MORE: Desktop GPU Performance Hierarchy Table
MORE: All Graphics Content
10tacle We waited a year for this? Disappointing. Reminds me of the Fury X release, which was supposed to be the 980 Ti killer at the same price point ($649 USD if memory serves me correctly). Then you factor in the overclocking ability of the GTX 1080 (Guru3D only averaged a 5% performance improvement overclocking their RX Vega 64 sample to a 1700MHz base/boost clock and a 1060MHz memory clock). This almost seems like an afterthought. Hopefully driver updates will improve performance over time. Thankfully AMD can hold their head high with Ryzen.
Sakkura For today's market I guess the Vega 64 is acceptable, sort of, since the performance and price compare decently with the GTX 1080. It's just a shame about the extreme power consumption and the fact that AMD still has no answer to the 1080 Ti.
But I would be much more interested in a Vega 56 review. That card looks like a way better option, especially with the lower power consumption.
envy14tpe Disappointing? What? I'm impressed. Sits near a 1080. Keep that in mind when considering that FreeSync monitors sell for around $200 less than G-Sync ones. So pair that with this GPU and you have awesome 1440p gaming.
SaltyVincent This was an excellent review. The Conclusion section really nailed down everything this card has to offer, and where it sits in the market.
10tacle envy14tpe said: "Disappointing? What? I'm impressed. Sits near a 1080."
The GTX 1080 has been out for 15 months now; that's why. If AMD had priced this GPU $50 lower, it would be an uncontested better value (something AMD has a historic record of doing with both GPUs and CPUs). At the same price as a comparable GPU that's a year and three months old, however, there's nothing to brag about, especially when looking at power use. But I will agree that if you include the cost of a G-Sync monitor vs. a FreeSync monitor, at face value the RX Vega 64 is the better value than the GTX 1080.
redgarl It's not a bad GPU, but I would not buy one. I have an EVGA 1080 FTW that I've come to hate (2 RMAs in 10 months), but even if I wanted to switch to Vega, it might not be a good idea. It wouldn't change anything.
However, two Vega 56 cards in CrossFire might be extremely interesting. I did that with two 290Xs two years ago, and it might still be the best combo out there.
blppt IIRC, both AMD and Nvidia are moving away from CF/SLI support, so you'd have to count on game devs supporting DX12 mGPU (not holding my breath on that one for the near future).
cknobman I game at 4K now (just bought a 1080 Ti last week), and it appears that for the time being the 1080 Ti is the way to go.
I do see promise in the potential of this new AMD architecture moving forward.
As DX12 becomes the norm and more devs take advantage of async compute, we will see more performance improvements with the new AMD architecture.
If AMD can get power consumption under control then I may move back in a year or two.
It's a shame, too, because I just built a Ryzen 7 rig and felt a little sad combining it with an Nvidia gfx card.
AgentLozen I'm glad that AMD has a video card for enthusiasts who run 144Hz monitors @ 1440p. The RX 580 and Fury X weren't well suited for that. I'm also happy to see that Vega64 can go toe to toe with the GTX 1080. Vega64 and a FreeSync monitor are a great value proposition.
That's where the positives end. I'm upset with the lack of progress since Fury X, like everyone else. There was a point where Fury X was evenly matched with Nvidia's best cards during the Maxwell generation. Nvidia then released their Pascal generation, and a whole year went by before a proper response from AMD came around. If Vega64 had launched in 2016, this would be a totally different story.
Fury X championed High Bandwidth Memory. It showed that equipping a video card with HBM could raise performance, cut power consumption, and cut physical card size. How did HBM2 manifest? Higher memory density? Is that all?
Vega64's performance improvement isn't fantastic, it gulps down gratuitous amounts of power, and it's huge compared to Fury X. It benefits from a new generation of High Bandwidth memory (HBM2) and a 14nm die shrink. How much more performance does it receive? 23% in 1440p. Those are Intel numbers!
Today's article is a celebration of how good Fury X really was. It still holds up well today with only 4GB of video memory. It even beat the GTX 1070 in several benchmarks. Why didn't AMD take the Fury X, shrink it to 14nm, apply architecture improvements from Polaris 10, and release it in 2016? That thing would be way better than Vega64.
edit: Reworded some things slightly. Added a silly quip. 23% comes from averaging the differences between Fury X and Vega64.
zippyzion Well, that was interesting. Despite its flaws, I think a Vega/Ryzen build is in my future. I haven't been inclined to give Nvidia any of my money for a few years now, since a malfunction with an FX 5900 destroyed my gaming rig... long story. I've been buying ATI/AMD cards since then and haven't felt let down by any of them.
Let us not forget how AMD approaches graphics cards and drivers. This is base performance, and barring any driver hiccups, it will only get better. On top of that, this testing covers the air-cooled version. We should see better performance from the water-cooled version, which would land it between the 1080 and the Ti.
Also, I'd really like to see what low end and midrange Vega GPUs can do. I'm interested to see what the differences are with the 56, as well as the upcoming Raven Ridge APU. If they can deliver RX 560 (or even just 550) performance on an APU, AMD will have a big time winner there.