The Vega 64
Before we talk about the Vega, let me take a short trip down memory lane. I remember back when AMD first unveiled the Fury X.
New products to compete against Nvidia? Absolutely sign me up! That’s how I remember my thought process going, anyway.
I never actually bought the Fury X, unfortunately, and instead waited a good long while until the RX 480 arrived. The results certainly pleased me: a budget card that filled my needs perfectly well, since I’m not the most demanding gamer.
And now, in 2017, AMD has finally unveiled their “top-of-the-line” Vega 64 series of chips.
See the quotation marks around that phrase? That’s because none of the Vega 64 chips directly compete with Nvidia’s actual top-of-the-line, the 1080 Ti. That’s right: the Vega 64 is still, astoundingly, only comparable to the 1080.
So AMD released a chip that’s over a year late to the party, has a 300 watt TDP, and is priced at the same level as a card that’s already been on the market for some time now. I’m not perfectly certain how this business strategy is meant to work, but given AMD’s track record, I’m not even sure they know how it’s going to work.
Joking aside, it’s about time to take a deeper look at this card, starting with gaming performance.
According to the slides AMD presented, this is the sort of performance we’re looking at.
Not half bad, but keep in mind you’re still only getting 1080-level performance for nearly twice the power consumption. Take a closer look at AMD’s benchmarks and a clearer picture emerges.
This is a range of minimum FPS values that AMD gave out, which shows the Vega 64 vastly outperforming the GTX 1080 in gaming scenarios. Minimum FPS matters because sudden frame rate drops cause a sense of image stuttering that throws people off. On the surface, AMD seems to be handily beating Nvidia on performance, but personal experience and other reputable sources clearly tell a different story.
This is a snapshot of PCGamer’s recent benchmark testing of the cards themselves. The results clearly show that the 1080 is simply the faster card overall. At best, the Vega 64 edges out the 1080 by a small margin; at worst, it trails behind like an ox with a broken leg, down around 30%. When it comes to power consumption, the Vega 64 system eats up a staggering “478W at the outlet” while the 1080 system only draws 370W.
The 1080 Ti, which blows the Vega 64 out of the water, comes in at 475 watts.
Those are whole-system numbers, CPU included; the faster a card is, the harder the CPU has to work to feed it, which means the difference in GPU-only power consumption between the 1080 Ti and the Vega 64 is actually larger than a measly 3 watts.
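To put those outlet figures in perspective, here’s a rough back-of-the-envelope sketch using the system-power numbers quoted above. Treating the Vega 64 and the GTX 1080 as performance-equals is an assumption for illustration (consistent with the “comparable to the 1080” framing), not a measured result:

```python
# Back-of-the-envelope efficiency comparison using the whole-system
# ("at the outlet") power draws quoted above. rel_perf = 1.00 for both
# cards is an illustrative assumption, not a benchmark result.
systems = {
    "Vega 64":  {"watts": 478, "rel_perf": 1.00},
    "GTX 1080": {"watts": 370, "rel_perf": 1.00},
}

for name, s in systems.items():
    # relative performance delivered per kilowatt drawn at the wall
    per_kw = s["rel_perf"] / s["watts"] * 1000
    print(f"{name}: {per_kw:.2f} relative perf per kW")

# Even treating the two cards as equals, the 1080 system delivers the
# same frames on about 108 fewer watts at the wall.
extra = systems["Vega 64"]["watts"] - systems["GTX 1080"]["watts"]
print(f"extra draw: {extra} W")  # prints "extra draw: 108 W"
```

Under that (generous) equal-performance assumption, the 1080 system still comes out roughly 30% more efficient at the wall.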
So does this mean the card falls staggeringly behind in the gaming category?
Well of course not!
If you get a Freesync monitor…
Radeon Packs – Okay I’m Sorry But What The Hell AMD?
Okay, so the Vega is a decent card overall, if disappointing in that it only barely competes with a year-old card. But luckily for all of us, AMD has given us the opportunity to afford a Freesync monitor via the Radeon Packs!
Before I begin to rag on this, I should make it clear that there is a solution to this problem. But ask yourself, why was this ever an issue at all?
The Freesync monitor you receive is Samsung’s c34f791, which has two Freesync modes: the Standard Engine and the Ultimate Engine. For some ungodly reason, the Ultimate Engine just doesn’t work right out of the box. A friend recently got this monitor and ran into a very strange issue, and when he showed me, I was thoroughly unimpressed.
The monitor flickered like it was having disco fever.
A monitor meant to show off AMD’s Freesync capabilities with their newest chips, and it flickered so badly I couldn’t tell if my friend had fed me drugs beforehand.
When I searched the issue up, I was surprised to see that the flickering had existed since 2015. The flickering was supposedly a driver issue, but it obviously wasn’t only that. If it were, then AMD either didn’t care – which is just disgusting – or legitimately couldn’t figure out a fix – which is about ten times worse – because the issue still exists.
So the monitor is obviously of poor quality; we came to that conclusion even after finally coming across a fix that worked. And if the monitor is just bad, that raises the question:
How on earth did AMD allow this to get by in the first place?
Did they not test this out beforehand? Did they test it and just ignore the problem? Or did Samsung give them a sweet enough deal that AMD looked the other way entirely?
There isn’t really a single right answer, since every possibility boils down to either “AMD was incompetent” or “AMD was knowingly negligent.”
Basically, the Freesync monitors – which are supposed to bring AMD chips to a whole new level – are broken out of the box.
How is that good marketing?
For Machine Learning
People have been throwing around the idea that some of these new cards will be able to break into the machine learning market.
This might be the only area where AMD has a chance if I’m being frank.
Now, at the moment, AMD has essentially no presence in the field due to the raw lack of software support, which they are trying to solve via ROCm, their open-source GPU compute platform on GitHub. While it certainly seems promising at a glance, remember that they still haven’t made any real headway into the market, and every day spent not vigorously staking claims is another day Nvidia has to further entrench its position.
Personally, I don’t believe the question is “can ROCm match up with cuDNN?” but “when will ROCm match up with cuDNN?”
Take too long, and it might be too late.
Disclaimer: I do not own any shares of AMD nor do I plan on opening any positions in the near future. This article is made up of my own opinions and does not represent any other party or organization. I am receiving no compensation other than what I may receive from TickHounds.