The AMD Radeon RX 590 Review, feat. XFX & PowerColor: Polaris Returns (Again)
by Nate Oh on November 15, 2018 9:00 AM EST
Power, Temperature, and Noise
As always, we'll take a look at the power, temperature, and noise of the Radeon RX 590. As a custom-only launch, this means we will be looking solely at AIB partner designs. With the RX 590, we already know what to expect from existing RX 580 boards and coolers.
As this is a new GPU, we will quickly review stock voltages and clockspeeds as well.
AMD RX Series Video Card Voltages
                 Boost      Idle
Radeon RX 590    1.1563v    0.8000v
Radeon RX 580    1.1625v    0.7625v
Radeon RX 480    1.0625v
Power Consumption
For all the gaming performance gains that the RX 590 has made, they came by way of higher clockspeeds, and those higher clockspeeds bring higher power consumption. TBPs have already climbed notably from the RX 480's 150W to the RX 580's 185W, and now to the RX 590's 225W - past even the RX Vega 56's 210W reference board power spec.
Idle power consumption doesn't show anything out of the ordinary.
The RX 590's load power consumption is a slightly different story. For the RX 580 launch, we mused that this is where AMD paid the piper. For the RX Vega launch, I commented that the piper had then taken AMD to the cleaners. For the RX 590 today, I thought there wasn't any more the piper wanted to take, but there was.
From the wall, the RX 590 pulls 30 to 45W more than the RX 580 in Battlefield 1. The difference in FurMark is even starker, with the RX 590 drawing 45 to 80W more. Naturally, the power delta grows further against the RX 480, let alone the GTX 1060 6GB FE. In Battlefield 1, that amounts to 110W or more of additional system power consumption over the GTX 1060 6GB FE for what is panning out to be around 10% faster performance. It's clear that the RX 590 is not in the same league as - or anywhere close to - the GTX 1060 in terms of power efficiency.
Temperature
With all that power, heat and temperature can easily become an issue. But as both a non-reference launch and a product refresh, the featured open air axial fan designs are tried-and-true, and already configured to dissipate similar thermal loads.
Noise
Likewise with noise, the RX 590 can benefit from zero dB functionality, where the fans turn off below certain temperatures.
Additionally, a quick glance at RX 590 power consumption at -25% and -50% power limits shows that, like the RX Vega, RX 480, and RX 580, Polaris 30 is well past the optimal point on the voltage curve at its shipping clocks.
136 Comments
El Sama - Thursday, November 15, 2018 - link
To be honest I believe that the GTX 1070/Vega 56 is not that far away in price and should be considered as the minimum investment for a gamer in 2019.
Dragonstongue - Thursday, November 15, 2018 - link
Over $600 for a single GPU V56, no thank you.. even this 590 is likely to be ~440 or so in CAD, screw that noise. Minimum for a gamer with deep pockets, maybe, but that is like the price of a good CPU and motherboard (such as a Ryzen 2700).
Cooe - Thursday, November 15, 2018 - link
Lol it's not really the rest of the world's fault the Canadian Dollar absolutely freaking sucks right now. Or AMD's for that matter.
Hrel - Thursday, November 15, 2018 - link
Man, I still have a hard 200 dollar cap on any single component. Kinda insane to imagine doubling that! I also don't give a shit about 3d, virtual anything or resolutions beyond 1080p. I mean ffs the human eye can't even tell the difference between 4k and 1080, why is ANYONE willing to pay for that?!
In any case, 150 is my budget for my next GPU. Considering how old 1080p is that should be plenty.
igavus - Friday, November 16, 2018 - link
4k and 1080p look pretty different. No offence, but if you can't tell the difference, perhaps it's time to schedule a visit with an optometrist? Nevermind 4K, the rest of the world will look a lot better also if your eyes are okay :)
Great_Scott - Friday, November 16, 2018 - link
My eyes are fine. The sole advantage of 4K is not needing to run AA. That's about it. Anyone buying a card just so they can push a solid framerate on a 4K monitor is throwing money in the trash. Doubly so if they aren't 4->1 interpolating to play at 1K on a 4K monitor they needed for work (not gaming, since you don't need to game at 4K in the first place).
StevoLincolnite - Friday, November 16, 2018 - link
There is a big difference between 1080P and 4k... But that depends entirely on how large the display is and how far you sit away from said display. Otherwise known as "Perceived Pixels Per Inch".
With that in mind... I would opt for a 1440P panel with a higher refresh rate than 4k every day of the week.
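[The "perceived pixels per inch" idea in the comment above is usually formalized as pixels per degree of visual angle, which folds display size, resolution, and viewing distance into one number. A minimal sketch of that calculation; the 28" panel size and 24" viewing distance are illustrative values, not figures from the discussion:]

```python
import math

def pixels_per_degree(horizontal_px: int, diagonal_in: float,
                      aspect_w: int, aspect_h: int, distance_in: float) -> float:
    """Approximate pixels per degree of visual angle for a flat display
    viewed head-on from the given distance (all lengths in inches)."""
    # Physical width derived from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)
    # Angle subtended by a single pixel at the center of the screen.
    px_width_in = width_in / horizontal_px
    deg_per_px = 2 * math.degrees(math.atan(px_width_in / (2 * distance_in)))
    return 1 / deg_per_px

# Same 28" panel and 24" viewing distance, 4K vs 1080p:
print(f"28in 4K    @ 24in: {pixels_per_degree(3840, 28, 16, 9, 24):.1f} px/deg")
print(f"28in 1080p @ 24in: {pixels_per_degree(1920, 28, 16, 9, 24):.1f} px/deg")
```

[At the same size and distance, the 4K panel packs twice the pixels into each degree of vision; sitting farther back raises the number for both, which is why screen size and distance matter as much as resolution.]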
wumpus - Saturday, November 17, 2018 - link
Depends on the monitor. I'd agree with you when people claim "the sweet spot of 4k monitors is 28 inches". Maybe the price is good, but my old eyes will never see it. I'm wondering if a 40" 4k TV will make more sense (the dot pitch will be lower than my 1080P, but I'd still likely notice lack of AA). Gaming (once you step up to the high end GPUs) should be more immersive, but the 2d benefits are probably bigger.
Targon - Saturday, November 17, 2018 - link
There are people who notice the differences, and those who do not. Back in the days of CRT monitors, most people would notice flicker with a 60Hz monitor, but wouldn't notice with 72Hz. I always found that 85Hz produced less eye strain. There is a huge difference between 1080p and 2160p in terms of quality, but many games are so focused on action that the developers don't bother putting in the effort to provide good quality textures in the first place. It isn't just about not needing AA as much as about a higher pixel density and quality with 4k. For non-gaming, being able to fit twice as much on the screen really helps.
PeachNCream - Friday, November 16, 2018 - link
I reached diminishing returns at 1366x768. The increase to 1080p offered an improvement in image quality mainly by reducing jagged lines, but it wasn't anything to get astonished about. Agreed that the difference between 1080p and 4K is marginal on smaller screens and certainly not worth the added demand on graphics power to push the additional pixels.