kirottu Posted June 29, 2016 http://www.guru3d.com/articles_pages/amd_radeon_r9_rx_480_8gb_review,1.html TLDR: Good card for those who haven't upgraded in a while. This post is not to be enjoyed, discussed, or referenced on company time.
teknoman2 Posted June 29, 2016 Several reviewers reported that the card, despite its 150W TDP, could draw up to 230W in some cases, while others found it drawing as little as 137W at full load with an overclock. AMD says the 8GB model should be at 130W and is looking into it, but some theorize the problem is only present in review cards, which have a switch that lets them work as either a 4GB or an 8GB model at will. That, along with a driver bug in the driver version the reviewers used, could fool the system into thinking there were two cards, so it tried to squeeze another 130W through the PCIe slot... or at least as many watts as the slot would let through. There was also a crossfire benchmark across 15 games: in some games 2x480 marginally beat the 1080, while in most it was very close. The CF efficiency ranged from 50% to 93%. The words freedom and liberty, are diminishing the true meaning of the abstract concept they try to explain. The true nature of freedom is such, that the human mind is unable to comprehend it, so we make a cage and name it freedom in order to give a tangible meaning to what we dont understand, just as our ancestors made gods like Thor or Zeus to explain thunder. -Teknoman2- What? You thought it was a quote from some well known wise guy from the past? Stupidity leads to willful ignorance - willful ignorance leads to hope - hope leads to sex - and that is how a new generation of fools is born! We are hardcore role players... When we go to bed with a girl, we roll a D20 to see if we hit the target and a D6 to see how much penetration damage we did. Modern democracy is: the sheep voting for which dog will be the shepherd's right hand.
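For reference, the "CF efficiency" figure in the post above is usually computed as how much of a second card's theoretical contribution actually shows up in frame rates. A minimal sketch of that calculation, assuming the common definition (the frame rates below are hypothetical, not from the benchmark):

```python
def cf_scaling_efficiency(fps_single, fps_crossfire):
    """Fraction of a second card's theoretical contribution that is realized.

    1.0 (100%) would mean the crossfire pair exactly doubles single-card
    fps; 0.5 (50%) means the second card adds only half a card's worth.
    """
    return (fps_crossfire - fps_single) / fps_single

# Hypothetical example: 60 fps on one card, 111 fps on two
efficiency = cf_scaling_efficiency(60, 111)  # 0.85, i.e. 85% scaling
```

By this definition the 50%-93% range quoted above means the second 480 added anywhere from half a card's performance to nearly a full card's, depending on the game.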
ManifestedISO Posted June 29, 2016 I think the article sets a new benchmark: 36 pages. Fortunately it's well-written. Better make a sandwich and get started... All Stop. On Screen.
Humanoid Posted June 30, 2016 It's not terribly exciting, yeah, which makes for a pretty simple summary: a) If you have a 290/390/970/980 series card now, then go away, there's nothing to see here. b) If you don't already have one of those cards, then essentially the cost of your upgrade to that tier has been reduced by about one-quarter to one-third. Some more observations: 1) The reference cooler sucks. As always, wait for custom models. 2) Power consumption is disappointing. It's not a problem as such, cards have drawn over PCI-E spec before with no issue, but for a company that used to publish very conservative TDPs, they've now joined nVidia in using TDP as a marketing tool to look good on paper. 3) Performance is as expected, nothing less but also nothing more. 4) Overclocking headroom is nearly non-existent, but it scales well with what little there is. This suggests a fair bit of room for improvement with the custom models. 5) It observably performs better with newer titles, so while performance today is barely over the 970/390, it really should creep up over the coming months (and years) to end up closer to the next tier, I suspect. 6) It's freakishly good at running Hitman for some reason, pretty much matching a 980Ti. If the one and only game you play is Hitman, I guess that's good news? L I E S T R O N G L I V E W R O N G
teknoman2 Posted June 30, 2016 Where the hell do they come up with these prices in this economy? The reference 480 8GB here sells for around 350 euro... that's almost $400. The same card in France sells for 265 euro. May as well order one from amazon.fr; even with a shipping fee added, it will still be cheaper.
Zoraptor Posted June 30, 2016 The 480 8GB is NZD 500 (USD 350) here. Ironically, you can now get a 390X for slightly less than that instead, when a few weeks ago it was very rare to see one for less than $700.
Humanoid Posted June 30, 2016 Cheapest here is $369 AUD. Minus 10% tax and then converting to USD, that's just under $250. Considering the RRP of $239 USD, that's actually very good, if only because the Aussie dollar is steadily dropping in value. On the other hand, I got my 290X, which is roughly equivalent, for $405 AUD a year ago, so the value proposition has mostly been wiped out by the currency movement and the clearance status of the 2xx series back then. The best time to buy a new video card recently was in late 2014, when 290s were being cleared out for pretty close to $200 USD. Essentially what the RX 480 has done is just revert us to that state in terms of price-performance, nearly two years on. Looking at it like that, you could be forgiven for being disappointed in the new card, but bear in mind that the intervening time has been one of the worst ever to buy a new card from a value perspective, so it's good to see a product get us out of that rut. Can't help but think that the RX 470 will be the pick of the Polaris litter, though; the chip seems to respond very well to downclocking slightly. P.S. There seems to be a lot of variance in the silicon lottery this time around, but fps gains seem almost directly proportional to clock speeds. Depending on the average clocks achieved by the custom boards (assuming they don't go wildly over the power limit by doing so), it's possible for the chip to jump a tier and reach Fury levels of performance.
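The "minus 10% tax, then convert" comparison above can be sketched as a couple of lines. The exchange rate below is an assumption (roughly the mid-2016 AUD/USD rate), used purely for illustration:

```python
# Assumed AUD->USD exchange rate, illustrative only; the real rate moves daily.
AUD_TO_USD = 0.74

def local_price_in_usd(price_aud, tax_rate=0.10, rate=AUD_TO_USD):
    """Strip the local sales tax component from a tax-inclusive AUD price,
    then convert to USD so it can be compared with the tax-free US RRP."""
    return price_aud / (1 + tax_rate) * rate

# $369 AUD incl. 10% GST works out to roughly $248 USD ex-tax,
# consistent with the "just under $250" figure in the post.
price = local_price_in_usd(369)
```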
Blarghagh Posted June 30, 2016 Gonna get the 4GB variant when a custom-cooler version comes out, provided they don't mark it up significantly. Fits nicely in my budget.
Humanoid Posted June 30, 2016 I can't read German, but computerbase.de supposedly did a test where they manually reduced the card's voltage. Temperatures dropped, power use dropped by 30W... and performance went UP, because the card was no longer hitting its thermal/power ceiling. Like a horse, Polaris out of the box is being whipped too hard, and performance drops as a consequence. AMD, why do you do this to yourselves?
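The effect described above, where lower voltage yields higher performance, can be illustrated with a deliberately crude toy model of a power-limited GPU: if the chip would exceed its power budget at full clock, it throttles roughly in proportion to the overage, so lowering power draw at a given clock lets the same budget sustain a higher clock. All numbers below are illustrative assumptions, not measured values:

```python
def sustained_clock(max_clock_mhz, power_limit_w, power_at_max_clock_w):
    """Toy model: a power-limited chip runs at full clock if it fits in
    the budget, otherwise throttles proportionally to the overdraw."""
    if power_at_max_clock_w <= power_limit_w:
        return max_clock_mhz
    return max_clock_mhz * power_limit_w / power_at_max_clock_w

# Stock voltage: hypothetical 165W demand against a 150W budget -> throttled.
stock = sustained_clock(1266, 150, 165)
# Undervolted: same work now costs a hypothetical 135W -> full clock held.
undervolted = sustained_clock(1266, 150, 135)
```

In this model the undervolted card holds its full boost clock while the stock card throttles below it, which is the behavior the computerbase.de test reportedly observed.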
teknoman2 Posted June 30, 2016 I was reading something interesting on reddit regarding the over-200W power consumption issue: passing more than 75W through the PCIe slot is actually illegal. So I don't think AMD deliberately specified a 150W TDP, put on a 6-pin, and said "let the other 130W pass through the PCIe slot, and if it burns the motherboard, who cares". No matter how big or small a company is, I don't think they would want to get sued and waste money in a courtroom. Hmm, considering the above post, I think the most probable reason for the power issues may actually be a bad implementation of clock boost. From what I know, AMD never had a boost feature in their cards before, so the one they made for the 480 may not be working right. Edited June 30, 2016 by teknoman2
Humanoid Posted June 30, 2016 Um, it's out of spec, but hardly "illegal". It's happened before, though to lesser degrees, with both companies. Nothing bad happened, no one went to court, no one was fined, and to my knowledge no hardware broke because of it. And that's just for out-of-the-box products; overclockers regularly run their cards such that they consume power way beyond what their connectors are technically rated for. Heavily overclocked high-end cards consume 400W while nominally rated for 300W, with no problem in practice. Also, if those >200W reports are true, then I suspect they're one-off faults with individual cards; the widespread issue is that the cards draw ~160-170W while the PCI-E slot and the 6-pin power connector are technically only rated for 75W each. Being officially rated for something has no real correlation with what the connector is actually able to deliver. I've read before that it's perfectly viable to draw 200W through a 6-pin connector; only above that might you start running into the risk of heat damage/melting. So let's assume a worst case of 170W power draw, with 75W taken from the 6-pin connector, i.e. all the excess over-spec power being drawn through the slot. That's 95W, so 20W over spec in the worst case. In reality it might just be drawing 80W from each. Sure, it's a little naughty and looks a bit unprofessional, but in the grand scheme of things it's inconsequential. The 750Ti and 950 are known to draw 5-10W over spec at times too. Edited June 30, 2016 by Humanoid
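The worst-case arithmetic in the post above (170W total, 75W through the 6-pin, the rest through the slot) can be written out as a small sketch, using only the ratings and figures already quoted in the thread:

```python
PCIE_SLOT_LIMIT_W = 75  # PCI-E spec rating for slot power delivery
SIX_PIN_LIMIT_W = 75    # conventional rating for a 6-pin power connector

def worst_case_slot_overdraw(total_draw_w, six_pin_draw_w=SIX_PIN_LIMIT_W):
    """Pessimistic split: assume the 6-pin delivers only its rated share,
    so everything left over comes through the slot. Returns how many
    watts the slot draw exceeds its 75W rating (0 if within spec)."""
    slot_draw_w = total_draw_w - six_pin_draw_w
    return max(0, slot_draw_w - PCIE_SLOT_LIMIT_W)

# The thread's worst case: 170W total -> 95W through the slot, 20W over spec.
overdraw = worst_case_slot_overdraw(170)
```

As the post notes, this is the pessimistic split; if the card instead pulls around 80W from each source, both sides are only slightly over their 75W ratings.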
teknoman2 Posted June 30, 2016 Maybe "illegal" is a strong word, but the holder of the PCIe trademark is obligated to take to court anyone who makes PCIe hardware that draws more than 75W through the slot, or they lose the trademark. So a hardware company is not allowed to make a PCIe product that draws more than 75W through the slot without committing trademark infringement.
Bartimaeus Posted July 8, 2016 For the record, and since it was mentioned here, the PCIe problem was partially fixed by a recent driver update. According to what I've read, it's still just a little out of specification, but, as one person I read put it, "[...]probably out of the danger zone now." Edited July 8, 2016 by Bartimaeus How I have existed fills me with horror. For I have failed in everything - spelling, arithmetic, riding, tennis, golf; dancing, singing, acting; wife, mistress, whore, friend. Even cooking. And I do not excuse myself with the usual escape of 'not trying'. I tried with all my heart. In my dreams, I am not crippled. In my dreams, I dance.