AMD R9 290X

Trouncing titan holy crap


#21
Humanoid

    Arch-Mage

  • Members
  • 3840 posts

Cypress/Northern Islands and Fermi were both 40nm, so the node cadence isn't really that unusual: 28nm debuted with the 7970. But the GPU manufacturers now face competition for fab capacity from the mobile SoC makers, and when the fabs are faced with the choice of selling to nV/AMD or to Apple (who are looking to shift most of their iPhone silicon away from Samsung for obvious reasons), well, you can guess who has the better influence there.

 

And yeah, while power consumption for a single-GPU card is, I believe, second only to the maligned GTX480, I'd still say it's well within reason. Indeed, with both CPU and GPU at stock clocks, I'd be okay running it off a ~450W PSU. It is, after all, only ~10-20% more than where the previous gen ended up power-wise.
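
To sanity-check that, a rough back-of-envelope sum; the board-power, CPU and rest-of-system figures below are assumed ballpark numbers, not measurements:

    # Rough PSU budget check: all figures are assumed ballpark numbers
    gpu_board_power = 290   # W, commonly quoted ballpark for a stock 290X under load
    cpu_package = 85        # W, a typical quad-core at stock clocks
    rest_of_system = 50     # W, motherboard, RAM, drives, fans
    total_dc = gpu_board_power + cpu_package + rest_of_system
    print(f"Worst-case DC load: {total_dc} W")             # ~425 W
    print(f"Headroom on a 450 W PSU: {450 - total_dc} W")  # tight, but within spec
    # Real games rarely peg CPU and GPU at maximum simultaneously,
    # so typical draw sits comfortably below this worst case.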


  • samm likes this

#22
samm

    (8) Warlock

  • Members
  • 1167 posts
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Silver Backer
  • Fig Backer

You're right. Regarding the timeline, my memory was really inaccurate. The HD 4770, the first 40nm GPU, debuted in early 2009, and Cayman, the last of the 40nm architectures, debuted at the end of 2010, so a good two years between them. Almost the same time span as between Tahiti and Hawaii.


Edited by samm, 02 November 2013 - 04:42 PM.


#23
Gorgon

    Forum Moderator

  • Moderators
  • 4763 posts

With high-end, high-power-consumption cards and CPU- or board-based graphics now commonplace, I've wondered why someone hasn't thought of making a card that could be turned on and off like a hard disk, working with a light-load and a 3D mode as well as a manual toggle. There would probably need to be a motherboard standard as well. Maybe there already is?



#24
Humanoid

    Arch-Mage

  • Members
  • 3840 posts

It exists, but it's only really been implemented for mobile platforms. Check nVidia's Optimus, which is advertised on most notebooks, and there's an AMD equivalent whose name escapes me - though it's not quite as mature. Hopefully it makes the leap soon, but then again, AMD's performance CPUs still don't have integrated graphics, and neither do Intel's -E CPUs.



#25
Spider

    Arch-Mage

  • Members
  • 2172 posts
  • Location:Sweden
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Backer
  • Fig Backer

Wasn't there a third party that tried to implement that with high-power cards and the Intel integrated GPU platform? The problem was that if you connected your monitor to the graphics card you couldn't use the feature, and if you connected it to the integrated output you could, but you got throttled by the lower-performing components of the integrated graphics.

 

So it lowered performance a fair bit, to the point of it being pointless.



#26
AwesomeOcelot

    (9) Sorcerer

  • Members
  • 1327 posts
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Silver Backer
  • Fig Backer
Virtu by Lucidlogix. It didn't get throttled; there was a driver overhead which cost on average 5 frames per second. It was pretty pointless though, because even though you could switch from discrete to integrated at will, it couldn't power down the discrete card; if it could have done that it would have been great. There were some benefits to it: I think Intel integrated GPUs do some things really well, like video processing.

#27
samm

    (8) Warlock

  • Members
  • 1167 posts
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Silver Backer
  • Fig Backer

With high-end, high-power-consumption cards and CPU- or board-based graphics now commonplace, I've wondered why someone hasn't thought of making a card that could be turned on and off like a hard disk, working with a light-load and a 3D mode as well as a manual toggle. There would probably need to be a motherboard standard as well. Maybe there already is?

In a way, it is no longer very necessary. Cards have suitable power-saving modes when nothing is demanded from them, i.e. on the desktop, lowering clock speed and voltage significantly so they dissipate no more than about 10-20W for the high-end ones. If the screen goes to sleep / is unplugged for a sufficient amount of time, newer AMD cards power down to about 3W. Switching to an integrated GPU would not save that much anymore.



#28
Gorgon

    Forum Moderator

  • Moderators
  • 4763 posts

Doesn't it say 88W idle in the first link, and 300-something under load? ATI are deliberately trying to obscure how bad it is; I'm sure they are aware it's a point of contention.

 

http://anandtech.com...-290x-review/19



#29
AwesomeOcelot

    (9) Sorcerer

  • Members
  • 1327 posts
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Silver Backer
  • Fig Backer

Doesn't it say 88W idle in the first link, and 300-something under load? ATI are deliberately trying to obscure how bad it is; I'm sure they are aware it's a point of contention.
 
http://anandtech.com...-290x-review/19


Yes, that's what I thought: high-end discrete GPUs use over double the power of e.g. Haswell's integrated GPU at idle.

#30
samm

    (8) Warlock

  • Members
  • 1167 posts
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Silver Backer
  • Fig Backer

Doesn't it say 88W idle in the first link, and 300-something under load? ATI are deliberately trying to obscure how bad it is; I'm sure they are aware it's a point of contention.

 

http://anandtech.com...-290x-review/19

That's total system power consumption - it's not just the card. What are you suggesting about ATI (AMD?) obscuring something? I don't quite get that part of your post.

 

Yes, that's what I thought: high-end discrete GPUs use over double the power of e.g. Haswell's integrated GPU at idle.
Erm yes, but in absolute numbers it's a rather small percentage of a desktop's power consumption at idle, even if it is 20W vs. 5W. In the mobile sector such differences would be huge, of course, but there's nothing there anywhere near as powerful as the discrete desktop cards.
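
To put rough numbers on "rather small percentage" (the idle figures below are assumptions, not measurements):

    # Share of idle system draw and worst-case yearly cost, assumed figures only
    discrete_idle = 20      # W, high-end discrete card at idle
    integrated_idle = 5     # W, integrated GPU at idle
    rest_of_system = 60     # W, the rest of an idle desktop
    delta = discrete_idle - integrated_idle
    share = delta / (rest_of_system + discrete_idle)
    kwh_per_year = delta * 24 * 365 / 1000   # if the box idled around the clock
    print(f"Extra idle draw: {delta} W, about {share:.0%} of the idle system")
    print(f"Worst case: {kwh_per_year:.0f} kWh/year")
    print(f"At an assumed 0.15 USD/kWh: ~{kwh_per_year * 0.15:.0f} USD/year")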

#31
Deraldin

    Arch-Mage

  • Members
  • 4033 posts
  • Location:Somewhere cold
  • Pillars of Eternity Gold Backer

Doesn't it say 88W idle in the first link, and 300-something under load? ATI are deliberately trying to obscure how bad it is; I'm sure they are aware it's a point of contention.

 

http://anandtech.com...-290x-review/19

Keep in mind that the figures in those graphs are total power draw from the wall. They are measuring total system consumption, not just that of the graphics card. That said, Anand does mention that the GPU is drawing 5-10 watts more than it probably should be with the drivers that they were using to test.
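
For anyone who wants to back a card-only number out of those wall readings, a rough sketch; the load figure, PSU efficiency and the non-GPU increase are assumptions, only the 88W idle is the figure quoted above:

    # Back out a rough card-only figure from wall measurements (assumed inputs)
    wall_load = 400         # W, whole system under a gaming load, at the wall (assumed)
    wall_idle = 88          # W, whole system at idle, the figure quoted above
    psu_efficiency = 0.88   # typical 80 Plus unit in this load range (assumed)
    dc_load = wall_load * psu_efficiency   # power actually delivered to the components
    dc_idle = wall_idle * psu_efficiency
    non_gpu_increase = 40   # W, extra CPU/board draw while gaming (assumed)
    card_estimate = (dc_load - dc_idle) - non_gpu_increase
    # Note: this is the card's increase over its own idle draw, not its absolute power.
    print(f"Rough card-only load: {card_estimate:.0f} W")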

 

EDIT: Bah, Samm beat me to it. That's what happens when I get distracted and end up reading the whole page. >_<


Edited by Deraldin, 03 November 2013 - 02:20 PM.


#32
Spider

    Arch-Mage

  • Members
  • 2172 posts
  • Location:Sweden
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Backer
  • Fig Backer

Virtu by Lucidlogix. It didn't get throttled; there was a driver overhead which cost on average 5 frames per second. It was pretty pointless though, because even though you could switch from discrete to integrated at will, it couldn't power down the discrete card; if it could have done that it would have been great. There were some benefits to it: I think Intel integrated GPUs do some things really well, like video processing.

 

I don't know if it improved after launch; I just remember reading this review when it was new (in Swedish unfortunately, but the diagrams tell most of the story):

http://www.sweclocke...idge/8#pagehead

 

Basically, when going through the integrated graphics card, graphics-heavy tasks like games or high-definition video playback just wouldn't work. And in benchmarking it scored less than a third of the points compared to the external card by itself.

 

I basically never saw it mentioned after that, so drivers could have improved a lot, I suppose.



#33
AwesomeOcelot

    (9) Sorcerer

  • Members
  • 1327 posts
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Silver Backer
  • Fig Backer
If you look at the chart, "Lucid D-läge" (D-Mode) means HDMI connected to the motherboard, and it pretty much performs close to not using Lucid at all - you said the opposite. One thing it did allow was using Intel's Quick Sync to transcode video about 3 times faster than my HD 6870.

Edited by AwesomeOcelot, 03 November 2013 - 03:53 PM.


#34
Spider

    Arch-Mage

  • Members
  • 2172 posts
  • Location:Sweden
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Backer
  • Fig Backer

No, D-Mode is connected to the graphics card, I-Mode is connected to the motherboard. But yes, D-Mode does allow the use of Quick Sync.


Edited by Spider, 04 November 2013 - 12:51 AM.


#35
Humanoid

    Arch-Mage

  • Members
  • 3840 posts

We'll find out for sure in a day or so, but the noises coming out of the tech sites are that the R9 290 samples are performing *faster* than the 290X did in its initial reviews. The explanation is simple enough: the newer drivers they've pushed out have made up more than the difference in hardware. That they've managed to get that performance up so soon after release is good for them, in that without it, the cut-priced GTX780s were occupying the same point on the price-performance curve as the 290X (as in 10% cheaper, 10% slower) while having an actual halfway decent cooler.

 

nVidia would have set their performance target for the 780Ti at just above the level of the 290X on launch. This might make it interesting.



#36
AwesomeOcelot

    (9) Sorcerer

  • Members
  • 1327 posts
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Silver Backer
  • Fig Backer

No, d is connected to the graphics card, i is connected to the motherboard. But yes, d-mode does allow the use of quicksync.


You're right. Nevertheless, the performance benefit of using the discrete GPU in I-Mode was negligible in games and video.


Edited by AwesomeOcelot, 04 November 2013 - 09:51 AM.


#37
Morgoth

    Arch-Mage

  • Members
  • 10099 posts
  • Pillars of Eternity Backer
  • Kickstarter Backer

This is why I don't like AMD cards:

 



#38
mkreku

    Arch-Mage

  • Members
  • 8561 posts
  • Location:Uppsala, Sweden

..and this is why I like them:

 

http://www.hardocp.c...eo_card_review/

 

Faster than the GTX780 but $100 cheaper. Loud, power-hungry and fast. Lovely!


  • ManifestedISO likes this

#39
samm

    (8) Warlock

  • Members
  • 1167 posts
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Silver Backer
  • Fig Backer

This is why I don't like AMD cards:

 

Awesome video :lol: Still, if anyone buys the card with the reference cooler and keeps that piece of yesterday-tech installed, it's his own fault. Just looking at the 290 (non-X) and what a difference cooler speed makes, I'd say investing in a custom cooler and slightly undervolting the card, without any overclocking, will propel it to Titan levels of performance for less than half the price.


Edited by samm, 05 November 2013 - 12:08 PM.


#40
Humanoid

    Arch-Mage

  • Members
  • 3840 posts

Tom's included a section covering performance with an Accelero Xtreme III in their review. The net gain is 13-20% depending on driver version, which is to say, if you take the cost of the cooler into account, it's still cheaper than the 780 at its new price point, and now squashes it in terms of performance.
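
Rough numbers on that price/performance point; the prices and the performance ratio below are assumed ballpark figures, not quotes from the reviews:

    # Price/performance with an aftermarket cooler factored in (assumed figures)
    r9_290_price = 399      # USD, launch price, roughly
    cooler_price = 80       # USD, Accelero Xtreme III, roughly
    gtx_780_price = 499     # USD, after the recent price cut
    perf_vs_780 = 1.15      # assumed: 290 + cooler lands roughly 15% ahead of a 780
    amd_total = r9_290_price + cooler_price
    print(f"290 + cooler: {amd_total} USD, {amd_total / perf_vs_780:.0f} USD per unit of 780 performance")
    print(f"GTX 780:      {gtx_780_price} USD, {gtx_780_price:.0f} USD per unit of 780 performance")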





