
mkreku


Virtu by Lucidlogix. It didn't get throttled; there was a driver overhead which cost on average about 5 frames per second. It was pretty pointless, though, because even though you could switch between discrete and integrated at will, it couldn't power down the discrete card. If it could have done that, it would have been great. There were some benefits to it: I think Intel integrated GPUs do some things really well, like video processing.


I've wondered, with high-end, high-power-consumption cards and CPU- or board-based graphics now commonplace, why someone hasn't thought of making a card that could be turned on and off like a hard disk: integrated graphics for light loads, the discrete card for 3D mode, plus a manual toggle. There would probably need to be a motherboard standard as well. Maybe there already is?

In a way, it is no longer very necessary. Cards have suitable power saving modes when nothing is demanded from them, i.e. on the desktop, lowering clock speed and voltage significantly, so they dissipate no more than about 10-20W for the high-end ones. If the screen goes to sleep or is unplugged for a sufficient amount of time, newer AMD cards power down to about 3W. The switch to an integrated GPU would not save that much anymore.


Doesn't it say 88W idle in that first link, and 300-something under load? ATI is deliberately trying to obscure how bad it is; I'm sure they're aware it's a point of contention.

 

http://anandtech.com/show/7457/the-radeon-r9-290x-review/19

Yes, that's what I thought: high-end discrete GPUs use over double the power of, e.g., Haswell's integrated GPU at idle.


Doesn't it say 88W idle in that first link, and 300-something under load? ATI is deliberately trying to obscure how bad it is; I'm sure they're aware it's a point of contention.

 

http://anandtech.com/show/7457/the-radeon-r9-290x-review/19

Total system power consumption - it's not just the card. What are you suggesting about ATI (AMD?) obscuring something? I don't quite get that part of your post.

 

 

 

Yes, that's what I thought: high-end discrete GPUs use over double the power of, e.g., Haswell's integrated GPU at idle.
Erm yes, but in absolute numbers it's a rather small percentage of a desktop's power consumption at idle, even if it is 20W vs. 5W. In the mobile sector such differences would be huge, of course, but there's nothing there anywhere near as powerful as the discrete desktop cards.
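To put that 20W vs. 5W gap in perspective, here's a back-of-the-envelope sketch. All the inputs (8 hours of idle per day, 0.25 per kWh in whatever your local currency is) are purely illustrative assumptions, not figures from the thread:

```python
# Back-of-the-envelope cost of an idle-power difference.
# Illustrative assumptions: the PC idles 8 hours a day, electricity
# costs 0.25 per kWh (pick your currency).
def annual_idle_cost(delta_watts, hours_per_day=8, price_per_kwh=0.25):
    """Yearly cost of an extra delta_watts of constant idle draw."""
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# 20 W discrete-card idle vs. 5 W integrated: a 15 W difference.
print(round(annual_idle_cost(20 - 5), 2))  # -> 10.95
```

So roughly ten currency units a year, which supports the point that the absolute saving is small on a desktop, while the same 15W would be a huge deal in a laptop power budget.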


Doesn't it say 88W idle in that first link, and 300-something under load? ATI is deliberately trying to obscure how bad it is; I'm sure they're aware it's a point of contention.

 

http://anandtech.com/show/7457/the-radeon-r9-290x-review/19

Keep in mind that the figures in those graphs are total power draw from the wall. They are measuring total system consumption, not just that of the graphics card. That said, Anand does mention that the GPU is drawing 5-10 watts more than it probably should be with the drivers that they were using to test.
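To illustrate what "total draw from the wall" implies for card-only numbers, here's a rough sketch. The 60W rest-of-system baseline and the 85% PSU efficiency are assumptions for illustration, not figures from the review:

```python
# Hypothetical estimate of card-only idle draw from wall measurements.
# Illustrative assumptions: ~88 W total at the wall, a ~60 W
# rest-of-system baseline, and a PSU that is ~85% efficient at low load.
def card_power_estimate(wall_watts, baseline_watts, psu_efficiency=0.85):
    """Subtract the rest of the system, accounting for PSU conversion loss."""
    dc_total = wall_watts * psu_efficiency        # DC power actually delivered
    dc_baseline = baseline_watts * psu_efficiency # DC power minus the card
    return dc_total - dc_baseline

print(round(card_power_estimate(88, 60), 1))  # -> 23.8
```

In other words, the card itself is probably idling in the low tens of watts, not anywhere near the 88W wall figure.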

 

EDIT: Bah, Samm beat me to it. That's what happens when I get distracted and end up reading the whole page. >_<

Edited by Deraldin

Virtu by Lucidlogix. It didn't get throttled; there was a driver overhead which cost on average about 5 frames per second. It was pretty pointless, though, because even though you could switch between discrete and integrated at will, it couldn't power down the discrete card. If it could have done that, it would have been great. There were some benefits to it: I think Intel integrated GPUs do some things really well, like video processing.

 

Don't know if it improved after launch; I just remember reading this review when it was new (in Swedish, unfortunately, but the diagrams tell most of the story):

http://www.sweclockers.com/recension/13939-intel-z68-for-sandy-bridge/8#pagehead

 

Basically, when going through the integrated graphics output, graphics-heavy tasks like games or high-definition video playback just wouldn't work. And in benchmarks it scored less than a third of the points compared to the discrete card by itself.

 

I basically never saw it mentioned after that, so the drivers could have improved a lot, I suppose.


If you look at the chart, "Lucid D-läge" (D-Mode) means HDMI connected to the motherboard, and it performs close to not using Lucid at all - the opposite of what you said. One thing it did allow was using Intel's Quick Sync to transcode video about 3 times faster than my HD 6870.

Edited by AwesomeOcelot

We'll find out for sure in a day or so, but the noises coming out of the tech sites are that R9 290 samples are performing *faster* than the 290X did in its initial reviews. The explanation is simple enough: the newer drivers they've pushed out have made up more than the difference in hardware. That they've managed to lift performance so soon after release is good for them; without it, the cut-priced GTX 780s were occupying the same point on the price-performance curve as the 290X (as in 10% cheaper, 10% slower) while having an actually halfway decent cooler.

 

nVidia would have set their performance target for the 780 Ti at just above the 290X's launch performance. This might make it interesting.


This is why I don't like AMD cards:

 

https://www.youtube.com/watch?v=oV5vs27wnCA

Awesome video :lol: Still, if anyone buys the card with the reference cooler and keeps that piece of yesterday-tech installed, it's their own fault. Just looking at the 290 (non-X) and the difference cooler speed makes, I'd say investing in a custom cooler and slightly undervolting the card, without any overclocking, will propel it to Titan levels of performance for less than half the price.

Edited by samm


Why "as well"? It's alright at idle ;) At load it consumes a few watts more than a Titan while being somewhat slower and vastly cheaper. It also consumes less than an R9 280X while being faster, with a bigger chip, on the same process node. Put a decent cooler on it and load consumption will drop by about 30W (http://ht4u.net/reviews/2013/r9_290_im_griff_prolimatech_mk-26_black_im_test/index7.php)


So yeah, it was a good thing that nVidia followed AMD's pricing down. But now they seem to be following AMD's noise levels up - at least if early numbers are to be believed.

 

Still a little longer until the NDA officially lifts, but some publications (*cough* videocardz.com) not bound by the NDA have released their numbers. The 780 Ti performs as expected (winning at lower resolutions), but its fan runs 5 dB louder than on the Titan and non-Ti 780 coolers. Or rather, it's the same cooler, but forced to run faster to keep the higher-clocked Ti in check. Megahertz wars? Nah, it's the decibel wars!

 

Also, $699, probably. Blah, needed to be $599. EDIT2: $720 at Amazon

 

[Attached image: GTX 780 Ti noise chart]

 

EDIT: Power consumption figures too

Edited by Humanoid


  • 2 weeks later...

An early report that someone flashed a 290 into a full 290X. I didn't read the link, because I'm starting to lose faith in compiler monkeys. Maybe it's just me, but there seems to be a rise in misinterpretation and a general confusion of news, opinion, and speculation, all treated as equal. It would be cool, though, to save some cash and watts over the full-priced GPU and get the same performance.


Multiple reports by now, apparently mostly Powercolor cards, but even there it's hit or miss. I'd say unlocking a 290 to a 290X is possible, with about the same odds as becoming a victim of the curse of the black screen (current speculation revolves around Afterburner or fan-speed sensor faults, even on fanless, i.e. watercooled, cards).


  • 1 month later...

Not an X yet, but now they're coming:

 

http://www.anandtech.com/show/7601/sapphire-radeon-r9-290-review-our-first-custom-cooled-290

 

41 dB, down from 57! Better performance than the GTX 780 (and sometimes even the 290X). Cheaper than both.


Cool, a quiet 290. I think everyone knew it was coming, except the guy who wrote that original article, with green ink in his typewriter. "I just can't recommend this card ... it's super powerful, unbelievably cheap ... but it's too loud, just too loud ... I ... it's just ... my Nvidia overlords tell me what to write ... reference blowers ... too loud ... no such thing as partner custom cooling ... don't buy Radeon." 



Any news on Mantle, by the way?

