mkreku


Anandtech has reviewed an early R9 290X:

 

http://anandtech.com/show/7457/the-radeon-r9-290x-review/

 

It basically beats or matches the Nvidia Titan in every test available, while starting at around $549.

 

Now.. someone give me a few reasons why I don't need this card right now, because my brain stopped functioning at the sight of those lovely graphs.


Swedes, go to: Spel2, for the latest game reviews in swedish!


A couple of reasons I can think of, which may not be make-or-break:

 

1) The AMD stock cooler is an inadequate lump of plastic. Apparently all the initial launch cards are reference-only, so they're all identical - which is to say, identically hot. If you're going to remove it and go water then it's not an issue at all, of course, but if air-cooling, it's probably a good idea to wait for either custom cards or compatible third-party air coolers to be released (waterblocks are available).

 

2) The R9 290. If it's a repeat of the 7970 vs 7950 scenario, you could save another $100 and still overclock to within 5% of the performance. Reviews will be released when the NDA lifts on the 31st.

Edited by Humanoid

L I E S T R O N G
L I V E W R O N G


Well, if you consider the fact it uses a WHOLE 30W more power than its competitors, you could argue that the cumulatively accelerated global warming would have those effects, yes. Two CFL light bulbs' worth!
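
If you want to put a dollar figure on those light bulbs, here's a quick back-of-envelope in Python - the gaming hours and electricity price are numbers I've pulled out of thin air, not anything measured:

# Back-of-envelope cost of the extra 30W - hours and price per kWh are assumed.
EXTRA_WATTS = 30
GAMING_HOURS_PER_DAY = 3        # assumption
PRICE_PER_KWH_USD = 0.12        # assumption

extra_kwh_per_year = EXTRA_WATTS * GAMING_HOURS_PER_DAY * 365 / 1000
extra_cost_per_year = extra_kwh_per_year * PRICE_PER_KWH_USD
print(f"{extra_kwh_per_year:.0f} kWh/year, roughly ${extra_cost_per_year:.2f}/year")
# -> 33 kWh/year, roughly $3.94/year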

 

Nvidia are launching the 780Ti in about a month too for what it's worth. They've already announced the price as being $650, so in that regard it's probably not going to be competitive unless it also trounces Titan (which surely would mean Titan being discontinued), or if they change the pre-announced price as a reaction. There's also a reasonable possibility that regular 780 drops to ~$500, but again, probably still behind value-wise. The common assumption is that 780Ti will be Titan with half the VRAM (3GB instead of 6GB) and with the extra compute stuff stripped out which would enable higher clocks.

 

Releasing second is always a win-some-lose-some proposition: what you lose in initiative you gain by being able to react to the competition's benchmark numbers. In this case I assume nV will tweak the clocks *just* enough to make sure it beats the 290X by a couple percent. Bearing in mind that factory-overclocked regular 780s already beat Titan today, it's a reasonable guess.

Edited by Humanoid

L I E S T R O N G
L I V E W R O N G


I'm actually going to wait a month or so. I was in the store with my wallet in hand, but then I started talking to this cute girl (she works there) and she reminded me that I would have to live with the stock cooler because they're not getting the vendor's special cards until a month or so from now.

 

Hmm.. Maybe a Gigabyte Windforce.

Swedes, go to: Spel2, for the latest game reviews in swedish!


I'm actually going to wait a month or so. I was in the store with my wallet in hand, but then I started talking to this cute girl (she works there) and she reminded me that I would have to live with the stock cooler because they're not getting the vendor's special cards until a month or so from now.

 

Hmm.. Maybe a Gigabyte Windforce.

Bingo. Stock cooling is why you don't want one of these cards just yet.

 

[image: 3nN9QYY.png]

 

Even in "uber mode" the card was still throttling its performance. At 100% fan speed it's >60dB, so it's pretty damn loud. Wait for the non-reference coolers at least.

 

The other reason you might not want one is that it won't be compatible with nVidia's G-Sync variable refresh rate tech, if that's of interest to you.


A cute girl who works in the PC department and knows the cooling release schedule of bleeding edge components ... if she rides a motorcycle, too, I'm moving to Sweden tonight. 

 

If I had ball loads of cash, the 290X would be the first part I'd like to try water-cooling with. If that's even possible. Imagine silent Uber mode. 

All Stop. On Screen.


OCUK tested the 290X with the god-king of air coolers installed. Temperatures were down by well over 30C, at essentially inaudible volumes. It's proof that the problems with the card aren't inherent to the chip itself, certainly, but also shows how much they cheaped out on the cooling.

 

It's ridiculous that with a 'reasonable' fan speed setting, the card with a stock cooler throttles down to ~850MHz. It's ridiculous that at those clocks, it beats Titan anyway. It's ridiculous that *overclocked* to 1100MHz, the custom cooler can keep the card stable at 62C. So oddly enough, in damning AMD for their silly decision-making here, we end up being even more amazed at the potential performance of this beast, which, lest we forget, together with its little brother and the 7950 renders the entire nVidia lineup pointless.

 

Another point though: factory coolers still aren't anywhere near as good as the best self-installed ones. I bought an MSI Twin Frozr 3 version of the 7950, but I needn't have bothered - the noise was unacceptable regardless and I was forced to install Arctic Cooling's cooler (which is fantastic). The card only cost me $10-20 more than the stock model though, so I'm not that annoyed. (And besides, it meant getting a 7970 circuit board instead of a 7950 one: not particularly useful, but better is better.) But I digress; I just mean to say that unless you're familiar with and satisfied by the noise level of previous factory custom cards, it may be an idea to just buy now and install your own solution down the line, which will be better but probably $30-60 more expensive.

L I E S T R O N G
L I V E W R O N G


A cute girl who works in the PC department and knows the cooling release schedule of bleeding edge components ... if she rides a motorcycle, too, I'm moving to Sweden tonight. 

 

If I had ball loads of cash, the 290X would be the first part I'd like to try water-cooling with. If that's even possible. Imagine silent Uber mode. 

 

Yep, a water block was available immediately. It adds ~$100 to the total package price, bringing it up to GTX780 prices, but at the clocks you can achieve under water... well, I'll borrow an AMD marketer's silly hyperbole*, but it will "ridicule" the Titan.

 

 

* The original story: he was asked about relative performance back before the NDA lifted, and the (loosely paraphrased) response was that Titan was the performance target, that it'd match or better it under normal use, and that with Mantle-compatible games it would RIDICULE the Titan. Predictably, it's become somewhat memetastic in the tech community.


L I E S T R O N G
L I V E W R O N G


Where does this leave the regular 290? If the 290X is $549 and the 280X is $299, that's a wide gap. The 290 will still be a Hawaii chip, right? So $449 seems likely; since the 7970 also debuted at $549, I'm guessing the 290 will be like the 7950 - in launch price, at least, if not performance. I get the impression the 290X is only for 4K resolutions, while the 290 is for 27" monitors or 1200 and 1080. Are the 770 and the 290 supposed to be on par, I wonder?

All Stop. On Screen.


Where does this leave the regular 290? If the 290X is $549 and the 280X is $299, that's a wide gap. The 290 will still be a Hawaii chip, right? So $449 seems likely; since the 7970 also debuted at $549, I'm guessing the 290 will be like the 7950 - in launch price, at least, if not performance. I get the impression the 290X is only for 4K resolutions, while the 290 is for 27" monitors or 1200 and 1080. Are the 770 and the 290 supposed to be on par, I wonder?

 

According to the leaks, the vanilla 290 is about on par with, if not slightly faster than, the reference 780.

 

http://www.techpowerup.com/193205/radeon-r9-290-performance-figures-leaked-beats-gtx-780.html

http://videocardz.com/47294/amd-radeon-r9-290-non-x-gaming-performance-leaks

 

Rumoured price is between $399 and $449.


I would both hope and expect $450 for the 290, which indeed is Hawaii with a few units disabled. And there's very little doubt that it'll be the best buy of any performance card until the 20nm generation hits in a year's time.

 

The 770 is completely irrelevant - not just after these new launches, but it always has been: it asks $100 more than the 280X/7970GE, which beats it. And $100 means more when we're talking ~$300 cards than when comparing ~$600 cards. Actually I'll go a step further and say that with the 7950 being cleared out for less than $200, everything up to the $400 mark is currently irrelevant.

 

The 290's competition is the standard 780 (non-Ti). The 780 will either take a price cut from its current $650 SRP (since the 780Ti has been announced at that price), or it will be discontinued, so who knows - but it's hard to see nV dropping it to $450 all the same.

 

That said, I don't think the cards are "meant" to be used for any particular resolution. The superiority at higher resolutions is a quirk of AMD's design (probably the 512-bit bus width primarily), but that means the 290 will have a proportionally identical gain when moving up to 4K. And while they do passably at that resolution, I don't think any graphics card in the world can be said to perform well enough for 4K gaming. Check out Anandtech's numbers in the OP, for example: it takes *two* 290X cards to hit 60.0fps in Crysis 3, at medium quality and FXAA (cheap AA). Two Titans, for what it's worth, manage only 37fps. 4K is just flat out not game-ready.
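
To put that value gap in numbers - the fps figures are Anandtech's from the OP, Titan's $999 list price is from memory, so treat that part as my assumption:

# Crysis 3 at 3840x2160, medium quality + FXAA (numbers quoted above).
setups = {
    "2x R9 290X ($549 each)":   {"fps": 60.0, "price_usd": 2 * 549},
    "2x GTX Titan ($999 each)": {"fps": 37.0, "price_usd": 2 * 999},
}
for name, s in setups.items():
    per_grand = 1000 * s["fps"] / s["price_usd"]
    print(f"{name}: {s['fps']} fps for ${s['price_usd']} -> {per_grand:.1f} fps per $1000")
# -> 54.6 fps per $1000 for the 290X pair vs 18.5 for the Titans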


L I E S T R O N G
L I V E W R O N G


Awesome card, price-, performance- and feature-wise. It even removes the woes of Crossfire :) The cooler is a shame though, and the new PowerTune will take some testing and getting used to on the user side - meaning that setting a temperature target too low may cause the GPU to throttle if you don't allow the cooler to spin up high enough, etc.
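
To illustrate what I mean, a purely hypothetical sketch - this is not AMD's actual PowerTune logic, just the basic feedback idea of a temperature target colliding with a fan cap:

def powertune_step(temp_c, target_c, fan_pct, fan_cap_pct, clock_mhz,
                   min_clock_mhz, max_clock_mhz):
    # Too hot: spin the fan up first; if the fan is already at its allowed
    # maximum, the only lever left is dropping the core clock.
    if temp_c > target_c:
        if fan_pct < fan_cap_pct:
            fan_pct = min(fan_pct + 5, fan_cap_pct)
        else:
            clock_mhz = max(clock_mhz - 25, min_clock_mhz)
    # Cool enough again: ease the fan off and claw the clocks back.
    else:
        fan_pct = max(fan_pct - 5, 20)
        clock_mhz = min(clock_mhz + 25, max_clock_mhz)
    return fan_pct, clock_mhz

# Hypothetical numbers: 95C card, 94C target, fan capped at a quiet 40% -
# the fan can't go higher, so the clock sheds MHz instead.
print(powertune_step(95, 94, 40, 40, 1000, 700, 1000))   # -> (40, 975)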

 

I'm happy with my current setup, but the next gen of nVidia and AMD cards - I won't be buying anything new before then - will be even more interesting due to AMD's ability to keep up at the high-end again :)

 

Also... I want TrueAudio to succeed. Ideally with peripherals, i.e. discrete DSP cards, to allow nVidia or older-generation AMD users to get the benefit of truly well-computed sound as well... ah, a man can dream... -_-

Edited by samm

Citizen of a country with a racist, hypocritical majority


Ah, well it was the [H] guys that made me think the 290X was designed for 4K:

 

 

Amazingly, when the resolution is cranked up to Ultra HD resolutions like 3840x2160, the new Radeon R9 290X outperforms the GeForce GTX TITAN. We saw as much as a 23% performance jump over the much more expensive GeForce GTX TITAN. In all of the previous pages the R9 290X was in-between the GTX 780 and GTX TITAN, closer to TITAN, but at Ultra HD 4K it just owns the GeForce GTX TITAN.

 

 

And this is all with DX11. Now I wonder if Mantle is as low-level as they say. If something as simple as fan speed on the 290X can deliver a double-digit improvement, what sort of speed could we see from a pedal-to-the-metal API?

All Stop. On Screen.


750W is of course ridiculous. But the power consumption *is* high - in the >200W range in games, at least with stock cooling. If cooled to better temperatures with an Accelero or water-cooled, it should use quite a bit less juice.

The problem is that both IHVs' projections about progress in manufacturing tech did not work out - they have been releasing GPUs on the same process node for several generations now, and are just pushing it to the limit.

Citizen of a country with a racist, hypocritical majority


Cypress/Northern Islands and Fermi were both 40nm, so the node cadence isn't really that unusual: 28nm debuted with the 7970. But the GPU manufacturers now face competition for fab capacity from mobile SoC makers, and when the fabs have to choose between selling to nV/AMD or to Apple (who are looking to shift most of their iPhone silicon away from Samsung for obvious reasons), well, you can guess who has the better influence there.

 

And yeah, while power consumption for a single-GPU card is, I believe, second only to the maligned GTX480, I'd still say it's well within reason. Indeed, with both CPU and GPU at stock clocks, I'd be okay running it off a ~450W PSU. It is, after all, only ~10-20% more than where the previous gen ended up power-wise.
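
For what it's worth, the rough budget behind that ~450W claim - every figure here is an assumption on my part, not a measurement:

# Assumed stock-clock loads for a single-290X gaming box.
loads_w = {
    "R9 290X under game load":  250,   # assumption, in line with the >200W figure above
    "quad-core CPU under load":  90,   # assumption
    "board, RAM, drives, fans":  50,   # assumption
}
total_w = sum(loads_w.values())
psu_w = 450
print(f"~{total_w}W total, {100 * total_w / psu_w:.0f}% of a {psu_w}W unit")
# -> ~390W, about 87% - tight, and leaves no headroom for overclocking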


L I E S T R O N G
L I V E W R O N G


You're right - regarding the timeline, my memory was really inaccurate. The HD 4770, containing the first 40nm GPU, debuted in early 2009, and Cayman as the last one debuted at the end of 2010, so a good two years between them. Almost the same time span as between Tahiti and Hawaii.

Edited by samm

Citizen of a country with a racist, hypocritical majority


I've wondered - given high-end, high-power-consumption cards and how commonplace CPU- or board-based graphics are - why someone hasn't thought of making a card that could be turned on and off like a hard disk, switching between a light-load mode and a 3D mode, with a manual toggle as well. There would probably need to be a motherboard standard too. Maybe there already is?

Na na  na na  na na  ...

greg358 from Darksouls 3 PVP is a CHEATER.

That is all.

 


It exists, but it's only really been implemented for mobile platforms. Check nVidia's Optimus, which is advertised on most notebooks, and there's an AMD equivalent whose name escapes me - though it's not quite as mature. Hopefully it makes the leap to the desktop soon, but then again, AMD's performance CPUs still don't have integrated graphics, and neither do Intel's -E CPUs.
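
On the Linux side there is at least a manual toggle of sorts already - the kernel's vga_switcheroo interface - though again it's really aimed at muxed laptop setups. Take this as a rough sketch (assumes root and a mounted debugfs), not something I've tried on a desktop board:

SWITCH_PATH = "/sys/kernel/debug/vgaswitcheroo/switch"

def vga_switcheroo(command: str) -> None:
    # "OFF" powers down whichever GPU isn't driving the display, "ON" powers it
    # back up; "DIGD"/"DDIS" queue a switch to integrated/discrete graphics.
    with open(SWITCH_PATH, "w") as f:
        f.write(command)

# vga_switcheroo("OFF")   # e.g. park the discrete card while on the desktop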

L I E S T R O N G
L I V E W R O N G


Wasn't there a third party that tried to implement that with high-power cards and the Intel integrated GPU platform? The problem was that if you connected your monitor to the graphics card you couldn't use the feature, and if you connected it to the integrated graphics you could, but you got throttled by the lower-performing components in the integrated graphics.

 

So it lowered performance a fair bit, to the point of being pointless.

