Recommended Posts

Purportedly much less powerful than the green monster's offerings, though.

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken

Link to post
Share on other sites

The nextbox slides claim 9x the GRays of nVidia, though one suspects they aren't actually measuring the same thing (peak vs average being the obvious one), and the 3-10x over equivalent non-accelerated RT performance is the more valid claim. IIRC that's round about the performance gain of the first set of RTX cards; so not great, but not terrible.

AMD's raytracing implementation should be quite interesting since, presumably, the Sony side won't be using a DXR implementation unlike xbox and (effectively) nVidia.

Edited by Zoraptor
11 hours ago, Keyrock said:

Supposedly, xXxBOX Series xXx has hardware ray tracing and some manner of super sampling. It stands to reason that PS5 and Big Navi will feature these also, given that they are all RDNA2.

Yup, it's obvious that AMD doesn't care about toppling Nvidia, seeing as they can marginally upgrade their GPUs to hit 60 or so FPS in ray tracing titles, and their cards will be in both consoles.  I mean, what more could they want?

Expect to see this ray tracing stuff being overused and murdered to death in next gen console titles and likewise on PC.

How many times has a new graphics feature been pimped out so rigorously in marketing?  Kinda weird but I'm no expert on gaming trends.

"America would be unrecognizable if it had ordered the separation of corporation and state like it orders separation of church and state."


Hardware T&L is comparable in the sense that both were done in software before getting dedicated hardware, although real-time ray tracing was never popular, whereas T&L was everywhere. Fully programmable vertex and pixel shaders are comparable in how much impact they had on how graphics are made, but less comparable in how the hardware supports it.

Async compute is the last time a hardware feature was pimped this much. There were some impressive performance gains, especially with lower end CPUs, but it didn't really change graphics.

Edited by AwesomeOcelot
13 hours ago, Zoraptor said:

The nextbox slides claim 9x the GRays of nVidia, though one suspects they aren't actually measuring the same thing (peak vs average being the obvious one), and the 3-10x over equivalent non-accelerated RT performance is the more valid claim. IIRC that's round about the performance gain of the first set of RTX cards; so not great, but not terrible.

AMD's raytracing implementation should be quite interesting since, presumably, the Sony side won't be using a DXR implementation unlike xbox and (effectively) nVidia.

Well, the RT and texture units share resources, so the effective capacity is going to be much lower. So... :<

 


Supposedly, the 3090 draws an astonishing 400W of power. No wonder they put that beefy AF heat sink on the FE. Anything that sucks that much power is going to put out huge amounts of heat, no way around that.

That would be pushing the 750W PSU I bought for the build, even if it is a high quality 750, especially with enthusiast OCing. Of course, I'm not shopping in the $1400 segment anyway, so it's a moot point.
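For what it's worth, the back-of-the-envelope numbers (the 400W figure is the rumor; the CPU and system draws are my assumptions, nothing here is measured):

```python
# Rough PSU headroom sketch using the rumored 400W GPU figure.
# cpu_w and rest_w are assumed typical high-load values, not measurements.
gpu_w = 400          # rumored RTX 3090 board power
cpu_w = 150          # assumed high-end CPU under load (more with enthusiast OCing)
rest_w = 75          # assumed motherboard, drives, fans, RAM

total_w = gpu_w + cpu_w + rest_w     # sustained system draw estimate
psu_w = 750
headroom_w = psu_w - total_w         # watts left under the PSU's rating
load_pct = total_w / psu_w * 100     # share of PSU capacity in steady state

print(total_w, headroom_w, round(load_pct))  # 625 125 83
```

Running a quality PSU at ~83% sustained load works, but transient spikes and OC headroom eat the remaining margin fast, which is the "pushing it" part.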

Edited by Keyrock

I wonder if there is beer on the sun


I hate heat and noise, so I usually go for whatever mid-end card is the most power and price-performance efficient, especially since I'm finding current "realistic" 3D graphics more and more hideous. I'm hoping AMD's new series will be slightly more inspiring than nvidia's. Most likely, I won't end up buying a card this generation anyways, but I'd prefer to at least be able to recommend AMD cards to others this time around...

Edited by Bartimaeus

Put fascists and sociopaths on your ignore list.

Quote

Never believe that anti-Semites are completely unaware of the absurdity of their replies. They know that their remarks are frivolous, open to challenge. But they are amusing themselves, for it is their adversary who is obliged to use words responsibly, since he believes in words. They even like to play with discourse for, by giving ridiculous reasons, they discredit the seriousness of their interlocutors. They delight in acting in bad faith, since they seek not to persuade by sound argument but to intimidate and disconcert. If you press them too closely, they will abruptly fall silent, loftily indicating by some phrase that the time for argument is past.

 


I know that the 3090 isn't meant for the general market, it's for the Bokishis of the world, but 400W is pretty extreme, even for an ultra luxury card. I don't think any of the infamous Fermi space heaters (fun fact: I had a GTX 460) had a power draw that high.

It could all be hogwash, though. I guess we'll know for sure in about a week.

7 hours ago, Keyrock said:

I know that the 3090 isn't meant for the general market, it's for the Bokishis of the world, but 400W is pretty extreme, even for an ultra luxury card. I don't think any of the infamous Fermi space heaters (fun fact: I had a GTX 460) had a power draw that high.

The formal TDP of the GTX 480 was 'only' 250W and the recommended PSU was (a quality) 600W. OK, Seasonic is a PSU manufacturer and has an interest in selling more powerful and profitable units, but theirs is the only rec we have for the '3090', and it is 850W.

When it comes to power, it probably does need to be pointed out that the 2080Ti, pretty efficient for its size and computing power at stock, can itself go up to ~600W in the Kingpin watercooled version.

Quote

It could all be hogwash, though. I guess we'll know for sure in about a week.

I have to admit I've grown rather skeptical about a lot of the leaks.

I was skeptical when the initial claims were 50% more performance and 50% less power draw, because that was TSMC's node improvement claim with the 'or' replaced with an 'and'. And I'm skeptical now that the claims seem to imply 50% better performance but more than 50% greater power draw, as that is literally no improvement in power/performance terms (excluding tensor).
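To spell out why that second rumor is a wash: perf-per-watt is a ratio, so scaling both sides by the same factor cancels out. With normalised, purely illustrative numbers:

```python
# Why "+50% performance with +50% power" is zero efficiency gain.
# All figures are normalised to a baseline of 1.0; purely illustrative.
base_perf, base_power = 1.0, 1.0

# TSMC-style node claim: +50% perf OR -50% power (pick one, not both)
eff_perf_route = (base_perf * 1.5) / base_power          # 1.5x perf/watt
eff_power_route = base_perf / (base_power * 0.5)         # 2.0x perf/watt

# Current rumor: +50% perf AND +50% power draw
eff_rumor = (base_perf * 1.5) / (base_power * 1.5)       # back to 1.0x

print(eff_perf_route, eff_power_route, eff_rumor)  # 1.5 2.0 1.0
```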

On 8/22/2020 at 3:44 AM, Keyrock said:

So, theoretically this is the RTX 3090 Founder's Edition:

The price tag floating around the rumor mill is a STAGGERING $1400.

Hmm... this is almost all I was thinking of spending if I were to build a new PC........ 😅

58 minutes ago, AwesomeOcelot said:

3090 seems like a deep learning card. Probably wouldn't even be that much better for gaming than a 3080 and certainly not good price/performance wise.

But imagine the e-peen.


The '3090' being some sort of 'failed'/non-certified pro card, like the Radeon VII (16GB HBM2, so also massive overkill for consumers) was to the MI60, might be possible. Recent rumour is also of a 20GB '3080+', which is kind of weird to do in a vacuum but makes sense for 'failed' '3090's. Personally, at this point I suspect the high memory cards may have a very large relative increase in RT cores to try and get 4k raytracing at high framerates, with the '3070' and lower still having raytracing at more gimmick level.

They're also supposed to be on a 7nm node, again. I'd presume Samsung 7nm, since the TDP estimates have been absolutely consistent and are hard to reconcile with TSMC's 7nm (plus zero tape-out rumours for TSMC).

Oh well, guess we'll find out in a couple of days.

 

 

Edited by Zoraptor

If the rumored specs are real, then the 3090 will definitely be faster than the 3080 just on the basis of ~1000 more CUDA cores. I agree that 24 GB of memory is overkill for gaming, though. Developers tend to optimize for the lowest common denominator, and both xXxBOX Series xXx and PS5 have 16 GB of memory, but that's shared between CPU and GPU. 10 or 12 GB of VRAM should be plenty for the next 6 years or so. I'm sure there will be a game here or there that takes advantage of more, but, for the most part, you're going to get very diminishing returns for anything above 10 GB. Heck, 8 GB will probably be plenty for most people. 24 GB, in terms of gaming, will be useful for Bokishi running on a 16K monitor and not a lot of other people.
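The console arithmetic behind that, roughly (the 16 GB shared pools are the announced figures; the OS/CPU-side split is my assumption, not anything official):

```python
# Sketch of the "lowest common denominator" VRAM argument.
# Consoles share one 16GB pool between the OS, CPU-side game data, and graphics.
console_total_gb = 16
os_and_cpu_side_gb = 6   # assumed share for OS reservation + CPU-side game data
gpu_budget_gb = console_total_gb - os_and_cpu_side_gb  # what graphics can use

card_vram_gb = 24        # rumored 3090 VRAM
surplus_gb = card_vram_gb - gpu_budget_gb  # memory most games won't touch

print(gpu_budget_gb, surplus_gb)  # 10 14
```

If cross-platform games budget around that ~10 GB figure, most of a 24 GB card sits idle outside of extreme resolutions or niche workloads.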

Edited by Keyrock


That's supported too by the next tier card having 'only' 10GB.

(I would guess that the actual reason for it having 24GB is that it really is a Titan replacement, and the Titan RTX had 24GB. Ironically, that would make the card a significant improvement in value compared to its predecessor even at 1400usd+. If it isn't... then something like a lot of extra RT processing would have to be there. Otherwise it's just burning money- not so bad when you charge what you want- and adding even more power overhead. While the 16k remark may have been facetious, the problem at higher resolutions would be that, while the RAM might be better utilised, the card would struggle to push enough frames, especially raytraced)

Edited by Zoraptor

Stumbled on this on Reddit:

3090

https://www.gainward.com/main/vgapro.php?id=1090&tab=ov&lang=en

3090 "GS"

https://www.gainward.com/main/vgapro.php?id=1089

3080

https://www.gainward.com/main/vgapro.php?id=1088

3080 "GS"

https://www.gainward.com/main/vgapro.php?id=1087

The datasheet says "7nm" as well, though not exactly which node it is.

In case they remove it:

[screenshot of the Gainward spec sheets, listing "7nm" for the cards above]

Edited by Azdeus


Jensen has an amazing kitchen. 

3080 and 3070 are cheaper than I expected, still ludicrously expensive, mind you, but less so than what I feared.

I look forward to independent benchmarks.


AMD better pull a rabbit out of their hat, but I'm really looking at that 3080 now. I'd prefer a 3080 with more memory though, so I might cave in and buy a 3090.


If past nvidia releases are anything to go by, the 3090s will probably be out of stock long enough that you'll get news on the AMD series in a couple of months before you have a realistic shot of buying them.

Edited by Bartimaeus


 

44 minutes ago, Bartimaeus said:

If past nvidia releases are anything to go by, the 3090s will probably be out of stock long enough that you'll get news on the AMD series in a couple of months before you have a realistic shot of buying them.


Maybe, though the store where I buy my stuff has always had good stocks of GPUs.

 

