
Nvidia RTX Series


Bokishi


10 hours ago, Zoraptor said:

The alleged and never announced 16/20 GB versions of the 3070/80 have equally allegedly been cancelled. Cue lots of wags asking if the 10GB version of the 3080 is still due to be released sometime or not...

The rumour of the cancellation of a rumoured product is mostly interesting because of the other rumour floating around: that nVidia wants to do a TSMC-based refresh of the 3000 series already. Not that the extra memory made much sense anyway, except to increase thermals even more, unless you were a content-creator type, in which case nVidia would probably prefer you buy a 3090 or a pro card anyway.

I might have to cancel my theoretical order of a hypothetical 3080 then :(

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken


I never believed there would be a 20GB version of the 3080. So far you rarely need over 6GB even at 4K, and in the few games that go over 8GB the overage is so minimal, with no real graphical benefit, that devs could easily get it under 10GB. A 16GB 3070 was possible since it uses GDDR6; they've done 1/2GB, 2/4GB, and 3/6GB versions of cards before, but 10/11GB makes more sense in terms of use case.


12GB would have been a sensible compromise for the 3080. I hit more than 6GB of VRAM used at 1440 ultrawide in games like Metro Exodus (not on the top settings, either), and Watch Dogs Legion's top spec calls for 11GB - and it does look a bit bad that there's already a game the 3080 is, at least technically, below top spec for. That would have required a bigger bus or a 970-type compromise though, and the 3080 is already a large chip on quite tight margins with regard to heat.

1 hour ago, Azdeus said:

I might have to cancel my theoretical order of a hypothetical 3080 then :(

I wouldn't advise it, at least not yet. The TSMC switch is purely theoretical, and the release would probably be June-ish even if they'd made the decision today. Samsung 8nm and TSMC aren't design compatible; the only Ampere chip we know nVidia has ready (and already in production) at TSMC is GA100, which isn't consumer, and the spare capacity at TSMC freed up by Huawei/Apple has already been taken up, ironically mostly by AMD, though they'd also likely be competing for space with Intel by that time.


I don't think Metro Exodus uses over 6GB of VRAM at 1440 ultrawide; at least it doesn't at 4K extreme in benchmarks. It might allocate over 6GB, but that's usual, because a lot of games allocate 1 or 2GB above what they actually use. Watch Dogs Legion comes with the 3080, so when I finally get a Zen 3 and complete my build I'll find out whether it uses over 10GB, but I sincerely doubt it. Technically Doom Eternal is one of the few games that uses over 8GB, but people on 8GB GPUs wouldn't be able to tell you that, because it doesn't affect gameplay or fidelity. There will come a time when the top graphical settings in AAA games need more than 10GB or 11GB, but that time is probably '22/23. The reason for 10GB on the 3080, I'm assuming, is that the memory is expensive.


FFXV with the 4K "texture pack" and its TRAM setting at Highest sits at a constant 9.5-10.5GB of VRAM used (my 2080 Ti has 11GB). If you have 8GB or less I wouldn't recommend using Highest in that game, no matter the resolution, since you'll get more stutter/frame-pacing issues - although if you lower TRAM too much you then get much more noticeable texture pop-in "lag" (this with the game installed on an SSD). I also suspect Death Stranding could/would be up there, but for some reason they either purposefully or unintentionally (bug) have some kind of low cap on VRAM, so it only detects/says I have/uses around 5GB max despite my having more than twice that. Which is annoying. I think going forward more games need settings re: how much VRAM you want utilized for certain things like texture loading or whatnot.

But yeah, that's not typical of most games at the moment. They "cap" or simply don't need that much VRAM most of the time.

Edit: I should buy Metro Ex. on GOG when it's on sale someday. I don't care about playing it but I'm still curious re: performance. lol

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts

FFXV definitely doesn't need over 8GB to look the way it does, it's partly down to how the engine works, and also that it wasn't developed around such high-resolution textures. Some flight simulators also like to load a ton of textures into VRAM. Death Stranding is just a much better optimized game, it doesn't need more than 6GB. With any type of RAM it's better to use all of it, but heavy usage is often a sign of poor development practice if there's little visible benefit. I wouldn't be surprised if FFXV were patched to compensate for the 3080's 10GB, as it's an Nvidia-sponsored title, with no visual difference for the player.

For a long time there have been mods, especially for Bethesda engine games, that use an incredible amount of VRAM. They don't look good compared to modern AAA games, but due to the way the game is optimized, the high-resolution textures chew through VRAM. Modders don't have the development skills, or the access to the engine, to optimize their assets.

Nvidia actually analyzed a lot of AAA games, including Metro Exodus, and found they only use 4-6GB of VRAM. Hopefully DirectStorage/RTX IO will also let games swap textures much faster, allowing much higher-resolution textures with less VRAM. I hope they develop more procedural and compression tech too, because for me 65GB texture packs often aren't worth the marginal upgrade. The benefit of playing games from an SSD outweighs higher-resolution textures; I have about 7TB of SSD storage, but it's not all dedicated to gaming, and I'm probably above average.
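As a rough back-of-the-envelope sketch of why high-res texture packs chew through VRAM (the numbers are illustrative assumptions: BC7-class block compression stores roughly 1 byte per texel, uncompressed RGBA8 stores 4, and a full mip chain adds about a third on top of the base level):

```python
def texture_vram_bytes(width, height, bytes_per_texel, mips=True):
    """Approximate VRAM footprint of one 2D texture.

    A full mip chain converges to 4/3 of the base level
    (1 + 1/4 + 1/16 + ...).
    """
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mips else base

# Hypothetical comparison: one 4K texture, compressed vs. uncompressed.
bc7 = texture_vram_bytes(4096, 4096, 1)    # BC7-style, ~1 byte/texel
rgba8 = texture_vram_bytes(4096, 4096, 4)  # raw RGBA8, 4 bytes/texel
print(f"BC7:   {bc7 / 2**20:.1f} MiB")     # ~21.3 MiB
print(f"RGBA8: {rgba8 / 2**20:.1f} MiB")   # ~85.3 MiB
```

A few hundred poorly compressed 4K textures resident at once adds up to tens of gigabytes, which is roughly the situation those mod packs create.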


58 minutes ago, AwesomeOcelot said:

FFXV definitely doesn't need over 8GB to look the way it does, it's partly down to how the engine works,

Oh, I agree. And the game generally looks/behaves fine on High TRAM vs. Highest as long as the general "power" of your machine is up to par.
A lot of that 65GB is from the higher-res videos rather than the textures, tho. Every single movie/pre-rendered in-game cutscene in the "4K" pack version is at least 3 or 4 times the file size (some are 1GB). Which imo is worth it, because they look like crud on a big screen without them. :)

58 minutes ago, AwesomeOcelot said:

Death Stranding is just a much better optimized game, it doesn't need more than 6GB

I'd believe that more if the settings menu didn't look like it was capping and the game just "naturally" played without going over 5GB - plus the not detecting my VRAM properly in the first place. Edit: it doesn't even allow you to have ... I think it was AF ... on properly. You can force it in the GPU control panel, but it's not done properly via the game settings. Distance looks a fair bit worse without doing it.
[Attached screenshot: dsgmem.jpg - Death Stranding graphics memory readout]

Edited by LadyCrimson

That looks like how games allocate VRAM. So 4.5GB is how much it's using, and 9.2GB is the allocation size. The game only allocates 7.1GB if you're on a 2080, which has just 8GB. Death Stranding is not the only game that allocates less VRAM than what the GPU has available. TechPowerUp has its VRAM usage at 4.9GB on a 2080 Ti.
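If anyone wants to poke at the allocation-vs-usage distinction themselves, here's a minimal sketch using the pynvml bindings for NVIDIA's NVML library. One caveat, which actually supports the point: NVML reports memory the driver has allocated, which is part of why overlay numbers tend to track allocation rather than the true working set. Device index 0 and sufficient privileges are assumptions on my part.

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU (assumption)

# Card-level counters: "used" here means allocated by the driver,
# not memory the game is actively touching.
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total: {info.total / 2**30:.1f} GiB, "
      f"allocated: {info.used / 2**30:.1f} GiB")

# Per-process breakdown; usedGpuMemory can come back as None
# without sufficient privileges on some systems.
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    mem = "n/a" if p.usedGpuMemory is None else f"{p.usedGpuMemory / 2**20:.0f} MiB"
    print(f"pid {p.pid}: {mem}")

pynvml.nvmlShutdown()
```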

Edited by AwesomeOcelot

Regardless, I'd like higher-resolution texture packs all the time, with more/higher VRAM usage options. Or whatever. I'm tired of low-res-looking textures that only look good from "far away." 😛

Oh, also, I forgot:

1 hour ago, AwesomeOcelot said:

I wouldn't be surprised if FFXV is patched to compensate for the 3080's 10GB as it is a Nvidia sponsored title, with no visual difference for the player.

Never going to happen, since the game works fine on High; one doesn't have to use Highest.
...well, I could MAYBE see them doing something if they made/released an "improved" complete-complete version for PS5 and then finally added DLSS 2+ and other tweaks to the PC version while they're at it. Otherwise ... nah. They're totally done with it.


Since we're in a new generation of consoles with double the RAM, and potentially 12GB of VRAM, we'll start seeing far more games with higher-resolution textures. DirectStorage/RTX IO might also change the way games are developed, with high-res textures streaming in and out and much less pop-in. I think the majority of developers didn't push VRAM use higher because 10GB-plus graphics cards are probably around 2% of the market; if you take out LAN centres and laptops, it might be 5%. Almost all AAA developers also had two other platforms to think about, each with 8GB of shared memory, which usually meant 6GB max for VRAM.
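As a rough sketch of that shared-memory arithmetic (the OS reservation figure is my assumption, not a published spec):

```python
# Last-gen console memory budget, illustrative numbers only.
total_gb = 8.0        # unified memory shared between CPU and GPU
os_reserved_gb = 2.0  # rough guess at the OS/system reservation
game_budget_gb = total_gb - os_reserved_gb

# The GPU can at most claim the whole remaining budget, and in
# practice the CPU side needs a chunk of it too, so ~6GB is the
# practical ceiling for "VRAM" on an 8GB shared-memory console.
print(f"max GPU-visible budget: ~{game_budget_gb:.0f} GB")
```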

TechRadar noted that they couldn't get a 2080 Ti to maintain 60fps at 4K max settings, but the 3080 can. I don't tend to play games under 60fps.

Edited by AwesomeOcelot

17 minutes ago, Keyrock said:

The 3080 just came out and there's already panic that it's insufficient?

🤣

Naturally. You can never have enough RAM, VRAM, or MHz. 😛


3 minutes ago, Keyrock said:

The 3090 is right there, it's got 2.4X the VRAM for only 2X the price. Think of the value! :shifty:

And only 10% better performance than the 3080 overall ;)

Video cards are weird like that; bigger numbers don't always mean better.

What bothers me the most, actually, is when they advertise something as a "1440p card" yet the cards that are advertised as "4K" do better at 1440p than the "1440p card" does. I'm sorry, if you advertise something as such, it should perform better at that resolution than any other card.


1 hour ago, Keyrock said:

The 3090 is right there, it's got 2.4X the VRAM for only 2X the price. Think of the value! :shifty:

Value, schamlue, the only thing that matters is higher numbers always look smexy.
Kind of like all that pointless lightsaber spinning. :thumbsup:


14 hours ago, Zoraptor said:

I wouldn't advise it, at least not yet. The TSMC switch is purely theoretical, and the release would probably be June-ish even if they'd made the decision today. Samsung 8nm and TSMC aren't design compatible; the only Ampere chip we know nVidia has ready (and already in production) at TSMC is GA100, which isn't consumer, and the spare capacity at TSMC freed up by Huawei/Apple has already been taken up, ironically mostly by AMD, though they'd also likely be competing for space with Intel by that time.

Of course, I was just hopping on the bandwagon for a bit.

If the AMD Radeon releases with impressive enough ray tracing benchmarks, and I still haven't received the 3080, then I'd probably consider switching.

Redesigning their GPU for an incompatible node in 8 months sounds impressive. I'd wager that if they got any wafers from TSMC at all, the redesign would take at least a year.

Edited by Azdeus



I'd say that Jun/Jul 2021 is at least plausible. I certainly find the rumours of bulk production at TSMC in 2021 rather unlikely, given the sheer volume of wafers bulk production for nVidia implies, but I could see the top SKUs there by the middle of next year if they've moved to get capacity quickly enough - and Jensen hasn't worked his unique interpersonal magic.

I'd suspect a lot of work had already been done for a potential TSMC release. GA100 is already there, and as was discussed here fairly extensively at the time, nVidia was stating outright until fairly recently that TSMC would be used for some consumer units. The initial chip lineup for Ampere was supposed to be GA100/2/3/4; we currently have the A100 with GA100, the 3090/80 with GA102, and the 3070/60? will be GA104, so there is a potential 'missing' GA103. Personally, I would not be in the least bit surprised if GA102/3 were TSMC/Samsung versions of the same basic chip, with the TSMC one shelved due to no capacity being available, or Samsung offering a deal too good to refuse.


10 hours ago, Keyrock said:

The 3080 just came out and there's already panic that it's insufficient?

🤣

It started before it was even out. Never mind that AAA games use 4-6GB now; people were panicking over not being able to run their gigantic veiny Skyrim mods. They need the 8K textures to get right up close and appreciate the detail.


3070 reviews are out. A little less than what Nvidia suggested, but still close to the 2080 Ti at 4K; perhaps memory bandwidth is limiting it at 1440p and under. RTX performance shows a bit of an increase over the 2080 Ti, so it performs within a few percentage points of it in RTX-enabled games, and slightly under it in games without RTX. The 2080 Ti is factory overclocked as well. AIB 3070s are almost certainly going to be slightly better overall than the 2080 Ti.


On 10/23/2020 at 3:00 AM, Azdeus said:

If the AMD Radeon releases with impressive enough ray tracing benchmarks, and I still haven't received the 3080, then I'd probably consider switching.

It won't. Leaked Port Royal benchmarks suggest it can't even come within the margin of error of DLSS + RTX performance. There may be other reasons to buy Radeon; ray tracing performance isn't one of them.


1 hour ago, ComradeMaster said:

It won't. Leaked Port Royal benchmarks suggest it can't even come within the margin of error of DLSS + RTX performance. There may be other reasons to buy Radeon; ray tracing performance isn't one of them.

Synthetics mean absolutely nothing. The Radeon VII did extremely well in synthetics, for instance, but was mediocre in real gaming.



Synthetics don't mean nothing*, but they certainly have to be taken with a large grain of salt. You'd have to question whether a benchmark developed for, essentially, an nVidia-branded technology is going to reflect reality, or whether it has the RT equivalent of 64x tessellation applied.

*The classic example is probably Zen 2, where the increased cache completely invalidated some benchmarks because the benchmark would literally run from cache. The Radeon VII was a debadged/non-certified Instinct card, so it did incredibly well at benchmarks that relied on compute or memory bandwidth, since it had 4 stacks of HBM2 and ~1TB/s of bandwidth.
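A minimal sketch of that cache effect, assuming nothing about any particular benchmark suite: stream over buffers of increasing size and watch the apparent "memory bandwidth" fall once the working set no longer fits in cache (sizes and results are machine-dependent):

```python
import time
import numpy as np

for mib in (1, 4, 16, 64, 256):
    a = np.ones(mib * 2**20 // 8)  # float64 buffer of `mib` MiB
    a.sum()                        # warm-up pass
    reps = 20
    t0 = time.perf_counter()
    for _ in range(reps):
        a.sum()                    # streams the whole buffer once
    dt = (time.perf_counter() - t0) / reps
    print(f"{mib:4d} MiB: {a.nbytes / dt / 2**30:6.1f} GiB/s")
```

On a big-cache CPU the small buffers report several times the DRAM figure, which is exactly the trap: a benchmark whose working set fits in cache is measuring the cache, not the memory subsystem.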

