
Nvidia RTX Series


Bokishi


6 hours ago, Keyrock said:

For a workstation Titan-class card, I could see that. HBM2 for a gaming card is a mistake IMHO, a mistake AMD has already made previously and one Lisa Su would likely not repeat. HBM is just too prohibitively expensive for an increase in bandwidth that isn't meaningful enough (for gaming) at this point to justify the extra cost. For compute, on the other hand, HBM is great.

HBM for a main sequence consumer card would be a mistake, though to be fair it isn't that expensive. But HBM gets rid of one major potential problem Biggest Navi has with GDDR6 - memory bus constraints - while also reducing wattage. 1 TB/s of bandwidth might be overkill for the Radeon VII, but it may well be justified with raytracing added. The alternatives are a 512-bit GDDR6 bus (too hot, too large), 'crippled' 970/console-like VRAM configs, and/or adding a lot of S/EDRAM cache, which is also very expensive.

If they can get a decent top end performance boost from HBM and competitive top end raytracing I think they'd go for it.
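(Putting some rough numbers on the bus-width trade-off above - a minimal sketch with illustrative per-pin data rates; the Radeon VII HBM2 figures are the publicly listed ones, the GDDR6 speeds assume typical 16 Gbps-class parts.)

```python
# Peak bandwidth = (bus width in bits / 8) * per-pin data rate (Gbps) -> GB/s.
# Illustrative numbers only.

def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

print(peak_bandwidth_gbs(256, 16))    # 512.0  -> 256-bit GDDR6 @ 16 Gbps
print(peak_bandwidth_gbs(512, 16))    # 1024.0 -> 512-bit GDDR6 (hot, huge board)
print(peak_bandwidth_gbs(4096, 2))    # 1024.0 -> Radeon VII's 4-stack HBM2, ~1 TB/s
```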


47 minutes ago, Zoraptor said:

HBM for a main sequence consumer card would be a mistake, though to be fair it isn't that expensive. But HBM gets rid of one major potential problem Biggest Navi has with GDDR6 - memory bus constraints - while also reducing wattage. 1 TB/s of bandwidth might be overkill for the Radeon VII, but it may well be justified with raytracing added. The alternatives are a 512-bit GDDR6 bus (too hot, too large), 'crippled' 970/console-like VRAM configs, and/or adding a lot of S/EDRAM cache, which is also very expensive.

If they can get a decent top end performance boost from HBM and competitive top end raytracing I think they'd go for it.

I have heard rumblings about Big Navi having a very sizable cache as a way to address the bandwidth discrepancy between GDDR6 and GDDR6X.

🇺🇸RFK Jr 2024🇺🇸

"Any organization created out of fear must create fear to survive." - Bill Hicks


Apparently some 3080 cards have problems with crashing when they hit high boosts - allegedly including the EVGA card I still have on pre-order. Good job I have an FE on order now, which was also over $100 cheaper.

3 hours ago, LadyCrimson said:

Hubby has an expensive 55" 4K he's using for work (instead of 4 monitors like he was doing). It sits on his desk, and when I sit in front of it for a while, it doesn't seem too big at all.

Yep, I almost always have two 20" 1080p windows top left and right, and an ultrawide window at the bottom. It works really well as four 1080p monitors - that was the plan, since I was replacing a multi-monitor setup; I've had one since 1996. It's not quite as good for single-window workflow as a small monitor would be, just OK, and I can imagine those problems being exacerbated with a monitor twice the size.
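(For the record, the arithmetic works out exactly - a 4K panel is precisely a 2x2 grid of 1080p regions, which is why one big screen can stand in for the old multi-monitor grid; trivial sanity check below.)

```python
# A 3840x2160 panel tiles exactly four 1920x1080 regions.
uhd, fhd = (3840, 2160), (1920, 1080)
assert uhd[0] == 2 * fhd[0] and uhd[1] == 2 * fhd[1]
print((uhd[0] * uhd[1]) // (fhd[0] * fhd[1]))  # 4
```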


I want a monitor set that opens up like a dartboard cabinet, with an ~30" screen in the middle and two more-or-less square displays inside the "door" panels. Running 3x27" right now but I really neither need nor want the screens on either side to be that wide as they're not used for displaying games (with rare exceptions). But matching not just the height but also the pixel density with various monitors of different sizes right now is nigh-impossible so I have to live with just having three identically sized ones.

L I E S T R O N G
L I V E W R O N G


2 hours ago, Keyrock said:

I have heard rumblings about Big Navi having a very sizable cache as a way to address the bandwidth discrepancy between GDDR6 and GDDR6X.

Yes, though that also has its problems, since a big cache takes up a lot of extra on-die space; much like HBM, though, it should also help a lot with raytracing. The trouble is that a lot of the leaks are mutually incompatible; even with a lot of SRAM, 16GB on a 256-bit bus is... unbalanced. HBM would solve all their problems though - not on-die, reduced wattage, increased bandwidth - and this is exactly the sort of situation it was intended for.

I'll freely admit I'm in a minority expecting it though.
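(To illustrate the cache-vs-bus argument with a toy model - all numbers made up, nothing AMD has confirmed: if a fraction of memory traffic is served from an on-die cache, only the misses touch GDDR6, so a narrow bus behaves like a wider one.)

```python
# Toy model: with cache hit rate h, only (1 - h) of requests reach GDDR6,
# so the same DRAM bus can service demand of dram_bw / (1 - h)
# (ignoring the cache's own bandwidth limit, latency, and real access patterns).

def effective_bandwidth_gbs(dram_bw_gbs: float, hit_rate: float) -> float:
    return dram_bw_gbs / (1.0 - hit_rate)

dram = 512.0  # e.g. 256-bit GDDR6 @ 16 Gbps
for h in (0.0, 0.25, 0.5, 0.75):
    print(f"hit rate {h:.0%}: behaves like ~{effective_bandwidth_gbs(dram, h):.0f} GB/s")
# A 50% hit rate already matches a hypothetical 512-bit bus (1024 GB/s).
```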


50 minutes ago, Humanoid said:

I want a monitor set that opens up like a dartboard cabinet, with an ~30" screen in the middle and two more-or-less square displays inside the "door" panels. Running 3x27" right now but I really neither need nor want the screens on either side to be that wide as they're not used for displaying games (with rare exceptions). But matching not just the height but also the pixel density with various monitors of different sizes right now is nigh-impossible so I have to live with just having three identically sized ones.

Would a super ultrawide be the equivalent?


15 minutes ago, AwesomeOcelot said:

Nah, I'm happy with 16:9 as the primary gaming area, and I suspect ultrawide screens (though I've never actually used one before) would just introduce more hassle, as I don't think contemporary OSes can treat them the same way they would three separately addressable displays. I still want my games full-screen, with adaptive sync and all that, but without interfering with the Windows desktop space on either side.

In the end, I just want somewhere to put my browser window (which goes on my right screen) and various other programs like file explorer windows, my music player, and email client (on the left), and the natural breaks between separate screens are the most straightforward and intuitive solution to that problem.

L I E S T R O N G
L I V E W R O N G


3 hours ago, Keyrock said:

I have heard rumblings about Big Navi having a very sizable cache as a way to address the bandwidth discrepancy between GDDR6 and GDDR6X.

 

1 hour ago, Zoraptor said:

Yes, though that also has its problems, since a big cache takes up a lot of extra on-die space; much like HBM, though, it should also help a lot with raytracing. The trouble is that a lot of the leaks are mutually incompatible; even with a lot of SRAM, 16GB on a 256-bit bus is... unbalanced. HBM would solve all their problems though - not on-die, reduced wattage, increased bandwidth - and this is exactly the sort of situation it was intended for.

I'll freely admit I'm in a minority expecting it though.

The thing I've read about that "big cache" is that it's a misreading of "Mb" as "MB", which does lend the rumor some credence - it wouldn't take up much of the die.
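(The Mb/MB mix-up in one line - the 128 below is just a placeholder to show the scale, not a leaked spec.)

```python
megabits = 128
print(megabits / 8)  # 16.0 MB - 8x smaller than the misread "MB" figure would suggest
```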

1 hour ago, AwesomeOcelot said:

Apparently some 3080 cards have problems with crashing when they hit high boosts - allegedly including the EVGA card I still have on pre-order. Good job I have an FE on order now, which was also over $100 cheaper.

Yep, I almost always have two 20" 1080p windows top left and right, and an ultrawide window at the bottom. It works really well as four 1080p monitors - that was the plan, since I was replacing a multi-monitor setup; I've had one since 1996. It's not quite as good for single-window workflow as a small monitor would be, just OK, and I can imagine those problems being exacerbated with a monitor twice the size.

Igor had some information about it being related to the power filtering on the AIB cards: some of the partners decided to go with much cheaper POSCAP capacitors, and the resulting signal noise causes instability at higher clocks. The ASUS 3080 TUF that I ordered has 6/6 of the more expensive MLCC caps.
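(For anyone wondering why the cap choice matters: a rough, made-up-numbers illustration of capacitor impedance at high frequency - MLCCs have much lower parasitic inductance (ESL) than big polymer caps, so they filter fast switching noise better. None of these component values come from Igor's measurements.)

```python
import math

# |Z| of a real capacitor: ESR plus the difference between inductive (ESL)
# and capacitive reactance. Ballpark component values, purely illustrative.
def impedance_ohms(c_f: float, esr_ohm: float, esl_h: float, f_hz: float) -> float:
    x_c = 1.0 / (2 * math.pi * f_hz * c_f)   # capacitive reactance
    x_l = 2 * math.pi * f_hz * esl_h          # inductive reactance from ESL
    return math.hypot(esr_ohm, x_l - x_c)

f = 100e6  # ~100 MHz switching noise
print(f"POSCAP-ish: {impedance_ohms(330e-6, 0.010, 1.5e-9, f):.2f} ohm")  # ~0.94
print(f"MLCC-ish:   {impedance_ohms(4.7e-6, 0.003, 0.3e-9, f):.2f} ohm")  # ~0.19
```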

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken


12 minutes ago, Humanoid said:

Nah, I'm happy with 16:9 as the primary gaming area, and I suspect ultrawide screens (though I've never actually used one before) would just introduce more hassle, as I don't think contemporary OSes can treat them the same way they would three separately addressable displays. I still want my games full-screen, with adaptive sync and all that, but without interfering with the Windows desktop space on either side.

They should treat 16:9 the same as a normal screen and just not use the two portions on either side (on most monitors I've used), the same way 4:3 displays on a 16:9 screen. VDM can give you virtual displays on the same monitor; I haven't used it personally though, I use Divvy to divide my screen. Some TVs have picture-in-picture - I haven't tried it, but mine does support it, and it would be cool to output from two ports on my graphics card to the same monitor.


29 minutes ago, Azdeus said:

Igor had some information about it being related to the power filtering on the AIB cards: some of the partners decided to go with much cheaper POSCAP capacitors, and the resulting signal noise causes instability at higher clocks. The ASUS 3080 TUF that I ordered has 6/6 of the more expensive MLCC caps.

My FE should have 2/6 expensive caps, and the EVGA had 1/6 (just saw some are 2/6). The ones affected most are the 0/6 boards. I hope it's a simple fix for people - just limit the boost clocks. I think people will make a massive deal out of nothing, like the 970 memory and the "Zen 2" (not sure which Ryzen tbh) boost clock controversy. We don't even know how widespread the problem is though.
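(If someone did want to cap boosts as a stopgap, a hedged sketch - assuming a recent NVIDIA driver whose nvidia-smi supports clock locking, and admin rights; the clock values are just an example, not a recommendation.)

```python
import subprocess

def lock_gpu_clocks(min_mhz: int, max_mhz: int) -> None:
    """Pin the GPU core clock range via `nvidia-smi --lock-gpu-clocks`."""
    subprocess.run(["nvidia-smi", "-lgc", f"{min_mhz},{max_mhz}"], check=True)

def reset_gpu_clocks() -> None:
    """Undo the lock via `nvidia-smi --reset-gpu-clocks`."""
    subprocess.run(["nvidia-smi", "-rgc"], check=True)

if __name__ == "__main__":
    lock_gpu_clocks(210, 1900)  # e.g. keep boost under ~1.9 GHz
```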

Edited by AwesomeOcelot

16 minutes ago, AwesomeOcelot said:

My FE should have 2/6 expensive caps, and the EVGA had 1/6. The ones affected most are the 0/6 boards. I hope it's a simple fix for people - just limit the boost clocks. I think people will make a massive deal out of nothing, like the 970 memory and the "Zen 2" (not sure which Ryzen tbh) boost clock controversy. We don't even know how widespread the problem is though.

Yeah, it's impossible to know really, since we don't actually know how many people have cards in their hands. Jayz2cents pointed out that there might also be people whose cards can't reach any high boost clocks - a bad case or bad case ventilation, or just living in a place that is too hot to let the boost go high enough - and such people aren't going to notice the problem.

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken



Quote

During our mass production QC testing we discovered a full 6 POSCAPs solution cannot pass the real world applications testing. It took almost a week of R&D effort to find the cause and reduce the POSCAPs to 4 and add 20 MLCC caps prior to shipping production boards, this is why the EVGA GeForce RTX 3080 FTW3 series was delayed at launch. There were no 6 POSCAP production EVGA GeForce RTX 3080 FTW3 boards shipped.

EVGA's 6-POSCAP boards were review cards only; they didn't sell any. So I think all ASUS, FE, and EVGA cards should be good.

Edited by AwesomeOcelot

Glad to know, if I were in the market for a 30xx, that I wouldn't be the only foolish idiot wanting a 3090. Not that I'd wait in line for one. I prefer waiting six months (or whatever). To me, buying GPUs (or any expensive tech item) is like buying games now: you wait for the patches/upgrades first. hahaha...

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts

DLSS 2.1 is insane - not necessarily useful to anyone outside of 8K, but damn interesting what it can do to 240p content. If I were on a budget with a 3060, I might consider 1080p -> 2160p DLSS ultra performance as a viable option over 1440p. I can't really see any benefit to DSR supersampling combined with DLSS ultra performance compared to just using balanced or quality DLSS; it's just an obsolete better-than-TAA solution.
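(For context on the resolutions being thrown around - internal render resolution per DLSS 2.x mode, using the commonly cited, community-documented per-axis scale factors; these are not an official API and may not match every title exactly.)

```python
# Per-axis scale factors commonly cited for DLSS 2.x modes.
MODES = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5, "ultra_performance": 1 / 3}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

for mode in MODES:
    print(mode, render_resolution(3840, 2160, mode))
# quality (2560, 1440), balanced (2227, 1253), performance (1920, 1080),
# ultra_performance (1280, 720)
```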


^I'm actually playing Control at DLSS 960p -> 1440p with max settings, RTX off, on a 2080 Super, and I'm getting around 110 fps on average, and it looks quite good! Medium settings get it right to my 140 fps 'sweet spot'.

Incredible considering 'Control' is a massive system hog.

Edited by ComradeMaster

Check it:

[Two screenshots from Control: native 1440p on top, DLSS 960p -> 1440p below]

The top one is native 1440p, the bottom is DLSS 960p -> 1440p. Both on medium settings, no RTX.

To me, the picture quality doesn't look like a significant difference.  However, notice the huge gains in FPS (top left corner, white numbers) with DLSS, which IS a significant difference.

This is a basic test sample, and would require extensive testing to get an accurate visual comparison, but I think this is sufficient for demonstration purposes.
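(Rough pixel arithmetic behind that FPS jump: DLSS 960p -> 1440p shades only about 44% of the native pixel count per frame.)

```python
render, native = (1707, 960), (2560, 1440)   # roughly 2/3 of 2560x1440 on each axis
ratio = (render[0] * render[1]) / (native[0] * native[1])
print(f"{ratio:.0%} of native pixels shaded per frame")  # ~44%
```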

Edited by ComradeMaster

DLSS 2.1 definitely has uses below 8K. And it's tons better than early DLSS was - it doesn't oversharpen excessively, for example. I might even use it occasionally for 4K down the road, when I start feeling like newer games are dipping too far under 60 fps for my personal settings/liking, etc., before the 40xx series arrives.

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts

I haven't seen any improvements in DLSS 2.1 over 2.0. The features of DLSS 2.1 are ultra performance mode for 8K, VR support, and dynamic resolution. I can't tell the difference between DSR off and on when DLSS 2.1 quality mode is engaged, like in the video. I haven't actually seen anyone showcase the VR support or dynamic resolution in DLSS 2.1.


  • 3 weeks later...

The alleged and never announced 16/20 GB versions of the 3070/80 have equally allegedly been cancelled. Cue lots of wags asking if the 10GB version of the 3080 is still due to be released sometime or not...

The rumour of the cancellation of a rumoured product is mostly interesting because of the other rumour floating around: that nVidia wants to do a TSMC-based refresh of the 3000 series already. Not that the extra memory made much sense anyway, except to increase thermals even more, unless you were a content creator type - in which case nVidia would probably prefer you to buy a 3090 or a pro card anyway.

