
Nvidia RTX Series



DisplayPort 1.4 rather than 2.0?

Guess it's kind of a PCIe 4 vs. 5 situation, with the new spec being kind of pointless for now, but...

14 hours ago, LadyCrimson said:

Hasn't Nvidia been branching out from (retail) consumer home PC GPU stuff for a while now, so much that the other stuff is taking up more of their attention/care in terms of massive growth? The data center, SoC, AI, or whatever else (I just occasionally see things mentioned in articles, not an Nvidia expert). Not saying graphics aren't still their big thing, I just mean re: retail consumer cards it feels like they try to do just enough to be "relevant"/ahead/in the news - so they can always be #1 or at least #2 - and not much else.

They've actually made a lot of noise about being a 'software company' now and are targeting $40bn of new revenue from that*. They already make huge margins per card, and a lot of money in absolute terms, in the server/AI/pro sector, and have for a long time, albeit those cards are also somewhat more expensive to make.

Annualised licensing is not really the sort of thing they can apply to gamers though, since many don't even like the relatively mild gatekeeping on the drivers; corporations and the like pretty much expect it though. The reactions if they tried charging gamers for driver updates would certainly be, uh, interesting to say the least.

*Probably wishful thinking with the crypto boom ending.

Edited by Zoraptor

Is DisplayPort 2.0 even out yet? I feel like it's been pushed back 3 or 4 times.

But yeah, I don't really see a scenario where DisplayPort 2.0 would even matter right now. I can't imagine DisplayPort 1.4a or HDMI 2.1 not being enough, even for a rig at the level of @Bokishi's.

Edited by Keyrock

Dunno, I saw a lot of people who thought Lovelace would definitely have DP 2.0, if only to beat AMD to it*. Suppose they would have had to redesign the I/O to do it, so they didn't. And while I don't imagine too many people will be running 4K/240Hz any time soon, the capability does make for a nice marketing claim.

*e.g. "If AMD is adding DP 2.0 then indeed, Nvidia's Ada Lovelace will have this feature too."
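
For the curious, the 4K/240Hz question is really just bandwidth arithmetic. A rough back-of-the-envelope sketch in Python, using the public spec-sheet link rates (blanking overhead is ignored, so real requirements run a bit higher):

```python
# Back-of-the-envelope: can DP 1.4a / DP 2.0 carry 4K at 240 Hz uncompressed?
# Payload rates are the spec-sheet link rates after line encoding; blanking
# overhead is ignored, so real requirements are somewhat higher.

def video_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel data rate in Gbit/s for an uncompressed video mode."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP_14A = 32.4 * 8 / 10     # HBR3 x4 lanes, 8b/10b encoding  -> ~25.9 Gbit/s
DP_20  = 80.0 * 128 / 132  # UHBR20 x4 lanes, 128b/132b      -> ~77.6 Gbit/s

needed = video_gbps(3840, 2160, 240, 30)  # 4K/240Hz, 10-bit colour
print(f"4K/240Hz 10-bit needs ~{needed:.1f} Gbit/s uncompressed")
print(f"DP 1.4a payload: ~{DP_14A:.1f} Gbit/s -> only works with DSC")
print(f"DP 2.0 payload:  ~{DP_20:.1f} Gbit/s -> fits uncompressed")
```

In other words, 1.4a can still drive 4K/240 today, just via DSC compression, which is presumably why nobody at Nvidia lost much sleep over skipping DP 2.0.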

 


7 minutes ago, Zoraptor said:

Dunno, I saw a lot of people who thought Lovelace would definitely have DP 2.0, if only to beat AMD to it*. 

Pornhub already beat both Nvidia and AMD to DP 2.0. :shifty:

I'll show myself out.


8 hours ago, LadyCrimson said:

 

But yeah...I'm not sure how much I want to deal with all the new game-desktop stuff anymore. It just doesn't feel worth it. Not just the money, but all the other considerations too. 

I feel the same. Therefore, I decided to apply the Icarus philosophy: aim too low (consoles, APUs, etc.) and the water will clip your wings; aim too high (4090s, 13900Ks) and the sun will burn the fun out of it. Which is precisely what happened the last time I went full bore and got a 3090 - the games weren't even 'fun' at that point, they just felt like glorified tech demos. Now with this 40** and DLSS 3 crap... yuck. Medium(ish, give or take) settings at 80-100fps and 1080p works for me.

It is kind of annoying how developers keep pushing the latest graphics features on previously released games, though. Capcom, for instance, just released a mandatory DirectX 12 patch for every Resident Evil game released since 7, which absolutely destroyed performance relative to the old DirectX 11 builds, because they literally had to reprogram the games for it. I think there might be a way to reverse it, but still, that's a little heavy-handed with pushing new features forward.

Edited by ComradeYellow

ROFL. This was worth the watch just to see the guy lose it, mirth-wise, at the silly names and marketing.
If things keep going this way, I foresee desktop GPUs ending up housed in their own separate "GPU mobo"/side-case that plugs into the main PC case/mobo. >.>
 

 


6 hours ago, LadyCrimson said:

If things keep going this way, I foresee desktop GPUs ending up housed in their own separate "GPU mobo"/side-case that plugs into the main PC case/mobo. >.>

Just skip ahead and mandate that all new house builds have their own air-conditioned, sound-proofed "server room" to house the PC separately from where the user actually sits.

  

On 9/22/2022 at 2:02 AM, Bokishi said:

Yeah I've known about these 4090 designs since July

So it's official then, Bokishi knows more about the 4000 series than EVGA does.

Edited by Humanoid

So, anyone up for a MIDNIGHT KALEIDOSCOPE for the ultimate DARK POWER? Dunno, I used to be 201% COMMITTED to replacing my ZOTAC RTX 3060 with a 4070, but now I'm just BRUTALLY overwhelmed and probably need an ANTI-GRAVITY DARK OBELISK with some BIONIC SHARK FINS. Sorry, FANS.

Seriously, what?

Feel a bit like Palpatine when his luggage went missing.

 

Edited by majestic

Back in the early days of the GPU we had fairly sedate names and coolers but absolutely wild box art. Now that's been flipped around so much that for the RTX 3000 series you can't even read what card it is unless you squint - nVidia's standardised box design means the text that says "3070" or whatever is in white text on lime green, in tiny font, at the bottom corner of the box. It's insane.


17 hours ago, Humanoid said:

Just skip ahead and mandate that all new house builds have their own air-conditioned, sound-proofed "server room" to house the PC separately from where the user actually sits.

Actually, that idea has merit. This house has a lot of hall closets. I could tear out the shelving and turn the one at the end into a "server room": make a vent/fan to the roof in it, stick a shallow A/C unit on the wall, run the network and monitor wires through holes in the wall, and put a USB hub on the wall near the floor (or something).

...hahaha. Yeah, no.


You'd be getting issues with the computer <-> monitor data connection, though. If the HDMI cable gets too long, you'll introduce annoying latency. I speak from experience. ;(

@Keyrock says he can't see how DisplayPort 2.0 would even matter right now - that's probably one of those situations where it does. At least, I would love to be able to pump more data, faster, from the PC to the TV in my living room. That would be awesome. (I am currently running a super thick, 8m, gold-plated HDMI cable, and while some games work fine at 1080p, other games get noticeable input lag ... and playing at 4K resolution is simply impossible with anything.)
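
For what it's worth, the 4K failure at least comes down to simple bandwidth math. A quick sketch (assuming standard CTA timings and 8-bit RGB; the caps are the usual spec-sheet TMDS limits, and a marginal 8m cable will give up well before reaching them):

```python
# Why 1080p can survive a marginal HDMI run while 4K won't: on-the-wire
# TMDS data rate (blanking included, 8-bit RGB) vs. spec-sheet limits.

MODES = {  # total pixels per frame including blanking, refresh rate (Hz)
    "1080p60": (2200 * 1125, 60),  # 148.5 MHz pixel clock
    "4K60":    (4400 * 2250, 60),  # 594 MHz pixel clock
}

HDMI_TMDS_GBPS = {"HDMI 1.4": 10.2, "HDMI 2.0": 18.0}

for mode, (pixels_per_frame, hz) in MODES.items():
    # 3 TMDS lanes x 10 bits per pixel clock (8b/10b encoding)
    gbps = pixels_per_frame * hz * 30 / 1e9
    fits = [n for n, cap in HDMI_TMDS_GBPS.items() if gbps <= cap]
    print(f"{mode}: ~{gbps:.2f} Gbit/s on the wire -> ok on: {fits or 'none'}")
```

So a cable/link that is perfectly happy at 1080p (~4.5 Gbit/s) can have literally zero headroom left once you ask it for 4K (~17.8 Gbit/s).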

That said, I would ****ing love a separate room just for the PC hardware. It could solve many issues. If only it didn't cause other issues in return...

Edited by Lexx

  • 3 weeks later...
2 hours ago, Sarex said:

Great performance gains from the 40xx series, or at least from the 4090.

Indeed. Obligatory Gamers Nexus post:

 

Huh, dunno, maybe an upgrade later to a smaller version is back on the table, if the gains are similar.

Edit: had a bit of a chuckle about the temperature being stable and pretty good on the FE's cooling setup, and that's without DARK POWER SHARK FIN fans. :p

Edited by majestic

Yeah, as a 4K-er, those review benchmarks make me drool. Still no interest unless my current PC just ups and dies or something, tho, for myself.

Will be very curious re: a 50xx/equivalent series/period tho, re: whether they manage to get the power use/transient spiking under control (or if it'll only get worse...). I would hate to get a 40xx and then have the very next generation suddenly be much better in that regard. Heh.

 


Apparently, at least at the moment, because the FE does so well itself, there doesn't seem to be much reason to buy a partner version of the 4090. Especially at any inflated price point. Plus, the partner versions are often physically even bigger. Although I'm sure there will be some who think the very minor benefits are still worth coveting, as always.

EVGA seems to have made a good decision. 😛


At this rate, in a generation or two they're going to stop making computer chassis and you'll have dedicated computer rooms. Your GPU will come on a truck by itself, and you'll need a pallet jack or dolly to wheel it into the computer room to connect it to the motherboard. The mobo will still be the same size as it is today, by the way, but the GPU will be 6' x 2' x 8" and weigh 230 lbs. :shifty:


It looked like something out of a techy Onion-like article, but nVidia really are withdrawing the crippled 12GB 4080 from their lineup, wow. Definitely the right thing to do, but it'd take a lot to make Jensen swallow his pride like that. It'll obviously return as a 4070-series card of some sort, but who knows when.

  

1 hour ago, LadyCrimson said:

Apparently, at least at the moment, because the FE does so well itself, there doesn't seem to be much reason to buy a partner version of the 4090.

Many countries, including Australia, have zero allocation of FEs. Hell, I think we had about a dozen 3080 FEs, which were exclusive to one retailer who sold them by raffle. Essentially, they only existed to prove that the $1,139 AUD MSRP was totally legit and not false advertising.

Edited by Humanoid
