
Posted (edited)
5 hours ago, LadyCrimson said:

Not news, but I found it hilarious watching him run around manhandling all the displayed GPUs, commenting on their physical appearance/designs. Kept thinking "don't drop them!"
And ... even as a 2-slot card, the 50xx gen is ofc huge - but the aftermarket ones even more so. If I were to get a 5090, I'd want the FE if I could still get it when I wanted to buy - wouldn't care re: any OC, and not keen on bigger size and bigger price, lol.

 



You know, in other branches of electronics, devices becoming SMALLER is actually a sign of progress. Not the other way around... 😄

Hercules Graphics Card - Wikipedia

Really not sure if "ever more horsepower" and brute-forcing against clear physical limitations is the solution here for all eternity. There's got to be a reason why even cards with the power of 2016 entry-level GPUs can't be sold for anything less than $100+. That's as if, back in 2006, a 3dfx Voodoo1 were still on display for actual money -- 2006 was the year of the TES Horse Armor DLC, Neverwinter Nights 2 and the GeForce 7 series, just in case nobody remembers.

Edited by Sven_
Posted

I loved my Voodoo 3 3000. Just saying. :lol:

Also, (very) slightly more on topic, Steve's sarcasm/shaking of head at absurd marketing ad terms cracked me up even more than that other video. Quality entertainment.
 

 

“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted
1 hour ago, Gfted1 said:

That's how you get free medical and education! :p  I thought you were exaggerating, so I Googled it, and holy moly, MSRP is $1,999: Nvidia announces RTX 5090 for $1,999, 5070 for $549 — plus AI, DLSS 4, and more | Tom's Hardware. Toss in another 21% VAT (?) and things escalate quickly.

The last few years MSRP has been a fiction, though. Those are Founders Edition prices; there aren't enough of those cards to affect the pricing of the partner models, and those are likely to be heavily scalped as well.
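As a quick sanity check on the VAT math mentioned upthread (assuming a flat 21% rate applied on top of the $1,999 MSRP; actual import fees and regional markups vary):

```python
# Landed-price estimate: $1,999 MSRP with 21% VAT on top.
# The 21% rate is the figure floated in this thread, not a universal one.
msrp = 1999.0
vat_rate = 0.21

total = msrp * (1 + vat_rate)
print(f"MSRP ${msrp:,.0f} + {vat_rate:.0%} VAT = ${total:,.2f}")
# -> MSRP $1,999 + 21% VAT = $2,418.79
```

So before scalping or partner-model markups even enter the picture, the sticker already lands well past $2,400 in a 21%-VAT country.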

Posted (edited)

I would also say the 50 series could be just like the 20 series in that it boasts a lot of fancy features that have yet to be fully realized. Jensen wants us to join the AI revolution, just like he wanted us to join the ray tracing revolution. That didn't turn out to be all that fantastic, at least not for a long while. If you want to run LLMs locally, just get a dedicated AI accelerator card.
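For the local-LLM angle, a rough rule of thumb is that the model weights alone need parameter count × bytes per parameter, before any KV cache or activation overhead. A minimal sketch, with illustrative model sizes (the 7B figure and quantization levels are examples, not a claim about any specific model):

```python
def llm_weight_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Memory needed for model weights alone (no KV cache or activations)."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

# Illustrative: a 7B-parameter model at fp16 vs. 4-bit quantization.
for label, bpp in [("fp16", 2.0), ("int4", 0.5)]:
    print(f"7B @ {label}: ~{llm_weight_vram_gb(7, bpp):.1f} GB")
```

Which is exactly why VRAM capacity, not raster performance, is the binding constraint for running models locally: fp16 weights for a 7B model already eat most of a 16GB card once overhead is added.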

Edited by Also gorgon
Posted

Something I've noticed is how many netizens now think the 5080 is "middle," with the xx90 being "high." It's as if some have forgotten that the xx80/Ti used to be seen as the top of the regular consumer line, and anything above it was seen as "tech pro" territory, not worth buying (versus the price) for gaming. In other words, beyond the tech specs, even the naming kind of separated them in consumers' minds. "Titan" and whatever else they had. Wasn't the top "Titan" in 2014 $3k or something? >.>  Does Nvidia still make any sort of consumer/pro "Titan" or other named version, versus the 50xx-labeled GPUs?

Naming obfuscation/merging/confusion is a great marketing tool.

 

Posted

The professional line was called Quadro until the name was dropped with the advent of the RTX cards. Nowadays they have model names like RTX 4000 or RTX 5880. The key differences are ECC memory and access to nVidia's special (and certified) drivers with higher precision. The old Titan cards were simply aimed at enthusiasts who didn't mind shelling out a lot for high-end hardware. That has arguably changed, because the RTX 4090/5090 are clearly not meant just as halo products for gaming, but also for those professionals who need them for raw computing power, memory capacity and bandwidth, but don't need ECC memory or increased computational precision.

And, well, yes, they also basically shifted their product stack along with the new target market, because, like I said before, in addition to increasing the gross margins, cutting down the bus size from the xx80 cards downwards makes sure those professionals have no option other than the xx90s.

No mind to think. No will to break. No voice to cry suffering.

Posted
5 hours ago, Also gorgon said:

The last few years MSRP has been a fiction though.

After reading your reply, and rereading my original post, I just want to clear the air by saying that I was agreeing with you. I know MSRP is just window dressing, and you already mentioned regional pricing, but I was SHOCKED that the card starts at $2k! I've built entire computers for less... hell, my first car cost less! *shuffles off slowly, old-man style*

Posted

@majestic I vaguely remember hearing the name Quadro before. So basically, the Titan/xx90 has always been aimed at mega tech/game nerd/enthusiast types with money to burn, but it was an even smaller fraction of people who actually bought them.

I paid $1400+ for a partner EVGA 2080 Ti six years ago. Before that it was $800 for a 980 Ti. I figure such a card in 2025 would be more like $1999, if Nvidia even makes an xx80 Ti. And it still might not come with more than 16GB of VRAM. Which is what might push people like me towards the xx90. But good lord, I could probably find an old beater car, fix it up a tad, and it'd last me longer. Yeah, Diablo 1 forever sounds better all the time.

Posted (edited)

Average video game VRAM usage is determined by console generations more than anything else. Now that the current generation is widely adopted, we can see more and more games needing the extra memory these consoles have (i.e., up to 12GB of RAM for the frame buffer instead of 8GB or less, leaving aside the irrelevant Xbox Series S).

Right now we have games like Black Myth: Wukong that don't need 8GB of VRAM even at 1440p unless you turn on ray tracing, which tanks AMD performance so badly that the 7900 XTX lands behind the RTX 4070 despite having twice the memory and much higher memory bandwidth. We can also already see the next tech shift: games with ever-increasing RT workloads that sometimes cannot even be turned off, like Avatar: Frontiers of Pandora (an AMD-sponsored title where AMD still gets outclassed by nVidia), workloads that break the limits of what even an RTX 4090 can reasonably handle. It looks more like even these overpriced nVidia cards will be technologically outdated before games can reasonably be expected to break the 16GB VRAM barrier.

Look at all those RDNA2 cards that tech channels recommended because they had a better price-to-performance ratio and a more future-proof amount of memory compared to Ampere. A 6900 XT does 49fps on average with maximum details and upscaling, without ray tracing, in Black Myth: Wukong... at 1080p, coming in behind the 3080 10GB. And even with a 40-series card you'll be hard pressed to get decent performance with full ray tracing on, without frame generation, at any resolution above 1080p.

Yes, Black Myth: Wukong is an edge case due to being sponsored by nVidia and therefore using a lot of nVidia code, but UE5 in general shot the future-proofing of the last GPU generation to the moon.

The next console generation is a while away, and widespread adoption, to the point where developers can stop keeping the prior generation in mind, is even further off. My point being that spending twice as much on an RTX 5090 to have more than 16GB of VRAM because it might last you longer is a risky proposition. You might find yourself needing more than 16GB of memory at some point, yet unable to run the games at your desired resolution anyway. Even more so if that resolution is 4k or higher.

Edit: Not that I disagree with the general consensus that it is ridiculous for nVidia not to make 16GB of VRAM the baseline on the 5060 and increase it from there, but going to the 5090 just for the extra memory, for gaming alone, is not going to be worth it. For that it would need to cost less than double a 5080. Much less.
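To put numbers on "less than double a 5080": taking the announced $1,999 for the 5090 (32GB) and the widely reported $999 MSRP for the 5080 (16GB) (treat the 5080 figure as an assumption), the 5090 is slightly more than double the price for exactly double the memory:

```python
# Announced 5090 MSRP vs. reported/assumed $999 5080 MSRP.
cards = {"RTX 5090": (1999, 32), "RTX 5080": (999, 16)}

for name, (price, vram_gb) in cards.items():
    print(f"{name}: ${price} for {vram_gb} GB -> ${price / vram_gb:.2f}/GB")

ratio = 1999 / 999
print(f"5090 costs {ratio:.2f}x a 5080")  # just over double the price
```

The per-gigabyte cost comes out nearly identical on both cards, which is exactly the point: you pay the full double for the doubled memory, with no discount for anything else.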

Edited by majestic


Posted
7 minutes ago, majestic said:

Yes, Black Myth: Wukong is an edge case due to being sponsored by nVidia and therefore using a lot of nVidia code, but UE5 in general shot the future-proofing of the last GPU generation to the moon.

Just reduce the ludicrous default tessellation settings and you fix AMD performance.

(This comment brought to you by the phrase plus ça change)
