Sven_ Posted January 9

5 hours ago, LadyCrimson said: Not news, but I found it hilarious watching him run around manhandling all the displayed GPUs, commenting on physical appearance/designs. Kept thinking "don't drop them!" And ... even as a 2-slot, 50xx gen is ofc huge - but the aftermarket ones even more so. If I were to get a 5090, I'd want the FE if I could still get it when I wanted to buy - wouldn't care re: any OC and not keen on bigger size and bigger price, lol.

You know, in other branches of electronics, devices becoming SMALLER is actually a sign of progress. Not the other way around... Hercules Graphics Card - Wikipedia

Really not sure if "ever more horsepower" and brute-forcing against clear physical limitations is the solution here for all eternity, in general. There's got to be a reason why even cards with the power of 2016 entry-level GPUs can't be sold for anything less than $100 too. That's as if, back in 2006, a 3dfx Voodoo1 had still been on display for actual money -- 2006 was the year of the TES horse armor DLC, Neverwinter Nights 2 and the GeForce 7 series, just in case nobody remembers.

Edited January 9 by Sven_
LadyCrimson Posted January 12

I loved my Voodoo 3 3000. Just saying.

Also, (very) slightly more on topic, Steve's sarcasm/head-shaking at absurd marketing ad terms cracked me up even more than that other video. Quality entertainment.
Also gorgon Posted January 12

So with VAT and regional pricing, a 5090 would run me about 3700 USD. Yeah, no.
Gfted1 Posted January 13

That's how you get free medical and education! I thought you were exaggerating, so I Googled it, and holy moly, MSRP is $1,999: Nvidia announces RTX 5090 for $1,999, 5070 for $549 — plus AI, DLSS 4, and more | Tom's Hardware. Toss in another 21% VAT (?) and things escalate quickly.
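For anyone checking that math, a quick back-of-the-envelope sketch. The 21% VAT is the rate mentioned above; the 50% street markup over MSRP is an assumed placeholder for partner-card/scalper premiums, not a quoted price:

```python
# Rough sketch: how a USD MSRP turns into a regional shelf price.
# VAT_RATE matches the 21% mentioned above; STREET_MARKUP is an
# assumed illustrative premium, not a real quote.

MSRP_USD = 1999        # Nvidia's announced RTX 5090 MSRP
VAT_RATE = 0.21        # e.g. a 21% European VAT
STREET_MARKUP = 0.50   # assumed partner/scalper premium (illustrative)

vat_only = MSRP_USD * (1 + VAT_RATE)
marked_up = MSRP_USD * (1 + STREET_MARKUP) * (1 + VAT_RATE)

print(f"MSRP + 21% VAT:   ${vat_only:,.0f}")    # ~$2,419
print(f"markup + 21% VAT: ${marked_up:,.0f}")   # ~$3,628
```

Which lands in the same ballpark as the ~$3,700 figure above without assuming anything worse than a garden-variety partner markup.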
Also gorgon Posted January 13

1 hour ago, Gfted1 said: That's how you get free medical and education! I thought you were exaggerating, so I Googled it, and holy moly, MSRP is $1,999: Nvidia announces RTX 5090 for $1,999, 5070 for $549 — plus AI, DLSS 4, and more | Tom's Hardware. Toss in another 21% VAT (?) and things escalate quickly.

For the last few years, MSRP has been a fiction though. Those are Founders models; there aren't enough of those to affect the price of the partner models, and those are likely going to be heavily scalped as well.
Also gorgon Posted January 13

I would also say that the 50 series could be just like the 20 series, in that it boasts a lot of fancy features that are yet to be fully realized. Jensen wants us to join the AI revolution, just like he wanted us to join the ray tracing revolution. That didn't turn out to be that fantastic, at least not for a long while. If you want to run LLMs locally, just get a dedicated AI PCIe card.

Edited January 13 by Also gorgon
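On the local-LLM aside: a minimal sketch of what that looks like in practice, using llama-cpp-python. The model path is a placeholder; any GGUF-format model that fits in your card's VRAM works:

```python
# Minimal local-LLM sketch (pip install llama-cpp-python).
# The model file path below is a placeholder, not a real download.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-7b.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=-1,   # offload all layers to the GPU/accelerator if present
    n_ctx=2048,        # context window size
)

out = llm("Q: What does VRAM capacity limit in games?\nA:",
          max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])
```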
LadyCrimson Posted January 13

Something I've noticed is how many netizens now think the 5080 is "middle" with the xx90 being "high." It's like some have forgotten that the xx80/Ti used to be seen as the top of the regular consumer end, and anything above that was seen as "tech pro" or whatever, not worth buying (vs. price) for gaming. In other words, outside of tech specs, even the naming kind of separated them in consumers' minds. "Titan" and whatever else they had. Wasn't the top "Titan" in 2014 $3k or something? >.> Does nvidia still make any sort of consumer/pro "Titan" or other named version, vs. the 50xx label? Naming obfuscation/merging/confusion is a great marketing tool.
majestic Posted January 13

The professional line was called Quadro until the name was dropped with the advent of the RTX cards. Nowadays they have model names like RTX 4000 or RTX 5880. The key difference is ECC memory and access to nVidia's special (and certified) drivers with higher precision. The old Titan cards were simply aimed at enthusiasts who didn't mind shelling out a lot for high-end hardware. That arguably changed, because the RTX 4090/5090 are clearly not just meant as halo products for gaming, but also for those professionals who need them for raw computing power, memory capacity and bandwidth, but don't need ECC memory or increased computational precision. And, well, yes, they also basically shifted their product stack along with the new target, because like I said before, in addition to increasing the gross margins, cutting down the bus size from the xx80 cards downwards makes sure those professionals have no other option than the xx90s.
Gfted1 Posted January 13

5 hours ago, Also gorgon said: For the last few years, MSRP has been a fiction though.

After reading your reply, and rereading my original post, I just want to clear the air by saying that I was agreeing with you. I know MSRP is just window dressing, and you already mentioned regional pricing, but I was SHOCKED that the card starts at $2k! I've built entire computers for less... hell, my first car was less! *shuffles off slowly, old man style*
LadyCrimson Posted January 14

@majestic I vaguely remember hearing the name Quadro before. So basically, Titan/xx90 has always been aimed at mega tech/game nerd/enthusiasts with money to burn, but it was an even smaller fraction of people who bought them. I paid $1400+ for a partner EVGA 2080 Ti six years ago. Before that it was $800 for a 980 Ti. I figure the equivalent in 2025 would be more like $1999, if nvidia even makes an xx80 Ti. And it still might not come with more than 16GB of VRAM, which is what might push people like me towards the xx90. But good lord, I could probably find an old beater car, fix it up a tad, and it'd last me longer. Yeah, Diablo 1 forever sounds better all the time.
majestic Posted January 14

Average video game VRAM usage is determined by console generations more than anything else. Now that the current generation is widely adopted, we can see more and more games needing the extra memory these consoles have (i.e. up to 12GB of RAM for the frame buffer, instead of 8GB or less, outside of the irrelevant Xbox Series S). Right now we have games like Black Myth: Wukong that don't need 8GB of VRAM even at 1440p unless you turn on ray tracing, which kills AMD performance so badly that the 7900 XTX lands behind the RTX 4070 despite having twice the memory and much higher memory bandwidth. And we can already see the next tech shift, with games including ever-increasing RT workloads that sometimes cannot even be turned off - like Avatar: Frontiers of Pandora, an AMD-sponsored title where AMD still gets outclassed by nVidia - workloads that break the limits of what even an RTX 4090 can reasonably handle. So it looks more like even these overpriced nVidia cards will be technologically outdated before games can reasonably be expected to break the 16GB VRAM barrier.

Look at all those RDNA2 cards that were recommended by tech channels because they had a better price-to-performance ratio and a more future-proof amount of memory compared to Ampere. A 6900 XT does 49fps on average with maximum details, upscaling and no ray tracing in Black Myth: Wukong... at 1080p, coming in behind the 3080 10GB. And even with a 40-series card you'll be hard pressed to get decent performance with full ray tracing on, without frame generation, at any resolution above 1080p. Yes, Black Myth: Wukong is an edge case, being sponsored by nVidia and therefore using a lot of nVidia code, but UE5 in general shot the last GPU generation's future-proofing to the moon. The next console generation is a while away, and widespread adoption - to the point where developers can stop keeping the prior generation in mind - even further.

My point being: spending twice as much on an RTX 5090 to have more than 16GB of VRAM because it might last you longer is a risky proposition. You might find yourself needing more than 16GB of memory at some point, yet unable to run the games at your desired resolution anyway. Even more so if that resolution is 4K or above.

Edit: Not that I disagree with the general consensus that it is ridiculous for nVidia not to make 16GB of VRAM their baseline on the 5060 and increase it from there, but going to the 5090 just for the extra memory, for gaming only, is so not going to be worth it. For that it would need to cost less than double a 5080. Much less.

Edited January 14 by majestic
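As a rough illustration of where the VRAM actually goes (and why resolution alone isn't the driver): even a generous set of screen-sized render targets is small next to textures, geometry and ray-tracing acceleration structures. A sketch, with a generic assumed G-buffer layout rather than any specific engine's:

```python
# Back-of-the-envelope: VRAM cost of screen-sized render targets at 4K.
# The G-buffer layout below is a generic assumption, not any real engine.

WIDTH, HEIGHT = 3840, 2160

targets = {                   # bytes per pixel
    "color (RGBA16F)":   8,
    "depth/stencil":     4,
    "normals (RGBA8)":   4,
    "albedo (RGBA8)":    4,
    "material (RGBA8)":  4,
    "motion vectors":    4,
}

total_bytes = sum(bpp * WIDTH * HEIGHT for bpp in targets.values())
print(f"Screen-sized targets at 4K: {total_bytes / 2**20:.0f} MiB")  # ~221 MiB
```

A couple hundred MiB for the 4K targets themselves; the gigabytes that push cards past 8 or 12GB come from assets and RT structures, which is why ray tracing, not resolution, is what blows the budget.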
Zoraptor Posted January 14

7 minutes ago, majestic said: Yes, Black Myth: Wukong is an edge case, being sponsored by nVidia and therefore using a lot of nVidia code, but UE5 in general shot the last GPU generation's future-proofing to the moon.

Just reduce the ludicrous default tessellation settings and you fix AMD performance. (This comment brought to you by the phrase plus ça change)
LadyCrimson Posted January 24

20-50% 4K rasterization uplift vs. the 4090 (depends greatly on title, ofc). GPU thermals decent (~72°C). Memory thermals higher than they'd like to see (near 90°C). Even at this price, the performance increase I'd get might be worth it because of where I'd be upgrading from (2080 Ti), but I'd still like to wait one more generation. Maybe a 6080 will come with 24GB of VRAM and I won't feel like I "need" an xx90. I can dream.
majestic Posted January 26

The RTX 5090's 0.1% lows being higher than the average framerate of the RTX 4090 at 4K is pretty impressive. Probably a result of the massively increased memory bandwidth. The other side of the coin is that the card is "only" 30% faster than the RTX 4090, and that only at 4K. Although it is unlikely that anyone would buy the 5090 for 1080p or 1440p gaming, GN's charts show that even the 4090 scratches the limit of what a 9800X3D can do at those resolutions. That 30% improvement also roughly matches how many more CUDA cores the 5090 has over the 4090. While performance doesn't scale 1:1 with core count, it still shows that the generational improvement is probably not that good, but we'll see once the actual RTX 5080 benchmarks hit (leaked Geekbench results showing a 22% gap notwithstanding). Doesn't really matter either way, as I'm not buying either the 5090 or the 5080.
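A quick sanity check on that scaling point, using the published spec-sheet numbers (16,384 vs. 21,760 CUDA cores; 1,008 vs. 1,792 GB/s) against the ~30% review uplift cited above:

```python
# Spec-sheet scaling check: RTX 4090 vs RTX 5090.
# Core counts and bandwidth are the published figures; the ~30%
# observed 4K uplift is the review number discussed above.

cores_4090, cores_5090 = 16_384, 21_760
bw_4090, bw_5090 = 1008, 1792               # GB/s

core_uplift = cores_5090 / cores_4090 - 1   # ~+33%
bw_uplift = bw_5090 / bw_4090 - 1           # ~+78%
observed = 0.30                             # ~30% average 4K uplift (reviews)

print(f"cores +{core_uplift:.0%}, bandwidth +{bw_uplift:.0%}, perf +{observed:.0%}")
print(f"perf per core vs 4090: {(1 + observed) / (1 + core_uplift):.2f}x")  # ~0.98x
```

Per-core throughput comes out essentially flat, which is exactly the like-for-like point made in the next post.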
Sven_ Posted Sunday at 08:25 PM

I'm gonna wait 'til RTX 5090 performance* can be had for 300 bucks. Which means roughly March 2041... progress, baby!

* without frame generation!

Edited Sunday at 08:26 PM by Sven_
Zoraptor Posted Monday at 05:41 AM

20 hours ago, majestic said: The other side of the coin is that the card is "only" 30% faster than the RTX 4090, and that only at 4K.

It's pretty amazing how consistent that 25-30% is. 25-30% more shaders, 25-30% more die space, 25-30% more wattage, 25-30% more performance. The only thing outside that is (as noted) the +70% theoretical on the memory. That very much looks like a near-zero improvement on most like-for-like bases.
majestic Posted Wednesday at 05:44 PM

The 5080, then: 7-12% improvement. Better efficiency. At least it still has the RTX 4080 Super MSRP. Steve sums it up pretty well when he says: boring.
LadyCrimson Posted Thursday at 11:34 PM

The current spate of videos seems to indicate that the 5080 is a much better overclocker than the previous xx80. I.e., getting an FE 5080 and OCing the heck out of it is how to get a "real" 5080. Well, at least a fair bit better 5080, without the giant extra partner markup.
Zoraptor Posted yesterday at 12:50 AM

Prices for the 5090 start at $5500 here (about 3.5k USD, though that includes 15% GST). To put it in perspective, that's three times what I paid for my entire computer back in 2017, and roughly three times the cost of a near-top-of-the-line 1080 Ti. The 5080 starts at ~2800 NZD, or about 2x a baseline 1080 Ti. Absolute lolz.
LadyCrimson Posted yesterday at 01:45 AM

I think I spent around $2800 on my current rig. Not including the TV later. I don't remember, tho. All I remember is that EVGA FTW price. And no surprise: despite the prices, the 5090 is totally out of (the tiny initial) stock everywhere as far as I can tell, FE and partners both. I can't wait to be in the market for an MSRP $2999 6090 and $1999 6080.
LadyCrimson Posted yesterday at 02:30 AM

lol, supposedly Microcenters had four 5090s per store, nationwide. HAHA
Bartimaeus Posted yesterday at 09:57 AM

Not directed at anyone here: the constant refrain of people saying "I hope AMD/Intel are able to compete with Nvidia because Nvidia is the worst" is truly hilarious when it's always paired with those exact same people refusing to ever buy anything except Nvidia. Neither AMD nor Intel is going to compete at the top end any time soon, so either A. get used to not owning the top-end GPUs, or B. get used to waiting months for the honor of being able to pay thousands of dollars to Nvidia for a GPU that's hardly better than the previous gen. And here come the tariffs...

Edited yesterday at 09:59 AM by Bartimaeus
Also gorgon Posted yesterday at 03:54 PM

The local price comparison site that I use has no 5090s in stock. Since the 'launch', their 4090s are priced almost identically to their non-existent 5090s - about a 20% increase since last month. DeepSeek doesn't even need CUDA.
Also gorgon Posted yesterday at 04:00 PM

I should have 'scalped' as many 4090s as I could afford. I would have made a killing.
LadyCrimson Posted 11 hours ago

I've always thought being hyped for a mega improvement every generation was a little odd, and good marketing. I mean, most of us don't upgrade every 2 years, right? At least not since the majority of games stopped really requiring it. Wait 6+ years and the current gen is going to be a 250%+ uplift from what you have, which isn't terrible (still too expensive, tho). The era of mega gains every two years is probably over.

I suppose it's not much different from smartphone companies trying to convince (rather successfully, it seems) consumers that they must get a new phone every couple of years. Or trying to convince us we need a new TV, fridge, or stove every few years. Assuming current tech is reaching some sort of actual tech/physics plateau, GPUs are basically going to be selling us software, not hardware. So to speak. I think phones are going to run into something similar if they haven't already. How many "4-6 lens"/megapixel camera upgrades or resolution hikes for 7" screens do we need to get people to buy the new version? It's now mostly software/planned obsolescence, etc.
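For what it's worth, the skip-a-few-generations math compounds like this (the per-generation rates are illustrative assumptions, not measurements):

```python
# Compounding generational uplift: what "skip N generations" buys you,
# assuming ~2 years per generation (so 3 generations is roughly 6 years).
# The per-generation gain rates are illustrative assumptions.

for per_gen in (0.30, 0.50, 0.70):
    for gens in (1, 2, 3):
        total = (1 + per_gen) ** gens - 1
        print(f"{per_gen:.0%}/gen, {gens} gen(s): +{total:.0%}")
```

At 30% per generation, three generations compound to only about +120%; hitting a 250%+ uplift over ~6 years needs roughly 50% per generation, well above the ~30% discussed upthread.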