
Posted (edited)

^ I should note that the whole reason I went to higher-end GPUs was longevity re: higher game settings.
If you buy high-end Nvidia, it's typical that a couple of generations later, the "lower end" ends up being kinda like what used to be the higher end 2-3 gens ago.

But I really don't want to give Nvidia $2k+ for a stupid GPU. $1400 for the 2080 Ti 5 years ago was bad enough, even tho I've liked that GPU/it's served me very well (I've only played 4K since I bought it).

Triple edit: it's not at all about affordability for me. It's consumer principle at this point + less gaming interest. I just want to play on a giant 4K screen. >.>

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted

Games have different failure modes when they run out of VRAM. Some at least gracefully dial back texture quality dynamically, which sometimes might not even be noticeable. But others aren't nearly as well prepared for it and become a stutter-fest.
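To illustrate the "graceful" path, here's a minimal sketch in Python; the function, names, and thresholds are all hypothetical illustration, not any specific engine's API:

```python
# Sketch of the graceful failure mode: an engine that drops texture mip
# levels as VRAM headroom shrinks, instead of paging mid-frame and stuttering.
# All names and thresholds here are hypothetical.

def pick_texture_lod(vram_budget_mb: int, vram_used_mb: int) -> int:
    """Return a mip bias: 0 = full-res textures, higher = lower quality."""
    headroom = vram_budget_mb - vram_used_mb
    if headroom > 1024:   # plenty of room: full quality
        return 0
    if headroom > 256:    # getting tight: drop one mip (roughly 1/4 the memory)
        return 1
    return 2              # nearly out: drop two mips rather than page over PCIe

print(pick_texture_lod(vram_budget_mb=12288, vram_used_mb=11900))  # -> 1
```

A game that skips this kind of check ends up streaming textures over PCIe mid-frame, which is exactly the stutter-fest scenario.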

Funny thing about the 4070 Ti is that I think it's a card they're struggling to sell. It's one of the two older models in the family, the other being the original 4070 GDDR6X version. The Super variants were released later, and the 4070 Super ended up being 5-10% slower than the Ti, but with the same VRAM and for around 20% less money. Mind you, it's the same again with the Ti Super: 5-10% faster than the Ti for 20% more money, though at least you also get 4GB of extra VRAM this time, unlike going from the Super to the Ti.

At any rate, I don't think it's a card you should fear missing out on, mainly because it's hardly a paragon of value in the first place. The 5070 will have it covered easily, and even with AMD abandoning the high-end, the 7900 XT is already faster than it for the same price, so you can infer that their replacement for that card will also be a suitable alternative.

 

(Note that prices are based on what I've seen locally, so the pricing gaps might not be representative. The Super is around $900 AUD here; add $200 AUD for each successive step up.)
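To put rough numbers on the value point (a sketch only: the 7.5% figure is just the midpoint of the 5-10% range above, and the prices are the local AUD ones from the note):

```python
# Rough perf-per-dollar arithmetic from the figures above: ~7.5% performance
# per step (midpoint of 5-10%) and $200 AUD per step from a $900 baseline.
cards = {
    "4070 Super":    {"perf": 1.000,         "price": 900},
    "4070 Ti":       {"perf": 1.075,         "price": 1100},
    "4070 Ti Super": {"perf": 1.075 * 1.075, "price": 1300},
}

for name, c in cards.items():
    # Normalise so the Super's perf-per-dollar reads as 1.000
    print(f"{name:14s} relative perf/$: {c['perf'] / c['price'] * 900:.3f}")
# 4070 Super     relative perf/$: 1.000
# 4070 Ti        relative perf/$: 0.880
# 4070 Ti Super  relative perf/$: 0.800
```

Each step up buys less performance per dollar, which is the "hardly a paragon of value" point.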

L I E S T R O N G
L I V E W R O N G

Posted
17 hours ago, Humanoid said:

The same thing has already happened multiple times in recent memory, so it's nothing new really. Is it really any different to just 5 years ago with the 5700 XT, a decade ago with the RX 480, and going further back, the 3870? Doesn't feel like it. The mainstream gamer will be fine and well catered for by whichever products they end up releasing.

They try to create a halo product that matches nVidia's best when they can, but if they're way behind in a particular generation, they don't bother. Yeah, it hurts the Radeon brand in terms of it increasingly being viewed as just a value product, but it's not like it has much cachet right now even with the existence of the 7900 XTX.

It's their timing and pricing that ****ed them over more than anything. They barely undercut Nvidia's cards and had fewer features, so why not spend the extra $100 and get good ray tracing performance, DLSS and the other features? And they were nowhere to be seen when Nvidia released their 30-series cards; since there was nothing to go on, I bought a $600 3080 and waited several months for delivery, and shortly before I took delivery of my card AMD released their products.

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken

Posted

The spec difference between the 5090 and 5080 is bizarre. kopite7kimi is usually right though, so I guess that is what we're getting.

No mind to think. No will to break. No voice to cry suffering.

Posted
13 hours ago, Bokishi said:

Since the 4090 and 4080 were $1,600 and $1,200 at launch (I think), I'd guess $1,999 and $1,499 this time.

8 hours ago, majestic said:

The spec difference between the 5090 and 5080 is bizarre. kopite7kimi is usually right though, so I guess that is what we're getting.

Yeah... they can't even give the 5080 20GB of VRAM. I suppose they'll give that to the 5080 Ti or 5080 Ti Super or whatever they're planning on squeezing in between as usual.
That's what I mean by gimping everything but the xx90s, while not making the xx90s twice the price, thus driving people to buy the xx90 "for a little more."

I figure at some point Nvidia will only put out xx90s at $4k for the elites, and everyone else will buy mid-tier Intel or AMD for $800-$1200. Pfft. 😛

“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted

Lower and mid tier GPUs might as well stop existing in the (not so far off) future as integrated graphics become more and more powerful. If you stick to 1080p at slightly lower settings, you can already make do with Lunar Lake or Strix Point, and those are laptop CPUs.

Well, unless Intel walks back their Arc support and kills the rest of the team in the wake of their layoffs.

No mind to think. No will to break. No voice to cry suffering.

Posted (edited)


 

2 hours ago, majestic said:

Lower and mid tier GPUs might as well stop existing in the (not so far off) future as integrated graphics become more and more powerful. If you stick to 1080p at slightly lower settings, you can already make do with Lunar Lake or Strix Point, and those are laptop CPUs.

 

Yeah, they're naturally priced accordingly. On the desktop front, things have been the same for a decade now, give or take, which is: the fastest desktop APU has about the rendering power of entry-level gaming GPUs from ~5 years prior. Currently the Ryzen 8700G roughly matches a 2019 GTX 1650. APUs have their place, but they're niche. For a budget build you're far better off with a Ryzen 5500 + RX 6600 -- which combined cost about the same as the 8700G, whilst offering multiple times the gaming power thanks to the dedicated GPU.

 

Prior to the market exploding, I'd have changed my GPU pretty regularly. Now I'm probably gonna sit on my 1050 Ti until it rots. 😄 The only games I'd currently upgrade for are Stalker 2 + KCD II (and MAYBE Avowed) anyway. For everything else, it's mostly still... ok (not playing many gfx blockbusters anyway).

However, I'm not gonna buy an 8GB VRAM GPU in 2025 -- the RX 480 already had 8GB in 2016 (and the GTX 1060 had 6GB). A decade on, those aren't gonna last much longer even in Full HD. The only exception would be the smaller Intel Battlemage budget model, depending on which one. But an RTX 5060 / RX 8600 with 8GB? Go **** yerselves. 😄 I'd rather spend big and build an altogether new proper gaming PC than do that -- something I haven't done in more than 20 years. PC gaming as such was pretty inexpensive for a very long time -- even a GeForce4 Ti I could get in 2003, a year after launch, for under 100 bucks, less than half of what it launched at ~15 months earlier.

Edited by Sven_
Posted
21 hours ago, Bokishi said:

$2,999 I'm guessing, maybe something like that for Nvidia's own cards on release.

16GB of memory is really on the low end; that is really depressing.

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken

Posted
6 hours ago, Azdeus said:

$2,999 I'm guessing, maybe something like that for Nvidia's own cards on release.

16GB of memory is really on the low end; that is really depressing.

OK, and how much do you think the 5090 will go for?

12 hours ago, majestic said:

Well, unless Intel walks back their Arc support and kills the rest of the team in the wake of their layoffs.

Dunno, kind of feel Intel has to persevere with the GPU division. Not for the GPUs themselves though.

Posted (edited)
22 hours ago, Zoraptor said:

OK, and how much do you think the 5090 will go for?

Dunno, kind of feel Intel has to persevere with the GPU division. Not for the GPUs themselves though.

I'm thinking 4K, or thereabouts, you know, $1 per K of resolution 😄

Edit: To be fair though, with those specs, that "5080" ought to be called a 5070. They're doing it again, just not being as stupid about it.

Edited by Azdeus

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken

  • 3 months later...
Posted (edited)

The current "leaks and rumors" have the 5090 being over 2k and the 5080 being not quite half that. So, say, $2200 and $1300. With the theory on such rumors being they learned their "lesson" and want the price difference to = more people buying 5080 vs. "it's only $300 more for the xx90/all that vram, so why not."   And the 5080Super/ti/whatever maybe would later have 24 for $400 more than 5080 and be the "true" 5080 etc.  Edit: ofc it's all speculation/leaks. Nvidia could change their minds at last minute etc.

I am not yet convinced twice the VRAM would be worth almost twice the price (if that is indeed what happens). I mean, in my case a jump from the 2080 Ti to a 5080 would still, I assume, be quite a leap, but man - does Nvidia think VRAM is made of gold? Could game devs just stop racing ahead in the direction of requiring so much VRAM just to run a game, even without full PT? Thanks.

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted

Nvidia has to charge the prices that it charges...because they keep selling out at those prices. Usually the only way to even get your hands on new and actually good computer hardware within a reasonable timeframe (i.e. within the first month or two) these days is by going in person to a Microcenter, because all the normal online options are usually sold out and scalped out the wazoo. The 9800X3D is still not available even two months after release, and who knows when that will change?


How I have existed fills me with horror. For I have failed in everything - spelling, arithmetic, riding, tennis, golf; dancing, singing, acting; wife, mistress, whore, friend. Even cooking. And I do not excuse myself with the usual escape of 'not trying'. I tried with all my heart.

In my dreams, I am not crippled. In my dreams, I dance.

Posted
56 minutes ago, LadyCrimson said:

I am not yet convinced twice the VRAM would be worth almost twice the price (if that is indeed what happens).

It is, sort of, but not for gamers. There is a group of people who need these cards not for gaming or mining, but for their computing power combined with memory bandwidth. That made the RTX 3080 and RTX 3080 Ti a very attractive alternative to nVidia's professional cards: there are plenty of use cases where you do not need certified drivers that calculate values more accurately than what is good enough for gaming. nVidia decided that professional users who want to forgo their even more expensive professional cards should at least pay flagship-model prices.

So they made the RTX 4080 with a smaller 256-bit memory bus than the RTX 3080's 320 bits (and 384 bits for the Ti variant), and increased its L2 cache to compensate. Thus the card was actually slower in some workstation applications than its direct predecessor, while still being faster for gaming. It's a win-win: you force professionals onto your halo products, and the lower-tier cards stay available for purchase because nobody but gamers is gobbling them up.
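The arithmetic behind that (a sketch; the bus widths are from the post, while the GDDR6X data rates are the commonly cited figures for these cards and should be treated as approximate):

```python
# Raw memory bandwidth = (bus width in bits / 8) * effective data rate in Gbps.
# Bus widths per the post; Gbps figures are the commonly cited GDDR6X speeds.

def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps  # bytes per transfer * transfer rate

print(f"RTX 3080    (320-bit @ 19.0 Gbps): {bandwidth_gb_s(320, 19.0):6.1f} GB/s")  # 760.0
print(f"RTX 3080 Ti (384-bit @ 19.0 Gbps): {bandwidth_gb_s(384, 19.0):6.1f} GB/s")  # 912.0
print(f"RTX 4080    (256-bit @ 22.4 Gbps): {bandwidth_gb_s(256, 22.4):6.1f} GB/s")  # 716.8
```

The big L2 cache hides that deficit for game-sized working sets, but large compute workloads stream far more data than the cache can absorb, hence the workstation regression.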

22 minutes ago, Bartimaeus said:

Nvidia has to charge the prices that it charges...because they keep selling out at those prices.

Indeed. nVidia can charge whatever they want for their halo models because they're only interesting to a group of people who will pay the money regardless. Those 5090s would sell like hotcakes even if they charged $2,999 for them, because that would still be cheaper than dedicated AI processors, and they're actually available. There's a reason the US government banned the export of the regular 4090 to China, and it's not because they want gamers to have GPUs. :p

No mind to think. No will to break. No voice to cry suffering.

Posted

Oh, yeah, I know there are non-gaming upsides/needs. And in the past Nvidia's "top" card was meant for professional use vs. consumer use (Titan, whatever other names), and they seem to want to return to that mindset/pricing, perhaps - it's just become out of whack in that regard in gamers' minds - and I am, of course, looking at all of it from the gaming perspective. 😛

The main irritation for many lately mostly seems to be the VRAM issue tho. I would be fine with much less "powerful" performance in other areas of the 5080, but AA/AAA devs seem to have decided that graphics demanding huge VRAM numbers, at least as available headroom, should be a thing for gaming. And it's probably only going to get worse (even discounting full RT). So mid- to highish-end gamers want, and think they should get, more VRAM at minimum, even if it isn't the uber-fastest VRAM or the highest bandwidth or whatever other tech specifics. Is it the devs' fault or the GPU makers' fault? Chicken, egg. There will always be gamers who buy every GPU generation because they have nothing better to do, just like some people buy/lease a constant stream of new cars, but most of us - even most of us with plenty of cash to fling around on an optional hobby - want a bit more justification.

If they just stuck 20GB of VRAM on the 5080 I probably wouldn't complain. Just me tho.

“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
  • 2 weeks later...
Posted (edited)

Nvidia: "5070 is as good as a 4090!" .... "with the new DLSS 4/super-AI frame-gen4x all on vs. the old 40xx versions of such."    :getlost:
From what I can tell (at this point/no actual benchmarks) the best thing about 50xx will be a big leap up for DLSS quality, and that the 5080 is a 2-slotted card instead of 3. eg, smaller. Wonder how that'll do re: gpu-cooling.
DLSS 4 (not frame gen aspect) will also supposedly be ported backward to even 20xx, although ofc gains from it on older gpu's won't be as impressive.

The MSRP pricing leaves plenty of room for a 5080ti or some such later - if it has 20+ vram I'd wait/buy that. Although 3rd party versions I'm sure will be 25% more than MSRP. But it all still makes me hope I can wait for 60xx. If I can fix my reboot issue. Pfft.

Digital Foundry did get to use test-demo period of the 5080 for some hours or something to try some settings/record a little video/performance comparison graphs (not serious benchmarks, mind), but that's where the possible DLSS/frame gen uplift - less artifacts/smear/light angle/distance jitter/blur etc - info is from. And release versions could be different from what they got to look at.

 

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted

Remember back when a new graphics card was about getting better quality graphics? Pepperidge Farm remembers.

So goddamn tired of the upscaling bull****. It's smeary, it artifacts, and quite frankly it should be considered false advertising. I didn't get a better display to run **** in upscaled 1080p :down:

The VRAM on the new cards coming is probably going to shorten their lifespan as well. I should've bought a goddamn 7900 XTX...

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken

Posted

Frame generation is hilarious to me: you get fake frames that make the visuals seem smoother but actually introduce input lag, making the game noticeably feel and perform worse despite the artificial smoothness. It's brilliant, except that it's total garbage, but hey, at least a bunch of dummies get fooled by it and think their games are better with it than without.
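The latency mechanics are simple enough to sketch (illustrative arithmetic only, not measurements of any particular implementation):

```python
# Why interpolated frames add input lag: to interpolate between two real
# frames, the newest real frame must be held back until the generated frame
# has been displayed, so what you see runs at least ~one native frame-time
# behind your inputs. Illustrative numbers, not measurements.

native_fps = 60
native_frame_ms = 1000 / native_fps      # 16.7 ms per real frame

displayed_fps = native_fps * 2           # 2x frame generation: 120 fps shown
min_added_lag_ms = native_frame_ms       # at least one real frame held back

print(f"shown: {displayed_fps} fps, input still sampled at {native_fps} fps, "
      f"extra lag >= {min_added_lag_ms:.1f} ms")
```

Smoothness goes up while responsiveness stays put or gets worse, which is exactly the disconnect described above.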

Perhaps even more comical is that as 3D graphics achieve ever higher visual fidelity, it becomes easier and easier to notice that animations simply aren't able to keep up. Nothing like pumping all the pixels and ray tracing in the world into models, textures, and lighting, only to be immediately let down by forever-wonky animation work that makes the whole effort seem hardly worth the bother.


How I have existed fills me with horror. For I have failed in everything - spelling, arithmetic, riding, tennis, golf; dancing, singing, acting; wife, mistress, whore, friend. Even cooking. And I do not excuse myself with the usual escape of 'not trying'. I tried with all my heart.

In my dreams, I am not crippled. In my dreams, I dance.

Posted (edited)

Using DLSS and related tech for benchmark claims is also another way to game benchmarks: 2x the performance "due to DLSS" can also be achieved simply by changing how the DLSS quality categories are defined.
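How that works in practice (a sketch; the scale factors below are the commonly documented per-axis defaults, and exact values can vary per title):

```python
# Each DLSS preset renders internally at a fraction of the output resolution,
# so redefining a preset's scale factor quietly changes the pixel count behind
# a "performance" headline. Commonly documented per-axis factors shown here.

presets = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}
out_w, out_h = 3840, 2160  # 4K output

for name, s in presets.items():
    w, h = int(out_w * s), int(out_h * s)
    ratio = (out_w * out_h) / (w * h)
    print(f"{name:12s} renders {w}x{h}  ({ratio:.2f}x fewer pixels than native)")
```

Shift "Quality" a little closer to "Balanced" and the same marketing slide shows a bigger multiplier without the hardware doing anything new.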

Less VRAM is great for built-in obsolescence though, and lets Jensen add extensively to his spatula collection.

Plus, while they may have more VRAM, a 7000-series Radeon just plain won't be able to run the next version of FSR, so it's not like it's all roses on team red either.

Edited by Zoraptor
Posted (edited)
1 hour ago, Zoraptor said:

Jensen add extensively to his ~~spatula~~ ugly shiny jacket collection.

Fixed that for you.
Honestly, I think Jensen needs to hire an announcement spokesperson, because he is absolutely terrible at it. When he's on stage it feels like he doesn't want to be there, and his comments only end up as negative meme material.

I don't mind a bit of DLSS, for 4K at least, so less artifacting with that is fine, but frame generation is LOL, and using any upscaling as a benchmark brag is ROFL insulting.
I think the true problem is the combination of GPU diminishing returns and no radically new tech (whether because it's not invented yet or because of, y'know, engineering physics), while still needing something to convince people to buy a new GPU every 2-4 years. It's like adding touch screens and Netflix screens and cute music jingles to home appliances at this point.

Edit: you can't keep moving upwards at a lightning pace forever, and both companies/developers and consumers perhaps need to stop expecting/selling it.

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted

If I can spot the 5070 Ti for under $1k, I may very well get it. I am currently a sad potato playing Pacific Drive on a 970.

"because they filled mommy with enough mythic power to become a demi-god" - KP

Posted (edited)
11 hours ago, Azdeus said:

Remember back when a new graphics card was about getting better quality graphics? Pepperidge Farm remembers.

So goddamn tired of the upscaling bull****. It's smeary, it artifacts, and quite frankly it should be considered false advertising. I didn't get a better display to run **** in upscaled 1080p :down:


I used to be the same. But considering that GPU manufacturing seems to be hitting a physical wall, I couldn't care less at this point. I'm not interested in even my usual entry-level graphics card being as large as a battleship, costing as much as an entire PC used to cost, drawing 200W+ and heating up the room in summer, just for playing a bloody video game. Personally I've never been about pixel perfection though, e.g. never obsessed over less-than-perfect textures or what Arkane's founder calls "making sure the eyes are perfect and the sun shines the right way." Unless it's something truly intrusive going on during gameplay, you eventually don't notice. Nor care. Your mileage may differ.

Let's see how this actually turns out and develops. Any technology that may eventually help get us out of this race of ever-diminishing returns is a win in my book though. And if there's been a component market that's been SCREAMING diminishing returns, it's GPUs. Sure, GPUs have become infinitely more complex, with gazillions of transistors now running the show. Still funny how it's x86 CPUs that have been declared dead for decades, when a Ryzen can be bought for the exact same price tag as an Athlon 64 20 years ago. In fact, despite the high demand, the 9800X3D is cheaper than the FX-55 ever was.

Edited by Sven_
Posted

Tru facts ☝

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken

Posted

Not news, but I found it hilarious watching him run around manhandling all the displayed GPUs, commenting on their physical appearance/designs. Kept thinking "don't drop them!"
And... even as a 2-slot card, the 50xx gen is ofc huge - but the aftermarket ones even more so. If I were to get a 5090, I'd want the FE, if I could still get it whenever I wanted to buy - I wouldn't care re: any OC, and I'm not keen on the bigger size and bigger price, lol.

 

“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
