
Build Thread 3.0


Amentep


Is the 5090 out yet?

 

 

I was fine with DLSS/FSR/whatever being available as a way to extend the life of older GPUs, let lower-end rigs keep playing a while longer, or just as one more user-optimization option.
I'm less fine with devs pushing it as almost the only user optimization, where fiddling with (other) settings increasingly does very little for performance, plus listing DLSS etc. on paper as something you need enabled to meet the "system requirements."
I love me some shiny graphics at times, although I've never cared about "Ultra" at all, but could we focus more on gameplay now instead of marketing uber lighting/shadows/effects and hyper real (but still uncanny valley) faces and figures, while waiting for GPU tech or devs' engine-programming skills to catch up a bit?

 

 

 

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts

1 hour ago, LadyCrimson said:

hyper real (but still uncanny valley)

what i really need with modern games is a 'playstation 1 graphics' mode/switch: less strain on my gpu, less strain on my brain because all the nasty uncanny valley crap immediately goes away


How I have existed fills me with horror. For I have failed in everything - spelling, arithmetic, riding, tennis, golf; dancing, singing, acting; wife, mistress, whore, friend. Even cooking. And I do not excuse myself with the usual escape of 'not trying'. I tried with all my heart.

In my dreams, I am not crippled. In my dreams, I dance.


So, AMD sponsors Starfield to make sure that the 7900 XTX can beat the RTX 4090 in something and to have a game everyone wants to play bundled with their cards, but forgets to make sure that the game runs well on their CPUs. Given Bethesda's track record of making terrible code, I guess the 7900 XTX edging out the RTX 4090 wasn't even intentional*.

Not going to lie, seeing the 13600K beating the entire AMD CPU stack (including the 7950X, 7800X3D and 7950X3D) in Starfield is hilarious, and the 12600K coming within 10% of the 7950X3D is just bonkers - as is the general performance of the game at 1080p. 67 FPS on average on a 5800X3D/RTX 4090 combination? Yikes. It might be forgivable if the game at least had something to show for it, but visually it's just... bad. Never mind those horrible plastic faces. Ugh.

*More objectively that is probably a result of being specifically optimized for AMD cards while doing no optimization for any other GPUs, and less of a toggle that goes like if (!Radeon) frameLimit = 0.8;

Edited by majestic

No mind to think. No will to break. No voice to cry suffering.


Starfield is built on a 12-year-old Frankenstein's monster of an engine if you only count the time it has been called Creation Engine, or a 20-year-old engine if you go all the way back to Gamebryo. There have been so many modifications grafted onto the engine over the years, not to mention there's undoubtedly a whole slew of legacy cruft weighing it down that no longer applies to any machine modern enough to actually play the game at something approaching acceptable framerates. I never expected to see particularly good performance. That said, hopefully Bethesda puts in some optimizations, because even less tragic numbers like these are still pretty bad:

Given there's no ray tracing, those numbers are quite sad.

The conspiracy theorist in me is starting to suspect that upscalers like DLSS, FSR, and XeSS might have been one of the worst things to happen to gaming in a long time. It just seems like games are not getting much better looking while at the same time performing much worse. My conspiracy-theorist side thinks that maybe devs are shirking their optimization duties, figuring "meh, upscalers will fix the performance." It's similar to how Star Wars writers will sometimes not think too far ahead and write themselves into a corner, because they always have the crutch of deus-ex-machina-ing themselves out of any situation with the Force.
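For reference, the whole trick behind these upscalers is rendering internally at a lower resolution and reconstructing the output frame. A rough sketch of what the render scale means in practice, using the commonly cited DLSS per-axis factors (treat the numbers as approximate, not authoritative):

// Rough sketch of what an upscaler's render scale means in practice.
// The per-axis factors below are the commonly cited DLSS presets; treat
// them as approximate rather than authoritative.
#include <cstdio>

int main() {
    const int out_w = 3840, out_h = 2160;  // 4K output target
    const struct { const char* name; double scale; } presets[] = {
        {"Quality",     0.667},
        {"Balanced",    0.580},
        {"Performance", 0.500},
    };
    for (const auto& p : presets) {
        const int w = static_cast<int>(out_w * p.scale);
        const int h = static_cast<int>(out_h * p.scale);
        // The GPU only shades w*h pixels instead of out_w*out_h, which is
        // where the "free" performance comes from, and also why it can mask
        // poor optimization when it's treated as a baseline requirement.
        std::printf("%-12s internal %dx%d (%.0f%% of output pixels)\n",
                    p.name, w, h, 100.0 * p.scale * p.scale);
    }
    return 0;
}

In other words, "Performance" mode only shades a quarter of the output pixels, which is exactly the kind of headroom that makes skipping real optimization tempting.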

Edited by Keyrock

🇺🇸RFK Jr 2024🇺🇸

"Any organization created out of fear must create fear to survive." - Bill Hicks


3 hours ago, majestic said:

*More objectively that is probably a result of being specifically optimized for AMD cards while doing no optimization for any other GPUs, and less of a toggle that goes like if (!Radeon) frameLimit = 0.8;

It would be funny if the CPU performance delta was because they were using one of those Intel libraries that pretty much literally does exactly that.

Otherwise, you don't really need to depart from a variation of Poe's Law* when it's a new title from Bethesda. Though as above, considering the engine's age that it just works™ at all is a bit of a miracle.

*Howard's Law: any performance issues in a Bethesda game can be adequately explained by it being a Bethesda game in the first place.


2 minutes ago, Zoraptor said:

It would be funny if the CPU performance delta was because they were using one of those Intel libraries that pretty much literally does exactly that.

According to some modders, the overall poor (CPU) performance of the game comes from Bethesda messing up compiler optimizations. The go-to explanation for the relatively poor performance of the Ryzen CPUs seems to be the bandwidth limitation of AMD's Infinity Fabric. It was assumed to be basic memory speed before, but recent testing showed that not to be the case; the game actually scales less well with increased DDR5 speeds than other titles do.

As for Intel's libraries that check for the vendor string and use slow execution paths for anything that is not an Intel CPU, well, they're not very likely to be used in desktop applications or games. I would not put it past Bethesda either way, but I'd like to think AMD would at least do a cursory check when they spend a good chunk of money on a product. :shrugz:
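For anyone wondering what "check for the vendor string" actually means, here's a minimal sketch of the general idea (hypothetical code, not taken from any actual Intel library or from the game; GCC/Clang on x86 only):

// Hypothetical illustration of vendor-string dispatch; not code from any
// actual Intel library or from Starfield.
#include <cpuid.h>   // GCC/Clang CPUID intrinsics
#include <cstring>
#include <cstdio>

static bool is_genuine_intel() {
    unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;
    __get_cpuid(0, &eax, &ebx, &ecx, &edx);  // leaf 0: max leaf + vendor string
    char vendor[13];
    std::memcpy(vendor + 0, &ebx, 4);        // vendor string comes back in
    std::memcpy(vendor + 4, &edx, 4);        // EBX, EDX, ECX order, e.g.
    std::memcpy(vendor + 8, &ecx, 4);        // "GenuineIntel"/"AuthenticAMD"
    vendor[12] = '\0';
    return std::strcmp(vendor, "GenuineIntel") == 0;
}

int main() {
    // A library doing this picks an optimized SIMD path for Intel and a
    // generic fallback for everyone else, regardless of actual CPU features.
    std::puts(is_genuine_intel() ? "fast path" : "generic fallback");
    return 0;
}

The point being: the branch keys off the brand string, not off what instructions the CPU actually supports, which is why it would be such a lazy way to produce exactly the kind of performance delta we're joking about.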


  • 1 month later...

Well, looks like 2025 is going to be the year of ARM replacing the x86 CPUs.

 


Yeah, that'll probably also be the year of the Linux Desktop, coming yearly since '96 or so.

 


4 hours ago, majestic said:

Well, looks like 2025 is going to be the year of ARM replacing the x86 CPUs.

 


Yeah, that'll probably also be the year of the Linux Desktop, coming yearly since '96 or so.

 

I've been having my own personal year of the Linux desktop since around 2002. As for ARM, can we fast forward to where RISC-V replaces both x86 and ARM? That's what I'm looking forward to.


  • 2 months later...

This is one of those "well, duh!" tests.

No, really, are you telling me that unless you're hitting the CPU limit in games, it is better value to go with a less powerful CPU and a more powerful GPU than vice versa?

Nice to see it confirmed with the lower settings in Fortnite, where the better gaming CPU results in much higher frame rates because both GPUs are far from being GPU-bound*. So, basically, for any non-e-sports scenario, or unless you're playing Factorio exclusively and can notice the difference between 400 and 700 FPS, a 13600K/7600 will easily be enough. Go with the higher-powered GPU if you're on a budget.

Guess it's nice to have proof to link to.

*Minor caveat, in some cases turning on ray tracing in games can lead to the 7800X3D outperforming the 7600 with the 4070 Ti. But even then it is really minor.
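To put the CPU-bound/GPU-bound point in toy-code terms (made-up numbers, purely illustrative): a frame takes roughly as long as the slower of the two sides, so CPU headroom is simply wasted while you're GPU-bound.

// Toy model of CPU-bound vs. GPU-bound: a frame takes roughly as long as the
// slower of the two sides, so extra CPU speed is wasted while GPU-bound.
#include <algorithm>
#include <cstdio>

int main() {
    const double cpu_ms = 4.0;   // made-up CPU frame time (~250 FPS worth)
    const double gpu_ms = 12.0;  // made-up GPU frame time (~83 FPS worth)
    const double frame_ms = std::max(cpu_ms, gpu_ms);  // GPU-bound here
    std::printf("%.0f FPS\n", 1000.0 / frame_ms);
    // Halving cpu_ms changes nothing above; halving gpu_ms nearly doubles FPS.
    // Only once gpu_ms drops below cpu_ms does the faster CPU start to matter.
    return 0;
}

Which is the whole argument in three lines: on a budget, shrink gpu_ms first.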


I'm pretty sure that for Civ and other turn-based strategy games the CPU is more important, and the same goes for builders like Cities: Skylines.

"because they filled mommy with enough mythic power to become a demi-god" - KP


  • 1 month later...

boot up pc, freezes and crashes after a few minutes

reboot pc, freezes and crashes after a few minutes

restart pc, boot into bios, it's all in chinese

find english language option, stays in chinese

yeah, alright, i can take a hint, i'm clearly not wanted anymore

