Posted

Huh.

10-20% performance gains in gaming using CU-DIMM DDR5 and not changing anything else. Also probably some e-core task scheduling issues at work here and there, with another couple % added by an e-core and cache overclock (which makes a lot of sense when the e-cores are working on game threads, as the e-cores have comparatively little L2 cache per core, sharing 4MB per cluster of 4). Except for Horizon Zero Dawn; that game has always been really rough on both Intel and nVidia hardware.

That might also point towards why some German YouTube channels that I have watched, including der8auer's, showed such drastically different gaming performance (especially in Cyberpunk) from GN, HUB and Jay's. der8auer used CU-DIMMs for his testing.

Still doesn't make Arrow Lake interesting, but it does show that Intel is having some teething issues with the new platform and their new tile designs. That is a rough launch, and needing different memory kits that are currently at a premium for optimal performance makes little sense. Using CU-DIMMs shouldn't even make that much of a difference. Sure, they theoretically allow for higher memory bandwidth, but really, such a drastic performance increase with everything else equal? Something's wrong here.

And since Jay also mentions it, the 9000X3D CPUs are going to top the charts soon anyway. Given his remarks I suspect the rumors are true and AMD really did double the amount of cache for the 9900X3D and 9950X3D by slapping it on both CCDs. Otherwise, given the small performance gain of Zen 5 over Zen 4, his insinuations make no sense, unless the regular Zen 5 CPUs have a severe bottleneck and the 9800X3D also shows massive performance gains by simply having more cache to work with.

Well, the specs say there's less of a clock speed regression on the 9800X3D compared to the 9700X than there was between the 7800X3D and the 7700X, but that alone should not be enough to make all of these charts look small.
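Going by the advertised max boost clocks (treat the numbers below as spec-sheet assumptions, not measured behaviour), the regression does shrink a bit; a quick back-of-the-envelope check:

```python
# Comparing the advertised boost-clock regression of the X3D parts vs. their
# non-X3D siblings across the two generations (spec-sheet numbers, in GHz).
clocks = {
    "7700X": 5.4, "7800X3D": 5.0,   # Zen 4
    "9700X": 5.5, "9800X3D": 5.2,   # Zen 5
}

zen4_drop = 1 - clocks["7800X3D"] / clocks["7700X"]
zen5_drop = 1 - clocks["9800X3D"] / clocks["9700X"]

print(f"Zen 4 X3D clock regression: {zen4_drop:.1%}")  # ~7.4%
print(f"Zen 5 X3D clock regression: {zen5_drop:.1%}")  # ~5.5%
```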


No mind to think. No will to break. No voice to cry suffering.

Posted

Let's see, it's been ... over 5 years since I built my current rig, maybe? Based on stuff like this, I'll be hanging on to it for another 2-3+ years.

“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted (edited)
1 hour ago, LadyCrimson said:

Let's see, it's been ... over 5 years since I built my current rig, maybe? Based on stuff like this, I'll be hanging on to it for another 2-3+ years.

Especially since you're playing at 4K, where the CPU is rarely ever the bottleneck. Even with the fastest gaming CPUs (i.e. the X3D CPUs from AMD) you'd benefit more from a GPU upgrade. It's a time-honored tradition in PC building: without an unlimited budget, save on the CPU so you can buy the next tier up of GPU.

Even with a 4090, you'll see zero difference in frame rates at 4K in most 3D games between the last three generations of CPUs (unless you have something like a really cheap four-core CPU that can't keep up). That is why these videos do not test 4K resolutions anyway, and most also dropped 1440p testing, which was a thing in the past. Slap something on the level of a 7700 (non-X) into your rig and you're probably good for years to come.
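To put the 4K point into a toy model (made-up numbers, purely illustrative): the delivered frame rate is roughly capped by whichever of the CPU or GPU is slower, and at 4K that cap is almost always the GPU.

```python
# Toy bottleneck model: delivered FPS ~= min(CPU frame-rate cap, GPU frame-rate cap).
# The GPU cap falls with resolution, the CPU cap does not. All numbers are invented.
def delivered_fps(cpu_cap: float, gpu_cap: float) -> float:
    return min(cpu_cap, gpu_cap)

cpu_caps = {"mid-range CPU": 140.0, "fast X3D CPU": 220.0}
gpu_caps = {"1080p": 250.0, "1440p": 170.0, "4K": 90.0}

for cpu, c in cpu_caps.items():
    for res, g in gpu_caps.items():
        print(f"{cpu:>13} @ {res:>5}: ~{delivered_fps(c, g):.0f} fps")
# At 4K both CPUs end up at the same ~90 fps: the GPU is the limit either way.
```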

It is different for productivity workloads, and you might run into other bottlenecks (PCIe 5.0 support, more PCIe lanes, memory compatibility and availability, etc.) that might make you want to upgrade, but, yeah, well... :shrugz:

Edited by majestic

No mind to think. No will to break. No voice to cry suffering.

Posted (edited)

Another big factor for me is eventually having to decide if I'm going to deal with Win11 or finally try to deal with Linux re: gaming.  Everything I hear re: what Win11 is up to/going to be up to (like that Recall stuff), makes me go "NO" 10000% more than Win7 to Win10 ever did. Maybe it's all overblown but ... sigh. Just an ol' granny who can't program a VCR, these days.

I think the 9900K (that thing) could be BIOS-updated to work with Win11, but I'm not going to bother. If I go with Win11 it'd be on a new CPU/PC.

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted
3 hours ago, LadyCrimson said:

Another big factor for me is eventually having to decide if I'm going to deal with Win11 or finally try to deal with Linux re: gaming.  Everything I hear re: what Win11 is up to/going to be up to (like that Recall stuff), makes me go "NO" 10000% more than Win7 to Win10 ever did. Maybe it's all overblown but ... sigh. Just an ol' granny who can't program a VCR, these days.

I think the 9900K (that thing) could be BIOS-updated to work with Win11, but I'm not going to bother. If I go with Win11 it'd be on a new CPU/PC.

It's overblown in the sense that it's just more stuff to disable. A disable is just a click away if you use Shut Up Windows...

[attached screenshot]

...which I can't recommend enough alongside Winaero Tweaker for fixing and/or reverting a bunch of Windows 10/11 annoyances/intrusions.


How I have existed fills me with horror. For I have failed in everything - spelling, arithmetic, riding, tennis, golf; dancing, singing, acting; wife, mistress, whore, friend. Even cooking. And I do not excuse myself with the usual escape of 'not trying'. I tried with all my heart.

In my dreams, I am not crippled. In my dreams, I dance.

Posted

The single most important thing to change in Windows 11 is the stupid new right-click context menu. Nothing else comes even close to being this essential. Whoever had that idea over at Microsoft should be shot to the moon without a return ticket, along with the other criminals deserving of such a fate, like Putin, Xi and Trump.
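For what it's worth, the fix most guides point to is a small per-user registry key; here is a minimal sketch of it in Python (winreg), with the CLSID being the one those guides circulate rather than anything Microsoft documents officially:

```python
# Widely circulated tweak to bring back the classic Windows 10 context menu on
# Windows 11: create this per-user CLSID key with an empty InprocServer32
# default value, then restart Explorer. Deleting the key again reverts it.
import winreg

SUBKEY = (r"Software\Classes\CLSID"
          r"\{86ca1aa0-34aa-4e8b-a509-50c905bae2a2}\InprocServer32")

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, SUBKEY) as key:
    # An empty default value is what disables the new menu host.
    winreg.SetValueEx(key, "", 0, winreg.REG_SZ, "")

print("Done - restart explorer.exe (or sign out/in) to apply.")
```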


No mind to think. No will to break. No voice to cry suffering.

Posted (edited)

^ Yeah, I disabled and got rid of tons of stuff in Win10 Pro manually with group policies and PowerShell. I've only updated Win10 "important files" twice, I think, when a game demanded it. I'm sure I'd do the same with Win11. Edit: I always purchase the OS, vs. free upgrading from the previous one. If I could buy an "enterprise" version for home I would (well, in the past, dunno about 11).

But hubby says Win11 is also pushing for the MS account login to run it. You can disable it, supposedly, but it's hard to find without knowing how. I'm sure they'll start pushing for subscriptions on top before long. 😛  It's just all going to the dogs.

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted (edited)
2 hours ago, LadyCrimson said:

If I could buy an "enterprise" version for home I would (well, in the past, dunno about 11).

You can't feasibly buy either the Enterprise or LTSC editions of Windows 10/11 (they're subscription-based volume licensing), but the official installers for both are easy enough to obtain and use without issue. Don't ask me how I know... or maybe do, depending on how interested you are, :yes:.

2 hours ago, LadyCrimson said:

But hubby says Win11 is also pushing for the MS account login to run it. You can disable it, supposedly, but it's hard to find without knowing how. I'm sure they'll start pushing for subscriptions on top before long. 😛  It's just all going to the dogs.

It's pretty much the same situation as it was for most editions/versions of the Windows 10 installer: unplug/disable your internet as you start the installation process, and when you get to creating an account, it'll try to make you connect. It won't work because you don't have internet, so you can say "nope, as you can see, I am a poor soul that doesn't have internet", Windows will go "very well, you filthy peasant", and then you can proceed with creating a local account.

3 hours ago, majestic said:

The single most important thing to change in Windows 11 is the stupid new right-click context menu. Nothing else comes even close to being this essential. Whoever had that idea over at Microsoft should be shot to the moon without a return ticket, along with the other criminals deserving of such a fate, like Putin, Xi and Trump.

The whole presentation of Windows 11 feels like a macOS facade that you have to immediately hammer and drill back into its proper shape before you can use it. Stupid.

Edited by Bartimaeus

How I have existed fills me with horror. For I have failed in everything - spelling, arithmetic, riding, tennis, golf; dancing, singing, acting; wife, mistress, whore, friend. Even cooking. And I do not excuse myself with the usual escape of 'not trying'. I tried with all my heart.

In my dreams, I am not crippled. In my dreams, I dance.

Posted

We can all feel super smart for being nerdy enough to know how to get around things/alter them as we want (unlike the average consumer, which is what a lot of companies bank on, imo), but we're sorta still part of the statistical problem as users. Not that everyone has a choice in the matter, I know.

I'm just in a very negative headspace re: the current US state of AI/computer/digital dependence and monopolistic/privacy/profit sharing-investor desperation cycles. At least Google's being taken to task on something recently. Something's gotta give (for a while - a new cycle) at some point.

“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted
3 hours ago, Bartimaeus said:

It's pretty much the same situation as it was for most editions/versions of the Windows 10 installer: unplug/disable your internet as you start the installation process, and when you get to creating an account, it'll try to make you connect. It won't work because you don't have internet, so you can say "nope, as you can see, I am a poor soul that doesn't have internet", Windows will go "very well, you filthy peasant", and then you can proceed with creating a local account.

This does not work on Win11, or at least it didn't work for me the couple of times I tried it. I had to use a workaround with some command in cmd to get to the local account creation.

"because they filled mommy with enough mythic power to become a demi-god" - KP

Posted
6 minutes ago, Sarex said:

This does not work on Win11, or at least it didn't work for me the couple of times I tried it. I had to use a workaround with some command in cmd to get to the local account creation.

What edition, specifically? IIRC, later updates of Windows 10 Home behaved similarly, but I thought the Pro and Enterprise variants were unaffected.


How I have existed fills me with horror. For I have failed in everything - spelling, arithmetic, riding, tennis, golf; dancing, singing, acting; wife, mistress, whore, friend. Even cooking. And I do not excuse myself with the usual escape of 'not trying'. I tried with all my heart.

In my dreams, I am not crippled. In my dreams, I dance.

Posted
8 hours ago, Bartimaeus said:

What edition, specifically? IIRC, later updates of Windows 10 Home behaved similarly, but I thought the Pro and Enterprise variants were unaffected.

Pro.


"because they filled mommy with enough mythic power to become a demi-god" - KP

Posted (edited)
21 hours ago, LadyCrimson said:

I'm just in a very negative headspace re: the current US state of AI/computer/digital dependence and monopolistic/privacy/profit sharing-investor desperation cycles. At least Google's being taken to task on something recently. Something's gotta give (for a while - a new cycle) at some point.

"Fun" video with the Gamers Nexus guy if you hadn't seen it:

Windows (and most other always-online products, services, and features) is adversarial by design, and it is always getting worse.

11 hours ago, Sarex said:

Pro.

This was on the latest patched version of the official Windows 11 multi-installer from Microsoft.com (which includes Home and Pro editions, along with some others; IDK, I wasn't paying that much attention to the list of options you get, I just chose Pro), found and freely downloadable here: https://www.microsoft.com/en-us/software-download/windows11

It is admittedly not straightforward to discern what you need to click on in order to not have to do the SHIFT+F10 OOBE stuff, but it is still possible: https://dl.dropboxusercontent.com/scl/fi/psttyrild3n998fzlhnx0/vmware_wFvbyAGVMu.mp4?rlkey=zi9bw7xlkiohwzzzjl4o8ookg&dl=0

Edited by Bartimaeus

How I have existed fills me with horror. For I have failed in everything - spelling, arithmetic, riding, tennis, golf; dancing, singing, acting; wife, mistress, whore, friend. Even cooking. And I do not excuse myself with the usual escape of 'not trying'. I tried with all my heart.

In my dreams, I am not crippled. In my dreams, I dance.

  • 1 month later...
Posted

Looking pretty good as far as generational improvements go, considering it has ~38% fewer Xe cores than the A770.
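(That figure lines up with the published Xe-core counts, assuming 32 Xe cores for the A770 and 20 for the B580:)

```python
# Quick sanity check of the "~38% fewer Xe cores" figure.
# Core counts are the published specs for the A770 (Alchemist) and B580 (Battlemage).
a770_xe = 32
b580_xe = 20

reduction = 1 - b580_xe / a770_xe
print(f"B580 has {reduction:.1%} fewer Xe cores than the A770")  # 37.5%
```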

When it works well, it's a really good card for its price range (and it pummels AMD in RT performance). Frametime pacing still seems to be an issue and there's the occasional crash. It's a pity Intel needed so long to get it out, what with the new generations around the corner.


No mind to think. No will to break. No voice to cry suffering.

Posted

I suspect it's going to be hard to actually get hold of, since the pricing is, compared to current-year nV and AMD, certainly aggressive. You also have to wonder how many they'll make, given it's not likely to have much of a profit margin at that price.

(The entirety of NZ's preorder stock sold out in minutes, which is either an excellent indicator of demand or rather a bad sign for supply.)

Hopefully enough for Intel to stick with their consumer offerings despite their current managerial and other issues.

Posted

Won't affect nVidia because the 5060 will sell no matter how much VRAM it has, but hopefully we've seen the last of the 8GB AMD x600 cards.

L I E S T R O N G
L I V E W R O N G

Posted (edited)

Everybody be like: "Intel is throwing the budget GPU market a lifeline."

Then I see this and want to hang myself. PS: Even the 40 Euro HD 3650, whilst struggling with most games, still produced over 70 fps average at least in Unreal Tournament 3 at 1280x1024 (a resolution still quite common in 2008, and UT3 being a 2007 game). Even if you doubled those prices..... Srsly, Wot happened? 🤔 😁

[attached chart]

Edited by Sven_
Posted
4 hours ago, Sven_ said:

Everybody be like: "Intel is throwing the budget GPU market a lifeline."

Then I see this and want to hang myself. PS: Even the 40 Euro HD 3650, whilst struggling with most games, still produced over 70 fps average at least in Unreal Tournament 3 at 1280x1024 (a resolution still quite common in 2008, and UT3 being a 2007 game). Even if you doubled those prices..... Srsly, Wot happened? 🤔 😁
 

Because no matter what ATi/AMD did, people bought nvidia's cards instead, leading to a massively ****ed up market. When AMD finally had caught up on performance, they didn't care about trying to price things right, because they probably rightly assumed that it would just lead to nvidia lowering prices, which would lead to people again buying nvidia's stuff. So why not price high? And add in bitcoin and now the AI bull**** and there's a nice bed that we've made for ourselves.

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken

Posted

I'm always worried about driver support with getting an intel card...

"because they filled mommy with enough mythic power to become a demi-god" - KP

Posted (edited)
10 hours ago, Azdeus said:

Because no matter what ATi/AMD did, people bought nvidia's cards instead, leading to a massively ****ed up market. When AMD finally had caught up on performance, they didn't care about trying to price things right, because they probably rightly assumed that it would just lead to nvidia lowering prices, which would lead to people again buying nvidia's stuff. So why not price high? And add in bitcoin and now the AI bull**** and there's a nice bed that we've made for ourselves.


Not true though, is it? ATi at one point was even in the lead, and even after AMD bought them, there were times when they had decent share. In the past 15 years, I had two AMD cards myself (HD 6670 and HD 6850), plus two Nvidia (GTX 1050 Ti plus RTX 3060). The HD 6xxx series was really good value (the 6850/6870 offering some of the best price-to-performance ratios on the market -- both could game anything at high settings for 150-200 bucks).

I think this is a long-term thing though. AMD have been trailing behind for so long (also technologically) that by now they're considered second tier. When was the last time AMD introduced something themselves? Everything Nvidia does, AMD follows (upscaling, RT, frame generation...). Compare that to Ryzen, bringing affordable 6-core/12-thread CPUs to the masses at a time when Intel were still releasing quad cores. Yeah, Ryzen is where all the money went, but still.

By now, Nvidia is the "go to" option. And ironically, the generally high prices even for entry-level cards may actually benefit them. If you spend a couple hundred bucks anyway, you're more likely to go with what's "safe", i.e. the market leader. GPUs are often kept for a few years now as well. That naturally applies to Intel too, and even they can't just massively undercut current market prices; manufacturing has become more expensive as well. The Arc B580 is something. But compare that to what, say, the Kyro II did WAY back then, with a budget card sometimes performing on the level of cards that cost twice as much (the Kyro II did have its weaknesses, though)...

The Card - STMicroelectronics Kyro II 64MB

Edited by Sven_
Posted
6 hours ago, Sven_ said:


Not true though, is it? ATi at one point was even in the lead, and even after AMD bought them, there were times when they had decent share. In the past 15 years, I had two AMD cards myself (HD 6670 and HD 6850), plus two Nvidia (GTX 1050 Ti plus RTX 3060). The HD 6xxx series was really good value (the 6850/6870 offering some of the best price-to-performance ratios on the market -- both could game anything at high settings for 150-200 bucks).

I think this is a long-term thing though. AMD have been trailing behind for so long (also technologically) that by now they're considered second tier. When was the last time AMD introduced something themselves? Everything Nvidia does, AMD follows (upscaling, RT, frame generation...). Compare that to Ryzen, bringing affordable 6-core/12-thread CPUs to the masses at a time when Intel were still releasing quad cores. Yeah, Ryzen is where all the money went, but still.

By now, Nvidia is the "go to" option. And ironically, the generally high prices even for entry-level cards may actually benefit them. If you spend a couple hundred bucks anyway, you're more likely to go with what's "safe", i.e. the market leader. GPUs are often kept for a few years now as well. That naturally applies to Intel too, and even they can't just massively undercut current market prices; manufacturing has become more expensive as well. The Arc B580 is something. But compare that to what, say, the Kyro II did WAY back then, with a budget card sometimes performing on the level of cards that cost twice as much (the Kyro II did have its weaknesses, though)...

The Card - STMicroelectronics Kyro II 64MB

I've had ATi/AMD between my TNT2 and the 3080: an X800, HD5850, HD7970, R9 Fury and Vega 64.

Yes, it's true, you can see it in the very chart. They had several high points and great cards, not to mention great prices, but they never managed to get their market share back because of nvidia's marketing and bull****tery (remember tessellation and Crysis? That was fun...); they even lost market share to the Fermi rehash. And they certainly weren't technologically behind, they were first on DX11 as I recall. It's no wonder that they stopped competing and just tried to go for the budget market after years of having no payoff from pouring money into flagship products.

That you still call AMD's cards second tier is telling, especially when you consider that the last 6000 series was quite competitive in rasterization. The first time nvidia actually took a leap was with real-time RT; upscaling and frame gen are great ways to make their cards look even better than they actually are, if you want bleary, smudgy and artifacting graphics, that is.

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken

Posted (edited)
7 hours ago, Azdeus said:

That you still call AMD's cards second tier is telling, especially when you consider that the last 6000 series was quite competitive in rasterization. The first time nvidia actually took a leap was with real-time RT; upscaling and frame gen are great ways to make their cards look even better than they actually are, if you want bleary, smudgy and artifacting graphics, that is.


Well, that's the general market perception at this point, i.e. the go-to option for Average Joe is Nvidia.

I think RDNA 2 (the 6000 series) would have done better if it hadn't been for the Covid/mining crisis (AMD Vows to Resolve GPU Shortage: We Are Ramping Production | Tom's Hardware). Though I think it's odd they positioned the RX 6600 directly against the RTX 3060 12GB; it could have done good business where it counts for market penetration, which is the entry level. And the rest was no slouch either.

The current RX 7600 is just all around meh. As is the 4060, mind. Curious what AMD is going to announce next month. Pretty sure Nvidia is going to continue as usual here, which is trying to upsell the heck out of you: "You want a card with more than 8GB? In 2025? With our newly introduced features eating up VRAM too? Well, we've got these bigger models here for ya..."
 

Edited by Sven_
Posted
8 hours ago, Sven_ said:


Well, that's the general market perception at this point, i.e. the go-to option for Average Joe is Nvidia.

I think RDNA 2 (the 6000 series) would have done better if it hadn't been for the Covid/mining crisis (AMD Vows to Resolve GPU Shortage: We Are Ramping Production | Tom's Hardware). Though I think it's odd they positioned the RX 6600 directly against the RTX 3060 12GB; it could have done good business where it counts for market penetration, which is the entry level. And the rest was no slouch either.

I think they also would've done better with the 6000 series if they had actually had something to say when nvidia released the 30-series. AMD was dead silent right up until release and didn't give people a reason to hold out once nvidia's cards dropped.

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken

Posted

It helps not to forget that the gaming and DIY market is relatively small; large volumes are mostly moved through OEMs, an area where AMD never really got their foot in the door. It's the reason why Intel still has a massive x86 market share (70 to 80%), in spite of the 13th/14th gen problems, notebook CPUs prior to Lunar Lake being noticeably less efficient, and AMD outselling Intel at larger DIY retailers (top sellers in CPUs since basically forever on Amazon, Mindfactory in Germany, Microcenter, etc., perhaps with the exception of the time when Alder Lake was faster than Zen 3 and comparatively good value) ever since first-gen Ryzen's much better price/performance ratio.

That said, I just checked what I paid for the GeForce 256 back in 1999, and that is, adjusted for inflation, 550€. Graphics card prices weren't always peaches and sunshine back in the day either. :p
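(Back-of-the-envelope version of that math; both the launch-era price and the cumulative inflation factor below are rough assumptions for illustration, not the actual receipt or an official CPI figure.)

```python
# Rough inflation adjustment for a 1999 graphics card purchase.
price_1999_eur = 350.0        # assumed launch-era price, converted to euros
cumulative_inflation = 0.57   # assumed ~57% price-level increase, 1999 -> mid-2020s

adjusted = price_1999_eur * (1 + cumulative_inflation)
print(f"~{adjusted:.0f}€ in today's money")  # ~550€
```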


No mind to think. No will to break. No voice to cry suffering.
