AMD Ryzen


Well that sucks. Who is going to buy that?

edit: So wait, is the motherboard going to have 16 or 8 RAM slots, and what form factor is it going to be? Doesn't seem like it will fit even in full tower cases.


28 minutes ago, Zoraptor said:

14 series won't be HEDT(?) I don't think they're even having a desktop version(?) It would be Arrow Lake/ 15 series before a consumer chip based potential HEDT from Intel.

I meant the regular lineup, like a 14900K or something along those lines.

"because they filled mommy with enough mythic power to become a demi-god" - KP

Link to comment
Share on other sites

Dragon Range and Phoenix APUs should be in the wild shortly. Phoenix are the chips I'm interested in. I imagine they'll show up in ultrathin laptops before they show up in microPCs and handhelds (which is what I'm potentially looking at). The 1080p handheld dream is (potentially) almost here. /rubs hands together


TL;DW remains the same: [X] SKIP. This time Steve even agrees, on account of this not topping any charts and therefore not getting a "chart-topping bonus" (which no reviewer should give out anyway, and which just leads to CPUs like the 13900K).

Anyway, for gaming, as expected, the 5800X3D still makes all of the new ones look like bad value for the money. Well, unless you really want those 850 FPS in Rainbow Six Siege and CS:GO, but in those cases one should either get the 7950X3D or an i9-13900, not this... whatever it is supposed to be. :p

What was it Steve said in the video? You're paying $15 per 1% of performance uplift. That's just bad value. Guess we now know why AMD didn't provide review samples. In games where the extra cache helps, it gets up to 98% of the performance of the 7950X3D, but the results are really inconsistent. Looks like AMD really took 700 MHz off the upcoming 7800X3D in order not to kill their own higher-end CPUs in gaming. Let's see how well these things sell. If they do well enough, at least the AMD fanboys will have to shut up about nVidiots buying overpriced trash that was intentionally gimped to fleece gamers.
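(For the curious, that value figure is just the price delta divided by the performance delta. A minimal Python sketch; the prices and uplift below are made-up placeholders chosen so the arithmetic lands on Steve's $15, not numbers from the actual review:)

```python
# Back-of-the-envelope value math: extra money paid per 1% of extra
# performance. All numbers below are hypothetical placeholders.
def price_per_percent_uplift(price_new: float, price_old: float,
                             uplift_pct: float) -> float:
    """Dollars paid per 1% of gaming performance gained."""
    if uplift_pct <= 0:
        raise ValueError("no uplift, no deal")
    return (price_new - price_old) / uplift_pct

# E.g. a hypothetical $600 chip that is 20% faster than a $300 one:
print(price_per_percent_uplift(600.0, 300.0, 20.0))  # 15.0 => $15 per 1%
```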


Eh, we already have the 7900XT as overpriced trash. Or had, amazing what selling under MSRP does for value. Really though, the situations in the GPU space with nVidia and in CPUs with AMD aren't even close to comparable until AMD has 80% market share.


That was a jab at the fanboys, not so much a direct comparison, although we're looking at an MSRP increase of 55% over the 5800X3D for a 20% uplift in gaming performance (sometimes less). Looking at street prices, it becomes even worse (800€ vs. 300€ at the moment), and unless you're never GPU bound, it's just going to be even worse in real-world scenarios.

No, market shares aren't comparable, and CPUs aren't GPUs, but the pricing is ridiculous. :p

And then we're not even talking about making players use the Xbox Game Bar because the performance uplift over the normal equivalents would otherwise not be guaranteed. Yeah. :smh:

Edit: never mind AMD turning off half the cores of these CPUs in games, the very workload these CPUs exist for in the first place. Kinda baffling. Why not just launch an uncrippled 7800X3D and leave it at that...
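(If you want to watch that parking behaviour for yourself, a rough sketch, assuming psutil is installed and that the first half of the reported cores map to one CCD, which is not guaranteed on every board/OS:)

```python
# Rough way to observe core parking on a dual-CCD chip: sample per-core
# load while a game runs. If parking works, one CCD carries the game
# while the other idles. The half/half core-to-CCD split is an assumption.
import psutil

for _ in range(10):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    half = len(per_core) // 2
    ccd0 = sum(per_core[:half]) / half
    ccd1 = sum(per_core[half:]) / half
    print(f"CCD0 avg: {ccd0:5.1f}%   CCD1 avg: {ccd1:5.1f}%")
```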

2 hours ago, Zoraptor said:

Or had, amazing what selling under MSRP does for value.

Indeed, in the meantime the 7900 XT and the 4070 Ti sell for the same amount of money here, which makes the 7900 XT look good, comparatively at least. The 7900 XTX, though, that's... still going for as much as an RTX 4080.


7900XT is 250NZD cheaper than the cheapest 3070Ti here, or $100 cheaper for an AIB. Think it's still hard to argue it's good value in absolute terms, but relatively it is.

Top end pricing is always stupid, whether it's CPU or GPU. The top X3D chips for the 7000 series were always likely to be bad value for money, because every top end chip is bad value for money. Even if both CCDs had the 3D cache*, how many games or other applications would use 12-16 cores? Similar for Intel's top end offerings. Platform costs for the 7000 series are also still ludicrous.

*Really though, the top two SKUs always looked like they would end up being pretty stupid once it was known they'd have 3D cache on one CCD only. You can make a case for the 7950X3D for certain use cases; the 7900 version... maybe, if it had been an 8 X3D + 4 vanilla setup, but that was never going to happen practically. End result: yep, they have to gimp the 7800X3D so it doesn't outperform the 7900 version, and at that point you'd be better off not doing the 7900 at all. Or selling them as 7600X3Ds instead, maybe. Wouldn't be overly surprised if we end up with those and the 7900X3D retired at some point, depending on how easy they are to implement.


Theoretically, couldn't one ungimp the 7800X3D via un-underclock overclock?


Just now, Keyrock said:

Theoretically, couldn't one ungimp the 7800X3D via un-underclock overclock?

Theoretically, but practically they're just going to lock it, like the others.


32 minutes ago, Zoraptor said:

7900XT is 250NZD cheaper than the cheapest 3070Ti here, or $100 cheaper for an AIB. Think it's still hard to argue it's good value in absolute terms, but relatively it is.

A 3070 Ti would run me 600€ vs. the 900€ for a 7900 XT, but I've already got a 3060, so getting a 3070 Ti for that amount of money isn't the best choice. There was a point in time when rebates made the 4070 Ti dip below 900€; I should have bought one then. Looks like I'm going to ride this gen out, unless something magical happens and the 4060-type cards suddenly offer much better value. Really doubtful.

Edit: Wee, double posted, meant to edit this one into the other reply, bah.


46 minutes ago, majestic said:

A 3070 Ti would run me 600€ vs. the 900€ for a 7900 XT, but I've already got a 3060, so getting a 3070 Ti for that amount of money isn't the best choice. There was a point in time when rebates made the 4070 Ti dip below 900€; I should have bought one then. Looks like I'm going to ride this gen out, unless something magical happens and the 4060-type cards suddenly offer much better value. Really doubtful.

I meant 4070Ti. If the 7900XT was 250NZD cheaper than a 3070Ti, even I'd be buying one.

(It's a little over 800€ equivalent for reference 7900XTs here, which puts them at 50 USD under US MSRP despite the long journey and small market.)


  • 1 month later...

Even with the gimped core clock speeds, there are games like Far Cry 6:

[Far Cry 6 benchmark chart]

I sure hope everyone who bought the 7950X3D needs it for production workloads too, because otherwise they might feel a little, say, uhm, silly?

7 minutes ago, Keyrock said:

The new gaming king, though the 13600K is a better value if you're on a budget.

Quoted for truth. If one's primarily a gamer, one is better off saving money on a current mid-range Intel CPU or getting a leftover 5800X3D on AM4 and investing the money saved in a better GPU. Not using an RTX 4090 would move the results even closer together than they already are, unless we're talking about games like Factorio, which obviously benefit more from CPU upgrades than anything else (but even then we're talking about practically useless, if really impressive, improvements).


4 hours ago, majestic said:

if one's primarily a gamer, one is better off saving money on a current mid-range Intel CPU or getting a leftover 5800X3D on AM4 

Yeah, the 5800X3D is STILL a really great gaming CPU. It might be the single best CPU AMD has put out, relative to the landscape around the time it came out, since the glory days of Athlon 64.


Yeah, the new products are fine, but the days of AMD pricing CPUs like they have something to prove have been over for a while now. They've won back enough goodwill that underdog pricing is just leaving money on the table.


  • 2 weeks later...

TL;DR: Up to 40% variance in power consumption during gaming on 7600s. Obviously much less on all-core workloads, because those run at the power limit anyway, but that is pretty brutal for playing games. Would be interesting to see a larger sample size, but two bad outliers out of 13 tested CPUs, yeah, those aren't the greatest odds. Expected, as these CPUs - just like comparable ones from Intel - are basically whatever couldn't be sold for more, but still, a 41% difference under a regular gaming load is pretty brutal, especially if you pick one of those lower end Ryzen CPUs specifically because they're much more power efficient than their Intel counterparts*.

*Which, in reality, is not something one can glean from the all-core workload charts anyway. There are significant differences in idle power draw, for instance: if your computer is mostly intended for lightweight activities, i.e. browsing the web, watching a video here and there and doing some office work, you're better off with Intel. Not only do you not need the more expensive AM5 platform, but the idle power draw difference is somewhere between a factor of two and three (this also used to be true for the idle power draw of Radeon cards compared to nVidia's, but apparently that was fixed in a driver update). Anyway, Intel really shot themselves in the foot with the 13th generation's aggressive boosting policy just to 'win' the benchmark charts. Manually playing with the power limits can make 13th gen CPUs perform the same as their AMD counterparts (X3D in games notwithstanding) at about the same efficiency. There's a video by der8auer on this too, but yeah, it's in German. If a 13900K is set to eco mode by manually limiting its power limits to 95 W, it performs almost identically to a 7950X in eco mode, while both draw, well, 95 watts. Really, efficiency should be the baseline, and getting a bit more speed out of your CPU for significantly more power draw should be the setting you enable manually, Intel, not vice versa. *sigh*
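(To put the spread figure from the TL;DR above in concrete terms, a quick sketch; the wattages are invented to land near the quoted worst case, they're not the video's measurements:)

```python
# Sample-to-sample spread: hungriest vs. thriftiest chip under the same
# gaming load, in percent. Values below are hypothetical.
def spread_pct(watts: list[float]) -> float:
    return (max(watts) - min(watts)) / min(watts) * 100.0

gaming_draw_w = [46.0, 47.2, 48.5, 49.1, 55.0, 65.0]  # made-up samples
print(f"{spread_pct(gaming_draw_w):.0f}% spread")      # ~41% here
```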
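(And for the eco-mode-by-hand part: on Linux, one way to cap an Intel chip's long-term package power is the intel_rapl powercap interface. A sketch assuming the usual intel-rapl:0 package domain and root privileges; on AMD you'd set PPT/cTDP in the BIOS or via vendor tools instead:)

```python
# Cap the long-term package power limit (roughly PL1) via Linux powercap.
# Assumes intel-rapl:0 is the package domain; needs root. Just an
# illustration of the 95 W eco cap, not a substitute for BIOS settings.
RAPL_DOMAIN = "/sys/class/powercap/intel-rapl:0"

def set_package_power_limit(watts: int) -> None:
    path = f"{RAPL_DOMAIN}/constraint_0_power_limit_uw"  # microwatts
    with open(path, "w") as f:
        f.write(str(watts * 1_000_000))

set_package_power_limit(95)  # the 95 W eco comparison from the post
```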

 


The problem with efficiency is... well, if you're after efficiency you should pretty much only consider a chip from a plucky start-up company from Cambridge, UK, or one of its many derivative designs, rather than anything from Intel/AMD (though both, of course, have an ARM license, and indeed AMD's NSA back door vector PSP is actually ARM rather than x86). You certainly shouldn't be using a 7600+, though at least, unlike most earlier Ryzens, it has 2x RDNA cores, so you save on a mandatory GPU for browsing etc. For desktop gaming, though, efficiency is just nice to have for most people; it's nowhere near the top factor. That's Intel's calculus for its ludicrous top end power inefficiency.

OTOH, if they found the same variation on laptops it would potentially be massive, since laptops often have their utility limited significantly by power draw. Speaking of which:

8 hours ago, majestic said:

If a 13900K is set to eco mode by manually limiting its power limits to 95 W, it performs almost identically to a 7950X in eco mode, while both draw, well, 95 watts. Really, efficiency should be the baseline, and getting a bit more speed out of your CPU for significantly more power draw should be the setting you enable manually, Intel, not vice versa. *sigh*

While I 100% agree with the latter, and there's a pretty big difference in draw/efficiency, there's a certain irony in saying that a single-chip analysis shows similar efficiency at 95 W, given the context of the rest of the post.

(I mostly nitpick this because analysis of the equivalent mobile chips in laptops shows a pretty significant efficiency advantage for the, er, 7945HX over the, um, 13980HX on a performance-per-watt basis.)


35 minutes ago, Zoraptor said:

While I 100% agree with the latter, and there's a pretty big difference in draw/efficiency, there's a certain irony in saying that a single-chip analysis shows similar efficiency at 95 W, given the context of the rest of the post.

Indeed, yes, it could have been an outlier on both ends, making the comparison somewhat lopsided. Variation within high end components is much less pronounced, so the chance is lower, but it's not impossible that it was a really good 13900K and a really bad 7950X, of course.

35 minutes ago, Zoraptor said:

On desktop gaming though efficiency is just nice to have for most people, it's nowhere near the top factor. That's Intel's calculus for its ludicrous top end power inefficiency.

Well, that changed for some parts of the world somewhat recently, although admittedly not everyone gets fleeced by their energy providers as badly as we Yuro-Peons do right now.

