
Intel Thread



Speaking purely from my current experience with the i7-13700k: even just manually setting the power limit to Intel's specification (i.e. 253W PL2; I can't do much besides that, as the B660 chipset does not allow undervolting) does wonders for thermals and power draw, while costing less than 2% of performance in Cinebench R23.
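(Purely for illustration, and not what I did in the BIOS: on a Linux box the same package power limits can be read and written through the intel_rapl powercap sysfs interface. A minimal sketch, assuming the standard intel-rapl:0 package domain and root privileges; the firmware can still clamp or lock whatever gets written here.)

#!/usr/bin/env python3
"""Minimal sketch: read/set Intel package power limits (PL1/PL2) via Linux RAPL."""
from pathlib import Path

PKG = Path("/sys/class/powercap/intel-rapl:0")  # package-0 power domain

def read_limits():
    limits = {}
    for constraint in ("constraint_0", "constraint_1"):  # typically long_term / short_term
        name = (PKG / f"{constraint}_name").read_text().strip()
        watts = int((PKG / f"{constraint}_power_limit_uw").read_text()) / 1_000_000
        limits[name] = watts
    return limits

def set_limit(constraint, watts):
    # sysfs expects microwatts
    (PKG / f"{constraint}_power_limit_uw").write_text(str(int(watts * 1_000_000)))

if __name__ == "__main__":
    print("current limits:", read_limits())
    set_limit("constraint_0", 125)  # PL1: Intel's 125W base power spec for the 13700K
    set_limit("constraint_1", 253)  # PL2: the 253W boost limit mentioned above
    print("new limits:", read_limits())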

Der8auer did a gaming power draw comparison across various CPUs a while back (he had an old prototype CPU to test), and the power draw numbers were pretty similar. An i5-13600 under gaming load is much faster than a 7600X (well, duh) but draws about as much power, although the usual caveats apply: there's some variance between individual CPUs (up to 40% in the case of the 7600X samples).

 

On the topic of Asus doing Asus things: I got myself a 32GB Corsair Vengeance DDR5-7200 CL34 kit. Loading Asus' "XMP I" profile results in a blue screen every other minute, plus stability issues like Firefox tabs crashing in between. Loading the actual XMP profile of the RAM sticks, which Asus calls "XMP II", works though. At least it seems to be stable.

Also thinking about moving to a larger case or using a front-mounted 360mm AIO. Even with the low-profile Corsair Vengeance sticks without RGB, I can only install and remove them after dismounting the radiator from the top of the case. Now, I'm not fiddling around with my memory all the time, of course, but it's still kind of annoying. :p Especially in light of perhaps having to reinstall the old sticks: if I can't use the new ones at their full potential, I'm sending them back; there's no point in paying extra for memory speeds and timings I can't reach without introducing instability.



6 hours ago, majestic said:

On the topic of Asus doing Asus things: I got myself a 32GB Corsair Vengeance DDR5-7200 CL34 kit. Loading Asus' "XMP I" profile results in a blue screen every other minute, plus stability issues like Firefox tabs crashing in between. Loading the actual XMP profile of the RAM sticks, which Asus calls "XMP II", works though. At least it seems to be stable.

Asus being Asus aside, I guess trying to run a 7200 kit on a board only rated for 6000 is a bit of a stretch, so that was probably unfair - I had some crashes with the XMP profile regardless, so I dropped the clock speed a little. So far the sticks seem perfectly stable at 6800 MT/s, which isn't bad considering the memory is still pretty far out of spec for the mainboard, and 6800 kits with the same timings cost about as much or more due to useless RGB bling.

Guess I'll keep the sticks, and... I need a better mainboard. Well, time to wait a bit; the rumor mill has it that Meteor Lake is once again cancelled on desktop, and that would have been a refresh for LGA 1700. :shrugz:


I can't even run 7200 on a Z790 board rated for 8000. I mean, it boots and runs all day, but y-cruncher stability tests fail. I backed it down to 7000 and that was working. I ended up refunding the Vengeance 7200 and just OC'd my 6400 sticks to 7000, and that has been my daily driver for the past few months.
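(Only a sketch, for anyone who wants a quick sanity check before committing to a full y-cruncher or memtest run: a crude memory smoke test can be scripted in a few lines of Python. It is nowhere near as thorough as those tools, the chunk size and round count below are arbitrary, and it needs numpy installed.)

# Crude RAM smoke test: repeatedly copy and hash a large random block and look
# for silent corruption. Catches only grossly unstable memory.
import hashlib
import numpy as np

CHUNK_MB = 512   # size of the test block (arbitrary)
ROUNDS = 20      # number of copy/verify passes (arbitrary)

def smoke_test():
    rng = np.random.default_rng(0)
    block = rng.integers(0, 256, size=CHUNK_MB * 1024 * 1024, dtype=np.uint8)
    reference = hashlib.sha256(block.tobytes()).hexdigest()
    for i in range(ROUNDS):
        copy = block.copy()  # exercises large reads and writes
        if hashlib.sha256(copy.tobytes()).hexdigest() != reference:
            print(f"round {i}: checksum mismatch - memory is NOT stable")
            return False
    print("no mismatches (not proof of stability, just a smoke test)")
    return True

if __name__ == "__main__":
    smoke_test()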


37 minutes ago, Bokishi said:

I can't even run 7200 on a Z790 board rated for 8000. I mean, it boots and runs all day, but y-cruncher stability tests fail. I backed it down to 7000 and that was working. I ended up refunding the Vengeance 7200 and just OC'd my 6400 sticks to 7000, and that has been my daily driver for the past few months.

I also have the TG contact frame installed, and if I'd messed up the mounting pressure, that could have affected memory stability too, but it looks like I can call that a win. I doubt my old Vengeance 5200 sticks could deal with running at 6800. :)


On 5/23/2023 at 2:24 PM, majestic said:

I also have the TG contact frame installed, and if I'd messed up the mounting pressure, that could have affected memory stability too, but it looks like I can call that a win. I doubt my old Vengeance 5200 sticks could deal with running at 6800. :)

Yeah, this 6400 G.Skill kit was one of the first DDR5 sets made with Hynix A-die, so it turns out they can OC pretty far. I could boot at 7600 with them, but stability is another story...


  • 1 month later...

If the current rumors are true (and the 14700K leaks suggest that they probably are), Intel is going to give the i3 the 13400's six performance cores. The single core uplift doesn't look good, so the new i7 and i9 are going to be largely uninteresting. The real killer is going to be the 14600k with eight performance cores. It depends on the pricing, of course, but the 13600k is already really good value; if the 14600k is in the same pricing ballpark it's going to kill, and the 14100 is going to be hard to beat as a budget option - you can still slap the thing on a cheap-ass DDR4 board, after all.

Until Ryzen 8000 and the inevitable 7000 price drop, at least.


Edit:

Okay, so this is basically the same situation as with KrisFix a while back, when he had a batch of defective Radeon cards and asked the community for help, and suddenly everyone was screaming AMD DRIVERS KILL THE GPUS even though KrisFix himself never said a single thing about the drivers killing the cards in his video, only that the drivers were the one constant between all the cards he got.

So, basically, here's the original video:

A rather niche situation that may well be a real issue (hard to tell, because DPC issues can be really wonky and caused by a myriad of reasons), and commenters and others pick this up as - literally - "Intel systems lag like cheap iPads". Ah well, serves me right for not watching the actual source in the first place. :p

Also, the, uhm, conclusion is not necessarily correct. DPC issues like that aren't necessarily the CPU's fault - it could be another driver. The Intel Rapid Storage Technology drivers can cause pretty high DPC latency too. *shrug*


  • 2 months later...
2 hours ago, Keyrock said:

Steve taking the piss.:lol:

tl;dw: It's for all intents and purposes the same chip as the previous gen.

Yeah, for most of them it's not just for all intents and purposes but, well, literally. They're basically 13th gen CPUs that won the silicon lottery and can reliably be boosted a little higher than before. On the bright side, that might make them interesting for overclocking competitions. For everyone else, it's a hard pass, especially since the 13th gen CPUs are cheaper. Pretty sad showing.

Steve had it right: this really looks like an investor/shareholder appeasement release. A "new" gen resets MSRPs too, making the suits happy, at least until they see that the sales figures for this "generation" are in the dumps. By then Arrow Lake is hopefully out.


Man, I thought the 14700k review yesterday was already bad, the 14900k is straight up a scam.


Nah, wait for the 14600K review, that one will show zero difference from the prior generation. I originally thought I'd be upgrading to a better mainboard with this release, but that is going to be a :no: - the 14700K isn't worth spending any money on, and that goes doubly for the others. I'll just wait for Arrow Lake and see how that turns out.

There's a marginal chance that the lower tier CPUs will be interesting. Leaked benchmarks show more of an improvement between the 13400 and the 14400. I don't know if the 14100 is going to be just another Alder Lake with slightly higher clocks (IIRC, the 13100F just had four Alder Lake cores, making it wholly uninteresting for budget use compared to the 12100F), or if it will get actual Raptor Lake cores, which would come with a decent boost in cache size.

I still don't quite get what Intel is doing there. Granted, for 13th gen it was about catching up to Zen 4 and surpassing the 5800X3D in gaming, which made some sense at the time, and an all-core workload on a 13900K does not draw that much more power than one on a 7950X (30 watts according to Steve's charts - which is 10%, yes, but nowhere near the ludicrous difference between the gaming power draw of a 7800X3D and a 13900K).

Dropping 100mV makes all-core workloads on a 13900K cap out at roughly 250W. Still nuts, but on par with the 7950X. They should have just collected the parts that undervolt the best, slapped an E at the end of the CPU name, undervolted them by default and called it the Raptor Lake Efficiency refresh. JayzTwoCents recently showed off an undervolted 13900KS that reached 42k points in Cinebench R23 without ever surpassing 80°C core temperatures (that was his Thermal Grizzly KryoSheet test).
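(Strictly napkin math, mixing the ~250W undervolted all-core figure with the 42k score from the KS run, so treat it as an illustration rather than a proper efficiency benchmark:)

# Rough Cinebench R23 points per watt for the undervolted run described above.
cinebench_r23_score = 42_000   # JayzTwoCents' undervolted 13900KS result
all_core_power_w = 250         # ballpark all-core draw after a -100 mV undervolt
                               # (the 13900K figure above; the KS run's draw wasn't given)
print(f"{cinebench_r23_score / all_core_power_w:.0f} R23 points per watt")  # ~168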

'Course, that would have made the more expensive mainboards fairly pointless. Which... oh, right, which answers that.

Well, let it never be said that Intel's marketing is any better than AMD's.


5 hours ago, Bokishi said:

14900K looks like a letdown, like literally just a higher binned 13900k with some slight oc 

Yeah, like I mentioned earlier, there's something to be said for being able to buy CPUs that won the silicon lottery, at least if you're into extreme overclocking. Predictably, the 14900K almost instantly took the crown:

https://hwbot.org/benchmark/cpu_frequency/rankings#start=0#interval=20

For everyone else though... yeah, nope.


Applying death of the author to product naming is a new one for me, but it does make sense. Where's Lindsay Ellis to help me navigate through computer components and corporate marketing? tl;dr: bad, everything intel is doing with the "14th gen" is bad, it's bad


20 hours ago, Bartimaeus said:

Applying death of the author to product naming is a new one for me, but it does make sense. Where's Lindsay Ellis to help me navigate through computer components and corporate marketing? tl;dr: bad, everything intel is doing with the "14th gen" is bad, it's bad

AMD thinks they've won the number wars with their admittedly brilliant strategy of including an "X", which functions as the multiplication symbol, leaving Intel in the dust. However, there is one trick left up the sleeve, if Intel is wise enough to employ it. Just picture it: Intel i9 - 15900

Checkmate atheists.


It's clearly an upper case K rather than a lower case k, though. Upper case K --> Kelvin, or absolute temperature, so it's Good Guy Intel accurately representing that their chips run at 2+ times the surface temperature of the sun.

(Some might say it's only that way because they use all caps, since they want to shout about how good their chips are, and that it actually is the abbreviation for kilo. To short-circuit any such discussion, may I point out first that the chips are lower case i5/i7? Checkmate, Inteltheists)


As for the question of what the hell modern Intel CPUs are doing with all that power they draw, the answer is a big load of nothing, at least in gaming. A German channel whose videos I watch every now and then tested the 14700K with four power limits in Cyberpunk 2077, Assassin's Creed: Mirage, Rainbow Six Siege and Far Cry 6.

The result, in 1080p, shamelessly copied from the video:

[Chart from the video: average 1080p frame rates for the i7-14700K at each of the four power limits]

Limiting the power draw of the i7-14700K to 80W still only drops average frame rates by 4.6%, while limiting it to 120W gives basically the same results as having no power limit at all (not that a 120W limit makes much sense, since that is roughly the average gaming power draw anyway). The difference shrinks at 1440p and goes away completely at 4K, which makes sense, as the GPU is much busier.
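(Napkin math using only the figures above, and treating the 80W cap as the actual draw, which is optimistic, so consider this a rough sketch rather than a measurement:)

# Perf-per-watt change when capping the 14700K to 80 W in gaming, based solely on
# the numbers quoted above: ~120 W average uncapped draw, 4.6% average FPS loss.
uncapped_watts = 120       # roughly the average gaming draw without a power limit
capped_watts = 80          # the 80 W limit from the video
fps_retained = 1 - 0.046   # 4.6% average FPS loss at the 80 W limit

gain = fps_retained / (capped_watts / uncapped_watts) - 1
print(f"~{gain:.0%} better gaming performance per watt at the 80 W cap")  # ~43%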

It does cut heavily into all-core workloads during any sort of productivity tasks, obviously.

It stands to reason that the same is true for the 13950K, oops, the 14900K. Which tracks with what Der8auer found when he tested the 13900K, where limiting the CPU to 80W yielded pretty much the same result: no really noticeable drop in gaming performance, but a massive gain in efficiency.

Edit: Well, I suppose that is the reason why Intel gives these CPUs a long term power limit (PL1) of 125W, with boost up to 253W.


4 hours ago, Gfted1 said:

Seeing as the picture credited to Intel shows 8 games and not only the 2 mentioned in the article, I'm guessing it hasn't received much advertising/mention because it's not fully ready yet. Though this raises the question of whether it could be used to speed up at least 13th generation CPUs as well, seeing as there's no architectural difference between the two generations... ooor whether Intel will hold on to it exclusively for 14th gen and later as a kind of weak trump card to make up for 14th gen otherwise being so bad.


I think I read somewhere that Intel will eventually have stacked cache like Team Red's X3D chips. If so, that's a good move. The X3D chips have been a big success for Team Red, so it would be smart for Team Blue to copy, er, take inspiration from that design.


3 hours ago, Bartimaeus said:

Seeing as the picture credited to Intel shows 8 games and not only the 2 mentioned in the article, I'm guessing it hasn't received much advertising/mention because it's not fully ready yet. Though this raises the question of whether it could be used to speed up at least 13th generation CPUs as well, seeing as there's no architectural difference between the two generations... ooor whether Intel will hold on to it exclusively for 14th gen and later as a kind of weak trump card to make up for 14th gen otherwise being so bad.

Intel's statement on why the software only works with 14th gen CPUs is basically "not enough testing on the others", so yeah, Intel for sure does not consider this completely ready for launch yet. It also does not work with the 14600K, so there's that. Clearly though, the correct marketing move would be to limit it to 14th "generation" CPUs.

2 hours ago, Keyrock said:

I think I read somewhere that Intel will eventually have stacked cache like Team Red's X3D chips. If so, that's a good move. The X3D chips have been a big success for Team Red, so it would be smart for Team Blue to copy, er, take inspiration from that design.

Here's a video from HighYield about the tech behind Meteor Lake and what Intel is doing going forward; the video is a little older, so there may be some inaccuracies (the speculation/info on the process nodes used might be outdated):

Intel's L4 cache for Meteor Lake actually sits in the silicon interposer rather than being stacked on top of the compute tile, where it is accessible to all components of the CPU. It would be technically possible to assign it to the integrated GPU instead of the CPU when needed. It is probably not as fast as giving the CPU tile direct, shorter access to its own dedicated larger cache, but with this design it's also possible to drop a completely separate cache tile onto the interposer if necessary.

Either way, both tiles and 3D V-Cache are elegant solutions to the problem of SRAM structure sizes: SRAM has stopped scaling with TSMC's newest process nodes, so caches aren't getting any smaller going forward and need to go somewhere else.

That said, the Meteor Lake CPUs aren't going to be any faster than their Raptor Lake counterparts, as they're basically "just" Raptor Lake CPUs on a new process node with the new tile design - hopefully just a lot more efficient. Well, the eight Arc compute units will probably kick Raptor Lake's iGPU to the curb, but whether that makes gaming on an iGPU viable remains to be seen. Although it's not impossible with XeSS turned on where supported.

