Intel Thread



Seems like it needs Win11 to shine, at least judging by Tom's Hardware's review.



I knew full well these hybrid chips would be solid performers, and it doesn't look like AMD will achieve hybrid status until Zen 5, giving Intel (as well as Apple) a solid head start there.

Zen 4 will have to beat out Alder Lake, even if only by the slimmest of margins, to stay ahead of the game until they start churning out their own hybrid chips, though AMD seems clever enough to pull that off with their architecture.

Edit: It looks like 'Raptor Lake' is their next year's line-up. It was supposed to be 'Meteor Lake', but I guess they're not ready for the 7nm process yet.

I called it about a year ago and I'll call it again (you can quote me later if I'm wrong): Meteor Lake will be their real killer. Personally, I'd hold off upgrading until Meteor Lake and whatever Ryzen competition is out by then.

I'm peeved as Hell with desktop hardware and all the bits you have to buy to upgrade, and am quite happy with the more modest, smaller, yet surprisingly efficient laptop variants being released lately. Intel is also trying to master the art of increasingly fast and efficient discrete GPUs, ensuring that Nvidia and AMD will be on their toes in that department as well.


I know the point is it's good for competition, but my 9900k is still underworked/yawns at anything I've thrown at it.

By the time I need a new CPU, desktop PCs are going to have to be stored in a freezer so they won't explode.


1 hour ago, LadyCrimson said:

I know the point is it's good for competition, but my 9900k is still underworked/yawns at anything I've thrown at it.

By the time I need a new CPU, desktop PCs are going to have to be stored in a freezer so they won't explode.

My i7-4770k lasted about six years, which is probably pretty close to how long this new 5800x will last me as well. Still, competition's good for both prices and driving these companies forward. I shudder to think of where we'd be right now if AMD had continued to languish after Bulldozer and Intel was still going "ha ha ha, 4 cores and 8 threads is good enough for anybody and everybody...unless you'd like to spend $700 on our terrible extreme series models?".

Temperatures are getting kind of crazy, though - that's definitely been a trend as of the last ~5 years. I can cool my 5800x with a ~$60 air cooler just well enough, but if we ever come to a day where all the higher-end CPUs require water-cooling in order to get even baseline performance, that'll be exactly when I stop getting the higher-end CPUs. Maybe we'll have better cooling technology that doesn't involve water by then, though...


It's entirely possible that I never build another rig again. What I went through nearly a year ago to build my latest desktop, I never want to go through ever again. The shortage and scalper situation seems to have gotten slightly better since then, but it's still terrible. I'm starting to worry that this never fully goes away: even if supply increases (and given the constantly growing demand for silicon in devices of all kinds, that's no guarantee), scalpers will just buy more and keep artificially creating shortages.

Even without scalpers, component prices have spun out of control. I know inflation is a thing, but prices are increasing much faster than inflation. I may just become a dirty console peasant. Consoles are often sold at a loss (the manufacturer makes its money elsewhere), so you typically get much better value for the level of hardware. Console games also tend to be better optimized, because the dev has one or two, maybe three hardware configurations to target rather than the potentially thousands of a PC version, so they can be a lot more specific.

Honestly, console graphics have gotten so good that it almost doesn't matter. I was just playing Metroid Dread on the Switch, the lowest-powered current console out there, and that game looks great. If that game were 4K with ray tracing and extra god rays and volumetric fog or whatever, would it look better? Sure, but how much would that have enhanced my experience? Nearly nil.

Maybe I'll feel differently in a couple of years, but right now I'm so ****ing done with this bull****.


They just need to ban crypto mining. I'm pretty sure it will happen once they figure out how much power is wasted on imaginary numbers... Then again, imaginary numbers are the whole world economy at this point.


2 hours ago, Sarex said:

They just need to ban crypto mining. I'm pretty sure it will happen once they figure out how much power is wasted on imaginary numbers... Then again, imaginary numbers are the whole world economy at this point.

China has done away with crypto mining; not sure if/when the West will do the same. I mean, that's all these high-powered GPUs are really good for. A 3090 does wonders with Ethereum mining, especially if you live in an apartment where you don't have to pay for power. Exchange it for Bitcoin and you're good to go...to whatever shady .onion website that's totally NOT infested with law enforcement and cyber criminals, lol.


Sweden is looking to forbid mining as well, because of the greenhouse-gas emissions. Funny how they don't use this argument against the stock market, but oh well...


I don't think I feel quite as strongly about it as Keyrock, but I am of the same mind. Keeping up with PCs, as well as the costs, is just becoming too silly. Not that I ever kept up with PC tech religiously or anything, but still.

Still, I like the ability to mod and use cheats, the ability to keep or more easily swap around hundreds of saves, and the video and screenshot capabilities on PC. I'm sure every console will only get better at those things eventually too, but ... yeah. I expect to build at least one more PC (or at minimum, swap in a new GPU) to keep up with 4K/60 on "next gen" games, but that's still a few years away. After that ... eh ... I don't think I game enough on average anymore to care, especially about AAA gaming. Might just go with a PS6 by then.


Yes, being able to do everything digital on one device, from gaming to banking to surfing strange corners of the net, is quite efficient, and with a laptop you get to lug it all everywhere with relative ease. Sure, you don't get the massive screen, but on a small enough laptop panel 1080p is just as flashy as 4K or even 8K, lol. I know it's un-American not to be obsessed with huge-ass screens, but eh, I'm a professional discreet fancy-pants at heart, I think.

I just found out I can make $35/day with this 3090 mining Ethereum, so that desktop over there can mine while I do everything else over here, where it's more comfortable and fun.
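
Out of curiosity about how a number like that pencils out: below is a minimal back-of-envelope sketch. Every constant in it is an illustrative assumption, not a measurement; Ethash payout per MH/s swings daily with ETH price and network difficulty, so treat it as the shape of the math rather than a quote.

```python
# Back-of-envelope check of the "$35/day" figure above.
# All constants are assumptions for illustration only.

HASHRATE_MHS = 120.0        # assumed RTX 3090 Ethash rate, MH/s
USD_PER_MHS_PER_DAY = 0.29  # assumed payout rate, $/MH/s/day (varies daily)
POWER_DRAW_KW = 0.30        # assumed wall draw while mining, kW
USD_PER_KWH = 0.12          # assumed electricity price, $/kWh

gross = HASHRATE_MHS * USD_PER_MHS_PER_DAY      # daily revenue
power_cost = POWER_DRAW_KW * 24 * USD_PER_KWH   # daily electricity cost

print(f"Gross:            ${gross:.2f}/day")               # ~$34.80
print(f"Net (paid power): ${gross - power_cost:.2f}/day")  # ~$33.94
print(f"Net (free power): ${gross:.2f}/day")               # the apartment case
```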


  • 4 months later...

Intel's Arc discrete mobile GPUs have launched:

https://hothardware.com/news/intel-arc-mobile-gpu-launch

No word on when we'll see laptops in the wild with these GPUs, but it should be fairly soon. I imagine some OEMs already have their hands on these GPUs.

[Image: Intel Arc A-Series SoCs]

[Image: Intel Arc A3/A5/A7 spec table]

Interesting that they put ray tracing cores in every model, even the low-end ones. I imagine it's mainly a marketing thing. I mean, even with something like DLSS or FSR I can't see a 35W GPU doing ray tracing at anything beyond slideshow framerates.

Anyway, I'm looking forward to seeing 3rd party benchmarks on laptops with these GPUs.

Edit: Wow, they only support HDMI 2.0b and DisplayPort 1.4a. That's an interesting design choice. No HDMI 2.1, despite the fact that HDMI 2.1 TVs and monitors have been available for over a year. Not that big a deal for laptops, but it's concerning for their upcoming desktop models.


Ray tracing isn't really surprising: Exynos has it (with RDNA2), and those are even lower-wattage devices, albeit supposedly on a far better node than Arc.

I'd suspect that despite the RT cores being listed separately, they're baked into the core design, i.e. they're intrinsic and you can't exclude them.


Going back on what they had said previously, Intel has now revealed that the initial version of XeSS, their answer to DLSS and FSR, will only work on Intel Arc GPUs with their XMX technology. The version of XeSS that will work on any GPU is supposedly coming at some undefined future date.

Nvidia could do that with DLSS because a) they were and still are the market leader, and b) they were first to market. AMD wisely made FSR vendor-agnostic; had they not, FSR would likely have been DOA. Why would any game dev spend the time and resources to implement XeSS, a feature only a small percentage of PC gamers could use, when they could instead implement DLSS, which a much larger percentage could use, or FSR, which everybody can use? Unless Intel throws a sack of money at them, no game dev is going to bother with XeSS.
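
To make that adoption argument concrete: the dev-side decision is basically a fallback chain keyed on what the player's GPU supports. Here's a minimal sketch of that logic; the `Gpu` fields and the capability checks are hypothetical stand-ins for the real vendor SDK queries, not any engine's actual code.

```python
from dataclasses import dataclass

@dataclass
class Gpu:
    vendor: str        # "nvidia", "amd", or "intel"
    has_tensor: bool   # RTX tensor cores, required for DLSS
    has_xmx: bool      # Arc XMX units, required for the initial XeSS

def pick_upscaler(gpu: Gpu) -> str:
    """Pick the best supported upscaler, falling back to FSR.

    Hypothetical logic: DLSS needs Nvidia tensor cores, the initial
    XeSS build needs Intel XMX units, and FSR runs on anything --
    which is why a vendor-locked XeSS addresses so few players.
    """
    if gpu.vendor == "nvidia" and gpu.has_tensor:
        return "DLSS"
    if gpu.vendor == "intel" and gpu.has_xmx:
        return "XeSS (XMX path)"
    return "FSR"  # vendor-agnostic fallback, works everywhere

print(pick_upscaler(Gpu("nvidia", True, False)))  # DLSS
print(pick_upscaler(Gpu("intel", False, True)))   # XeSS (XMX path)
print(pick_upscaler(Gpu("amd", False, False)))    # FSR
```

Only the last branch covers every player, which is the whole reason shipping FSR vendor-agnostic paid off for AMD.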


Classic Raja Koduri over-promising; nice to see he's brought that over from his time at AMD. Guess he's avoided an 'Intel ARC will be called Intel ARC' moment, at least.

XeSS always seemed likely to be at least a bit of a marketing stunt to imply a full suite release comparable to the competitors'; it probably had the wind taken out of its sails somewhat by FSR 2.0's release as well.


Intel is a very insular company, so of course they would make XeSS an exclusive like DLSS, and it shouldn't be hard to give future devs the tools to implement it. They're rich enough that they can establish a fanbase where a large enough demographic will be die-hard loyalists at any rate, especially in such a market-driven society.


  • 3 months later...

Probably about what might realistically be expected from a first-gen release.

Positives... hmm. It clearly has driver issues (not at all surprising) that may get fixed? And at the top range of its performance arc (hoho) it's worthwhile*? The other cards in its price range aren't exactly inspiring either, with weird design decisions like only using 4 PCIe lanes.

Not great when the takeaway is ~ buy a second-hand 580, though, but then that isn't exactly an endorsement of Nvidia or AMD either. Bad news for Intel if both are oversubscribed for wafers too, as that would likely accelerate the release of new lower-end cards.

*of course at the bottom it's an overpriced power hungry 1030, but I'm trying to be positive.


ARC seems to have a ton of issues to work out, but that's to be expected. Intel has to start somewhere. If they can compete on equal footing with Nvidia and AMD within 5 years I'll be impressed.


  • 4 weeks later...

That's always going to be the problem when you have a reputation for killing projects, as Intel (though not to the extent of Google, of course) has: people are constantly going to speculate about the future of anything that doesn't make an immediate profit. And it has been ~5 years with not much to show for it.

But really, there's no better way to tank your value as a company than to appear moribund and bereft of ideas. Intel has already shown some vulnerability in its core server/consumer CPU and foundry businesses, and really needs something proactive to show they have ideas for growing instead of just maintaining what they have. They couldn't get into the mobile (phone) market at their peak, so GPU is the obvious option. It's just going to cost them a lot to get established and still may not work; and you'll always have people who think that money could be better 'spent' on dividends.

