
Posted

I'm predicting that by the time they get to either the D (Druid?) or E (Elementalist?) generation, Intel will be pretty close to on par with AMD and Nvidia, at least in terms of hardware capability. Drivers, on the other hand... :shrugz:

🇺🇸RFK Jr 2024🇺🇸

"Any organization created out of fear must create fear to survive." - Bill Hicks

Posted (edited)

https://www.neowin.net/news/intel-wants-to-convince-arc-is-the-real-deal-shows-off-50-games-in-dx12vulkan-vs-rtx-3060/

Whilst Intel is certainly inflating their numbers in some way, shape, or form (not to mention their extremely rocky driver issues and delayed launches), the fact that the Arc A770 can reasonably compete with an RTX 3060 is hardly what I would call a bad start for a flagship card.

I have a laptop sporting some variant of a 1650 Ti, and the only game I can't play comfortably is the new Evil Dead game.

If I like the numbers I may just wait for Battlemage.  Big IF, though.

Edited by ComradeYellow
Posted (edited)

 

Bad news here.  I shall remain optimistic for laptop dies, though; if they focus on that specifically, it might be doable.  I personally can't think of any desktop enthusiast who would want an Intel discrete GPU anyway.

EDIT:

Crap.  More reports coming in, and it's not looking good even for midrange; it's looking like Arc may be completely scrapped after 'Battlemage', if it even gets that far.  Now I'm just wondering: if someone were to theoretically purchase a 'Battlemage' card, would there be continuing driver updates?

 

Edited by ComradeYellow
Posted

What are the odds that Intel makes an announcement about Raptor Lake 1, 2, or 3 days before Zen 4 becomes available? 90%? 95%? 99.999%?


Posted
1 hour ago, Zoraptor said:

just shy of 350W power draw...

Is that at idle or load? I'm assuming load of some kind.

... yeah, between that and GPU power spikes, and TVs/monitors, and everything else, definitely 1200-1500W PSU time. Better overkill (again) than sorry.  :p

“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted
2 hours ago, Zoraptor said:

First 13900k review has dropped- breaking NDA, and in Chinese so large grain of salt.

+41% MT +12% ST (+7% gaming) and, uh, just shy of 350W power draw...

Build a rig with that and an RTX 4090 and you won't even need to run the heat during winter. Think of the savings! :shifty:


Posted (edited)
3 hours ago, Zoraptor said:

First 13900k review has dropped- breaking NDA, and in Chinese so large grain of salt.

+41% MT +12% ST (+7% gaming) and, uh, just shy of 350W power draw...

Lmao, into the trash it goes if so. They got a little better this generation compared to the complete jokes that were the last handful, but it seems like they're back to their old tricks...

Edited by Bartimaeus
Quote

How I have existed fills me with horror. For I have failed in everything - spelling, arithmetic, riding, tennis, golf; dancing, singing, acting; wife, mistress, whore, friend. Even cooking. And I do not excuse myself with the usual escape of 'not trying'. I tried with all my heart.

In my dreams, I am not crippled. In my dreams, I dance.

Posted
3 hours ago, LadyCrimson said:

Is that at idle or load? I'm assuming load of some kind.

That's at load on a Z-series board with power limits turned off. With the power limit on it's ~250W, so not much more than what AMD's 7000 series draws (though that's with their power limits turned off). Its rated 'TDP' is still 125W, though.

Posted (edited)

Raptor Lake is using a 10nm-class process, not a 7nm one, so this is expected.  Meteor Lake, which is being released next year, will use a 7nm-class process, which should get the power draw down.  That coupled with a 'Battlemage' GPU sounds like bliss, provided the Arc project doesn't get scrapped (I still believe that's a possibility).  Not to mention Intel is planning on moving to a TSMC process soon, which is in Taiwan, which is in danger of being invaded at any time.  Intel has a few potential hazards in their roadmap strategy.

XeSS is coming along nicely, BTW.

 

Edited by ComradeYellow
  • 2 weeks later...
Posted

Arc A770 priced at $329 USD, launching Oct 12. If it actually performed like a 3060 Ti, that would be at least a decent price. But it will probably perform a lot more like a 3060, and worse in DX11.

Also some Intel benchmarking for their upcoming processors. It's fairly clear the 5950X is not actually AMD's premier gaming chip, but I guess 'performs about the same as or worse than a 5800X3D in 5 out of 9 tests' is not the best marketing.

[image: Intel benchmark chart]

Plus mark, though, for starting the y-axis at 0 instead of 0.90 like Nvidia would have.

Posted (edited)

Didn't they also use only DDR4-3200 memory for AMD's parts?

Edited by Azdeus

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken

Posted

3200C14, which is reasonably good stuff, more or less equivalent to 3600C16. One notch above the more common 3200C16 or 3600C18 for about 20% more money, and probably pretty representative of what an upper-midrange gamer might buy without going nuts.
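That equivalence is easy to sanity-check, since first-word latency is just CAS cycles divided by the memory clock (half the DDR transfer rate). A quick sketch of the arithmetic (my own numbers, not from Intel's slide):

```python
# First-word latency in nanoseconds: CAS latency cycles divided by the
# memory clock in MHz (half the DDR data rate), scaled to nanoseconds.
def first_word_latency_ns(ddr_rate_mt_s: int, cas_latency: int) -> float:
    memory_clock_mhz = ddr_rate_mt_s / 2
    return cas_latency / memory_clock_mhz * 1000

for rate, cl in [(3200, 14), (3600, 16), (3200, 16), (3600, 18)]:
    print(f"DDR4-{rate} CL{cl}: {first_word_latency_ns(rate, cl):.2f} ns")
```

3200C14 works out to 8.75 ns against 3600C16's ~8.89 ns, while the cheaper 3200C16 and 3600C18 kits both land at 10.00 ns flat, which is why those pairs trade more or less evenly.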

L I E S T R O N G
L I V E W R O N G

Posted

He's getting so much mileage out of "thanks, Steve". :lol:



Posted

Some of those clips from the Intel presentation were...Konami-esque. You do not want to be like Konami.

Alternatively, you should ideally always be like Konami.


Posted

I was thinking that those Intel Arc GPUs would be good for someone like my husband, but then I read that they require something called Resizable BAR, which is unlikely to be on older mobos/CPUs etc. Not keeping track of such things a lot, I had to look it up ... at any rate, that sorta eliminates the idea of the Arc as a budget GPU to toss into a current older rig (vs. when building a new budget rig).

Interesting. Nothing wrong with it, but I guess it won't be on hubby's list for a few or several years, then.  😄   That said, I wish Intel/Arc at least enough future success to keep them going with it, because we definitely need another competitive GPU maker, especially in the budget-to-midrange.
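For anyone wanting to check an existing rig before buying: on Linux the capability shows up in `lspci` output (a sketch below, assuming pciutils is installed; on Windows it's typically a UEFI toggle, usually named something like "Re-Size BAR Support" alongside "Above 4G Decoding").

```shell
# Look for the Resizable BAR capability on any PCI device.
# Root gives fuller capability output; falls back to a message if absent.
sudo lspci -vvv 2>/dev/null | grep -i "Resizable BAR" \
    || echo "No Resizable BAR capability reported"
```

If the grep turns up nothing, either the platform predates the feature or it's simply disabled in firmware, so it's worth checking the UEFI settings before ruling a board out.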

Posted

Arcs are a laptop gamer's wet dream*, but only a fool would indulge in 1st gen, especially with lacklustre Raptor Lake alongside it.

'Battlemage' + 'Meteor Lake' will really rock the mid-range world, especially with render latency, frame-rate adjustments, and frame consistency, just you wait and see.

*Admittedly, I can't imagine desktop enthusiasts being excited about Intel's GPU offerings anytime soon; just stick with Nvidia or Radeon if raw power and 16K 100-inch screens are what you're after, eh.

Posted

Not too surprising from what everyone's been saying.  Shame it didn't hit the ground running better.


Why has elegance found so little following? Elegance has the disadvantage that hard work is needed to achieve it and a good education to appreciate it. - Edsger Wybe Dijkstra

Posted (edited)

A different take.

Not bad, but they really need to get their drivers sorted ASAP, especially for DX11, DX9, and 1080p.  Absolutely blows Radeon out of the water in ray tracing, lol.  Looking at the 'Control' benchmarks made me chuckle.

Edited by ComradeYellow
Posted (edited)

TL;DW: Intel really needs to let a competent company design cards. Better design and better drivers, and it would actually be one hell of a debut. Could argue that it was one, regardless of the state of the drivers.

Edit: For the love of sweet Baby Jesus, can we also please stop making components glow in all sorts of colors? Intel wasted time, effort and money on custom single layer RGB PCBs to make their cards glow blue. Seriously. What the hell? :p

Edited by majestic

No mind to think. No will to break. No voice to cry suffering.

  • 2 weeks later...
Posted

Intel with the brute force approach. Great gaming benchmarks via absurd power draw.


Posted

Am I correct in assuming that the power draw would likely be moderately less than 300W for actual gaming? While modern games do hit more than a single core these days, they're typically nowhere near full bore across all of them, making the mega power load and cooling mostly a concern for those who heavily use certain work programs and the like?

Posted (edited)
5 hours ago, LadyCrimson said:

Am I correct in the assumption that the power draw would likely be moderately less than 300w etc. for actual gaming?

That will obviously vary on a game-by-game basis, but for the most part, yes. Not only that, but I suspect that, much like with the 7950X (which has an only slightly less absurd power draw when going full bore), you could probably undervolt the 13900K and/or set a lower power limit and get 95% of the gaming performance at half the power draw, probably closer to 99% if you're running at 4K.
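For the power-limit half of that, on Linux the package limits are exposed through the RAPL powercap sysfs interface. A sketch below, assuming the common `intel-rapl:0` package-0 layout (the exact paths and constraint indices vary by platform, and undervolting itself usually goes through the BIOS or a vendor tool like XTU); the sysfs values are in microwatts.

```shell
# Inspect the CPU package power limit via the Linux RAPL powercap
# interface; writing a new limit requires root.
RAPL=/sys/class/powercap/intel-rapl:0
if [ -d "$RAPL" ]; then
    cat "$RAPL/name"                          # usually "package-0"
    cat "$RAPL/constraint_0_power_limit_uw"   # long-term (PL1) limit
    # Cap PL1 at 125 W (hypothetical value, 125 * 1,000,000 uW):
    # echo 125000000 | sudo tee "$RAPL/constraint_0_power_limit_uw"
else
    echo "No intel-rapl powercap interface found"
fi
```

Setting a limit this way is temporary (it resets on reboot), which makes it a low-risk way to experiment with the performance-per-watt curve before committing anything in firmware.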

Edited by Keyrock

