Sven_
Everything posted by Sven_
-
I'm gonna wait 'til RTX 5090 performance* can be had for 300 bucks. Which means roughly March 2041... progress, baby! * without frame generation!
-
Got my Ryzen 5 5600 (ordered for KCD II!). Been a stock cooler user since forever (haven't overclocked since Athlon Thunderbird days). It runs a good deal warmer at stock than the Ryzen 3 I had installed before though (peaks of 76-81 degrees Celsius during stress testing). Via PBO, I don't seem to have much room for undervolting, but a slight undervolt plus lowering the PPT from the standard 77W down to 67W makes barely a difference to performance, whilst still saving a few watts and degrees. Eco mode would limit the PPT even further (60W), but without the undervolt. Eco looks like a ~5% performance hit in CPU-Z's multicore benchmark, going down from ~4800 points to ~4550. With the slight undervolt plus 67W PPT instead, temperatures and draw are naturally still higher than in Eco, but the CPU runs more efficiently overall, only losing like ~20-50 points in the bench.
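For fun, the back-of-the-envelope efficiency math on those numbers (a rough sketch: the undervolt score is assumed from the "~20-50 points" loss above, and the PPT limit stands in for actual package draw, which is optimistic):

```python
# Rough points-per-watt math for the CPU-Z multicore numbers above.
# Assumptions: the PPT limit approximates real package draw during the bench,
# and the undervolted score is ~4760 (stock minus ~40 points).
configs = {
    "stock (77W PPT)":     (4800, 77),
    "undervolt + 67W PPT": (4760, 67),  # assumed score
    "eco mode (60W PPT)":  (4550, 60),
}

for name, (score, watts) in configs.items():
    print(f"{name}: {score} pts, {score / watts:.1f} pts/W")
```

Either limited config comes out clearly ahead of stock in points per watt, and eco's ~5% score drop buys the best efficiency of the three.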
-
Always do! The killers in KCD were the shadow and object detail settings. Keep those at High at most and there's minimal quality loss, but a lot more fps. Seeing that KCD2 has the same options (Worth A Buy has shown them in a prior video), I know where I'm gonna start out. A week and a half to go! By the way, second best rant on the internet. Every time I'm in a bad mood (and/or play a game that treats me like a toddler.....)
-
Considering that: i9 9900K = pretty much Ryzen 5 5600 / i5 12400 performance; RTX 2080 = pretty much RTX 4060 / RX 7600 performance. Looks pretty good. But then nobody's come back from Kuttenberg quite yet and lived to tell! Which is the place of a thousand NPCs...
-
It's only for the crazy popular games....... like Veilguard. They wanted to do one for BG3 apparently, but would have needed review keys way earlier... You can only offer guidance for games you actually got to finish, after all. GameStar Sonderhefte | Hefte | GameStar Shop Speaking of which, Warhorse seem pretty confident giving it all out early. A Czech tech site immediately put performance to the test. Not by firing up the nearby nuclear power plant to fuel a Ryzen X3D with two RTX 4090s in SLI mode. But by installing it on a quad-core Ryzen, 16GB, GTX 1070. Which is the same PC you may have played KCD I on in 2018. And it was a "shock" to them (the original still isn't the most optimized game; at least some areas can tank hard even on modern machines on ultra settings). Anyway, KCD II apparently ran at 30-50 fps, medium to high details, 1440p on that machine. Also: Steam Deck'd! Kingdom Come: Deliverance 2 Final Preview - Steam Deck HQ I think if KCD II doesn't bust its launch, it will become quite big. Currently laughing at all the duds getting triggered by the "gay stuff" that's supposedly in KCD II. Must have never played the original. Or done a format C: on their memory sticks. After all, clearly gay Istvan Toth slapped their sweet little butts whilst they were rope-tied in that one. Amongst a few other things.
-
Germany's GameStar is gonna publish an extra issue just for KCD II. Not even Swen Vincke managed to get one for BG3. DA HYPE IS ON. Wait, there's more. Not only will it contain the usual quest guides, a walkthrough, hints for making money quickly (HELL YES), maps and a poster. It's gonna ship with a Papercraft Kit for your very own Trosky Castle! The only caveat (marked RED): It's not gonna be one for living and taking shelter in.
-
Blockbuster gaming doesn't really seem to have a target audience -- which is part of the problem. It's "one size fits all", which is why I'm glad there's been a resurgence of more specialized games lately. Stalker 2, Kingdom Come, From Soft -- even Larian didn't bend over and suddenly do an entirely different game just because their budget ballooned. They're all doing fine as well. Better than most "one size fits all" products. Oftentimes though, these blockbusters literally aren't worth it (at least when approached with the rational parts of your brain...). You pay a higher price for the product itself in advance. Then you need better hardware to be able to run it at all. Then that hardware consumes a lot more power, as the latest tech is always the most taxing -- not that people living on cheap electricity in, say, the US would care about that specifically, but surely about the heat produced during summer. Either way: I'm still enjoying Indy for instance. Do I regret buying it? No! But do I have a superior experience with it compared to, say, Drova? Desperados 3? Aliens: Dark Descent? Nah. tl;dr: Then what are you always upgrading your machine for? Showing off the new gadgets to the missus? My house, my car, my GeForce RTX. PS: 22 days to Bohemia. It seems Berlin's going nuts already.
-
The moment you realize it's a battle you can't win is the moment you're gonna slow down. It just never stops. You buy shiny new thing; the morning after, shiny new thing is old thing. I had four (4) computers between 1987 and 1994: an Amstrad CPC, a Commodore 64, an Amiga 500 and a 386DX40 PC. Oh, and of course a Game Boy on top of that. As argued, that pace kinda lasted 'til the mid 2000s-ish... and once I HAD slowed down, I realized there's not much to miss really. If there ever was a game I couldn't play on release -- the game's not going away. Speaking of which, first benches for Kingdom Come II are out. Alongside the game running decently on Steam Deck, they paint the picture of a better optimized release. GameStar claims the game was buttery smooth on the RTX 3080 they tested, and pretty stable on Xbox as well. Of course, people sure haven't reached Rattay, er, Kuttenberg yet. Kingdom Come 2 benches, final release still pending: Kingdom Come: Deliverance II preview - PC performance graphics benchmarks of Graphics Cards and Processors | Action / FPS / TPS | TEST GPU As a comparison: Kingdom Come benches, end of 2018 (already patched some): Kingdom Come Deliverance - PC performance graphics benchmarks of Graphics Cards and Processors 2018 | RPG/Role playing | TEST GPU All ultra details. So RTX 3060 / 1060 should be fine on medium / low as officially claimed, accordingly. I played KCD I on a GTX 1050 Ti, which benched at 20-25 fps on ultra (45-60fps-ish on medium details). Maxed out, the original can still tank modern PCs (for increasingly minimal image quality gain). tl;dr: I AM READY.
-
You know, in other branches of electronics, devices becoming SMALLER is actually a sign of progress. Not vice versa... Hercules Graphics Card - Wikipedia Really not sure if "ever more horsepower" and brute-forcing against clear physical limitations is the solution here for all eternity, in general. Gotta be a reason why even cards with the power of 2016 entry-level GPUs can't be sold for anything less than $100 either. That's as if back in 2006 a 3dfx Voodoo1 was still on display for actual money -- 2006 was the year of the TES Horse Armor DLC, Neverwinter Nights 2 and the GeForce 7 series, just in case nobody remembers.
-
I used to be the same. But considering that GPU manufacturing seems to be hitting a physical wall, I couldn't care less at this point. I'm not interested in even my usual entry-level graphics card being as large as a battleship, costing as much as a whole PC used to, drawing like 200W+ and heating the room up in summer, just for playing a bloody video game. Personally I've never been about pixel perfection though, e.g. never obsessed about less than perfect textures or what Arkane's founder calls "making sure the eyes are perfect and the sun shines the right way." Unless it's something truly intrusive going on during gameplay, you eventually don't notice. Nor care. Your mileage may differ. Let's see how this actually turns out and develops. Any technology that may eventually help get out of this race of ever diminishing returns is a win in my book though. And if there's been a component market that's been SCREAMING diminishing returns, it's GPUs. Sure, GPUs have become infinitely more complex, with gazillions of transistors now running the show. Still funny how it's x86 CPUs that have been declared dead for decades, when a Ryzen can be bought for the exact same price tag as an Athlon 64 twenty years ago. In fact, contrary to popular perception, the 9800X3D is cheaper than the FX-55 ever was.
-
I dropped out of that in the early 2000s already, tbh. Before that, I changed GPUs and CPUs like Al Bundy never changed his pants. Since then, well, let's say that in the past twenty years I've spent less than 1,000 EUR on GPUs. I've been a mid-settings gamer since (and have stopped playing the majority of blockbuster games anyway). Indy is nice on high to max even on the RTX 3060 though -- with path tracing disabled, naturally. But hey, Nvidia have just promised the 5070 would be able to render as many frames as the 4090* (*with DLSS 4). So who knows what the 5060 is capable of.
-
I picked up the RTX 3060 as a replacement for my old 1050 Ti. So fingers crossed. Really odd that they paired a lower card with more VRAM than the 3080. But I think they only did this because they wanted to upgrade from the 6GB of the RTX 2060 prior -- and only had the option of adding another 6GB of VRAM due to memory bus / controller limitations. Btw, Indy runs fine with less than 12 gigs. It has the same "issue" as Wolfenstein (also id Tech): it has a texture pool setting. That's basically a cache, as opposed to a direct detail setting. In other words, just because you cannot max out the setting doesn't mean all textures are gonna look like turd. Naturally, you cannot max this cache out on low-VRAM cards. Germany's PC Games Hardware tested this with Wolfenstein Youngblood way back already -- if the setting was too high, the fps took a huge nosedive. But usually, all benchmarks are run with maxed-out settings (industry-wide)....
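To illustrate the "cache, not a detail setting" point: a texture pool behaves roughly like an LRU cache. As long as it covers the textures currently in view, everything is fine; only when it's smaller than that working set does it start thrashing (streaming the same textures in over and over). A toy sketch -- emphatically NOT id Tech's actual code, all names and sizes made up:

```python
from collections import OrderedDict

class TexturePool:
    """Toy LRU texture cache illustrating the 'pool' idea (not real engine code)."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self.cache = OrderedDict()   # texture id -> size in MB
        self.misses = 0              # each miss = a potentially stalling upload

    def request(self, tex_id, size_mb):
        if tex_id in self.cache:
            self.cache.move_to_end(tex_id)   # hit: texture already resident
            return
        self.misses += 1
        # Evict least recently used textures until the new one fits.
        while self.used_mb + size_mb > self.budget_mb and self.cache:
            _, evicted_mb = self.cache.popitem(last=False)
            self.used_mb -= evicted_mb
        self.cache[tex_id] = size_mb
        self.used_mb += size_mb

# Pool bigger than the working set: after warm-up, no misses at all.
big = TexturePool(budget_mb=4096)
for frame in range(100):
    for tex in range(32):          # 32 textures x 64MB = 2GB working set
        big.request(tex, 64)
print(big.misses)                  # -> 32 (just the cold start)

# Pool smaller than the working set: it thrashes on every single request.
small = TexturePool(budget_mb=1024)
for frame in range(100):
    for tex in range(32):
        small.request(tex, 64)
print(small.misses)                # -> 3200 (every request re-streams)
```

The takeaway: shrinking the pool doesn't degrade the textures themselves, it just raises the odds of streaming work -- and conversely, setting the pool higher than your VRAM can hold makes the "cache" itself spill, which is the fps nosedive PCGH measured.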
-
So, a month 'til Avowed. Still kinda unwowed. But still curious. I actually get the impression that there's not much hype behind the scenes either, neither at MS nor Obsidian. As if nobody actually believes this product may hit it big. Dunno, that's what this feels like. So little promotion, buzz and fanfare. Taking a look at the requirements and gameplay, the recommended RTX 3080 / 6800 XT seems pretty steep. For The Outer Worlds it was still but a GTX 1060 (and very playable on a 1050 Ti still). But then it's always kinda hard to guess from visuals these days how a game actually performs. I don't mean that only in terms of optimization or anything. Every however small step towards pixel perfection takes a good chunk of additional resources. I'm sure some of you have engaged in "Is raytracing worth the performance hit?" discussions before, as an example of that. Fingers crossed that Warhorse didn't lie about an RTX 3060 still being fine for Kingdom Come in FHD (even 60fps, medium details, without any upscaling). Kinda crazy it's been over two years already since Obsidian's last proper release: Pentiment. Really interesting game, that was.
-
Well, that's what they're specifically advertising. AM5 (launched in 2022) is intended to see new CPUs until 2027. AM4 launched in 2016 and still saw new CPUs until 2024... it's quite something going into my mobo's compatibility list and scrolling through all those chips. Not sure if AM5 is going to be the same. But they obviously did this because they positioned AM4 as the entry-level Ryzen platform when AM5 launched -- AM5 still being a tad more expensive to this day. There hasn't been a socket with this much longevity since (Super) Socket 7. Of course, if a platform lasts that long, there are going to be standards it won't support down the line. My board, for instance, is still PCIe 3.0. Also, earlier boards initially didn't support X3D.
-
Mind you, I'm not overly impacted, as I'm not playing all that many AAA games. I'm still far more outraged over GPU prices these days. Just ten years ago, the thought of ever spending 300 Euro on but a GPU never crossed my mind. Nowadays, that's the entry-level price of admission. Fun fact: Even the 60 bucks HD 6670 could play you some Skyrim (in reduced / medium details) and Dishonored. On the plus side: as there hasn't been a decent sub-200€ card in years, the used market even works for completely outdated GPUs almost a decade old -- and generally, GPUs don't lose half their worth in just a year or two anymore. E.g. you still get some money back eventually.
-
Indiana Jones just got an update. Not sure if I had this before.... but does anybody else get a red CPU load shown in the game's performance analysis just from enabling Vsync? I noticed that this would tank my fps in some areas at the start of the game. In general, there seems to be a ~20% performance hit with Vsync, be it with DLSS/Quality, TAA... Vsync off: CPU load all green. Vsync on: the bar goes up immediately. This happens regardless of whether I activate Vsync in-game or via the Nvidia driver settings.
-
Wanted to edit but accidentally quoted myself and double posted.
-
Well, that's general market perception at this point. E.g. the go-to option for Average Joe is Nvidia. I think RDNA 2 (6000 series) would have done better if it hadn't been for the Covid / mining crisis... AMD Vows to Resolve GPU Shortage: We Are Ramping Production | Tom's Hardware Though I think it's odd they paired the RX 6600 directly against the RTX 3060/12GB. It could have done good business where it counts for market penetration... which is the entry level. And the rest was no slouch either. The current RX 7600 is just all-around meh. As is the 4060, mind. Curious what AMD is going to announce next month. Pretty sure that Nvidia is going to continue as usual here, which is trying to upsell the heck out of you. "You want a card with more than 8GB? In 2025? With our new features eating up VRAM too? Well, we've got these bigger models here for ya..."
-
Not true though, is it? ATi at one point was even in the lead, and even after AMD bought them, there were times when they had a decent share. In the past 15 years, I had two AMD cards myself (HD 6670 and HD 6850), plus two Nvidia (GTX 1050 Ti plus RTX 3060). The HD 6xxx series was really good value (the 6850/6870 offering some of the best price-to-performance ratios on the market -- both could game anything on high for 150-200 bucks). I think this is a long-term thing though. AMD have been trailing behind for so long (also technologically) that by now they're considered 2nd tier. When was the last time AMD introduced something themselves? Everything Nvidia does, AMD follows (upscaling, RT, frame generation...). Compare that to Ryzen, bringing affordable 6-core/12-thread CPUs to the masses at a time when Intel were still releasing quad cores. Yeah, Ryzen is where all the money went, but still. By now, Nvidia is the "go-to" option. And ironically, the generally high prices even for entry-level cards may actually benefit them. If you spend a couple hundred bucks anyway, you're more likely to go with what's "safe", e.g. the market leader. GPUs are often kept for a few years now, too. That naturally applies to Intel as well. And even they can't just massively undercut current market prices. Manufacturing has become more expensive as well. Arc B580 is something. But compared to what, say, the Kyro II did WAY back then, a budget card sometimes performing on the level of cards that cost twice as much (the Kyro II did have its weaknesses though)... The Card - STMicroelectronics Kyro II 64MB
-
Everybody be like: "Intel is throwing the budget GPU market a lifeline." Then I see this and want to hang myself. PS: Even the 40 Euro HD 3650, whilst struggling with most games, still produced over 70fps average at least in Unreal Tournament 3 (a 2007 game) at 1280x1024 -- a resolution still quite common in 2008. Even if you doubled those prices..... Srsly, wot happened?
-
My GPU is up and running, undervolt in place. So I've started Indy. The beginning is pure fan service of course, CHILLS. Seems a bit QTE-ish at the climax, but the more open hub levels are yet to come. That aside, the game runs fine totally maxed out on a Ryzen 3 (2nd gen, with 8 threads) and an RTX 3060/12GB. At least in Full HD. And obviously, without full path tracing. It's probably good that not every Steam Deck, smartphone and potato PC is capable quite yet. Considering the additional massive compute (and power) demands, WE'RE GONNA NEED A BIGGER NUCLEAR POWER PLANT.
-
AMD actually aren't that much better, funnily enough. At least over here, the 7600 XT is currently priced too high for what it is (basically a slightly improved RX 7600, just with more VRAM... an alternative RX 7600/12GB would have been a way better option). And the only reason that older cards fall into that price range is because they're being sold off cheap... the 6750 XT, if still available, goes for like half the price it started at. Which was well over 600 Euro -- just a good two years ago! Computerbase.de ran an analysis on Nvidia a while ago: between the GTX 1060 and RTX 2060 there was still a big jump in performance. After that, it was but ~20% on average between generations. What's more, the RTX 3060 was, until Arc, the first and only entry-level card with 12GB -- oddly enough, given Nvidia's reputation for cheaping out on VRAM. AMD didn't make any big jumps between the 5600/XT, 6600 and 7600 anymore either. Meanwhile, performance levels at the higher end are through the roof. It's truly a GPU class war now -- and Intel have arrived to throw the beggars some coin.
-
One of the biggest entry level hurdles when getting into the Witcher 3 -- even if you played prior games. It's only in the books that she was much of a character until that point. And then the entire base game narrative depends on chasing her. Every time you think you're close, well you aren't. If they hadn't introduced the perspective changes (you playing Ciri briefly), that would have been an even tougher sell. It's a bit like Super Mario Bros: "Sorry Geralt, but your princess is in another castle." Except probably even Mario had more characterization at the start to get you into the "mood" of actually chasing that pretty princess -- like two lines in the manual dedicated to her Peachness. And Mario also didn't need to chase her for dozens of hours. He could even use warp zones to make the process even shorter. Mario had it good. We had it good. If only we knew back then.