
Everything posted by Sven_
-
You know, in other branches of electronics, devices becoming SMALLER is actually a sign of progress. Not vice versa... Hercules Graphics Card - Wikipedia Really not sure if "ever more horsepower" and brute-forcing against clear physical limitations is the solution here for all eternity, in general. Gotta be a reason why even cards with the power of 2016 entry-level GPUs can't be sold for anything less than $100 either. That's as if back in 2006 a 3dfx Voodoo1 was still on display for actual money -- 2006 was the year of the TES Horse Armor DLC, Neverwinter Nights 2 and the GeForce 7 series, just in case nobody remembers.
-
I used to be the same. But considering that GPU manufacturing seems to be hitting a physical wall, I couldn't care less at this point. I'm not interested in even my usual entry-level graphics card being as large as a battleship, costing as much as a whole PC used to cost, drawing 200W+ and heating up the room in summer, just for playing a bloody video game. Personally I've never been about pixel perfection though, e.g. never obsessed over less-than-perfect textures or what Arkane's founder calls "making sure the eyes are perfect and the sun shines the right way." Unless something truly intrusive is going on during gameplay, you eventually don't notice. Nor care. Your mileage may differ. Let's see how this actually turns out and develops. Any technology that may eventually help get out of this race of ever-diminishing returns is a win in my book though. And if there's been a component market that's been SCREAMING diminishing returns, it's GPUs. Sure, GPUs have become infinitely more complex, with gazillions of transistors now running the show. Still funny how it's x86 CPUs that have been declared dead for decades, when a Ryzen can be bought for the exact same price tag as an Athlon 64 twenty years ago. In fact, despite popular demand, the 9800X3D is cheaper than the FX-55 ever was.
-
I dropped out of that in the early 2000s already, tbh. Before that, I changed GPUs and CPUs like Al Bundy never changed his pants. Since then, well, let's say that in the past twenty years I've spent less than 1,000 EUR on GPUs in total. I've been a mid-settings gamer since (and have stopped playing the majority of blockbuster games anyway). Indy is nice on high to max even on the RTX 3060 though -- with path tracing disabled, naturally. But hey, Nvidia have just promised the 5070 would be able to render as many frames as the 4090* (*with DLSS 4). So who knows what the 5060 is capable of.
-
I picked up the RTX 3060 as a replacement for my old 1050 Ti. So fingers crossed. Really odd that they gave a lower-tier card more VRAM than the 3080. But I think they only did this because they wanted an upgrade over the 6GB of the prior RTX 2060 -- and only had the option of adding another 6GB of VRAM due to memory bus / controller limitations. Btw, Indy runs fine with fewer than 12 gigs. It has the same "issue" as Wolfenstein (also id Tech): it has a texture pool setting. That's basically a cache, as opposed to a direct detail setting. In other words, just because you cannot max out the setting doesn't mean all textures are gonna look like turd. Naturally, you cannot max this cache out on low-VRAM cards. Germany's PC Games Hardware tested this with Wolfenstein Youngblood way back already -- if the setting was set too high for the card, the fps took a huge nosedive. But usually, all benchmarks are run with maxed-out settings (industry-wide)...
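To illustrate what I mean by "cache, not detail setting" -- a rough sketch in Python, purely my mental model and NOT id Tech's actual streaming code (class and names made up for illustration):

from collections import OrderedDict

class TexturePool:
    # Toy model: the "texture pool" slider sets a VRAM budget for a cache.
    # Textures get streamed in and evicted as needed; a smaller pool mostly
    # costs extra streaming work, not texture detail.
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb            # the in-game texture pool setting
        self.used_mb = 0
        self.cache = OrderedDict()            # texture id -> size in MB, LRU order

    def request(self, tex_id, size_mb):
        if tex_id in self.cache:              # hit: texture already resident
            self.cache.move_to_end(tex_id)
            return "hit"
        while self.used_mb + size_mb > self.budget_mb and self.cache:
            _, freed = self.cache.popitem(last=False)   # evict least recently used
            self.used_mb -= freed
        self.cache[tex_id] = size_mb          # miss: "stream in" from disk
        self.used_mb += size_mb
        return "miss"

The key point being: set that budget higher than your card's actual free VRAM and the driver presumably has to start paging over PCIe instead -- which would explain the fps nosedive PCGH measured, while a sensibly sized pool just trades a bit of streaming for the same-looking textures.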
-
So, a month 'til Avowed. Still kinda unwowed. But still curious. I actually get the impression that there's not much hype behind the scenes either, neither at MS nor Obsidian. Like a product that nobody actually believes in may hit it big. Dunno, that's what this feels like. So little promotion, buzz and fanfare. Taking a look at the requirements and gameplay, the recommended RTX 3080 / 6800 XT seems pretty steep. For The Outer Worlds it was still but a GTX 1060 (and very playable on a 1050 Ti still). But then it's always kinda hard to guess from the visuals these days how a game actually performs. I don't mean that only in terms of optimization or anything. Every however-small step towards pixel perfection takes a good chunk of additional resources. I'm sure some of you have engaged in "Is ray tracing worth the performance hit?" discussions before, as an example of that. Fingers crossed that Warhorse didn't lie about an RTX 3060 still being fine for Kingdom Come in FHD (even 60fps, medium details, without any upscaling). Kinda crazy it's been over two years already since Obsidian's last proper release: Pentiment. Really interesting game, that was.
-
Well, that's what they're specifically advertising. AM5 (launched in 2022) is intended to see new CPUs until 2027. AM4 launched in 2016 and still saw new CPUs until 2024... it's quite something going into my mobo's compatibility list and scrolling through all those chips. Not sure if AM5 is going to be the same. But they obviously did this because they saw AM4 as the entry-level platform for Ryzen once AM5 launched -- AM5 still being a tad more expensive to this day. There hasn't been a socket with this much longevity since (Super) Socket 7. Of course, if a platform lasts that long, there are going to be standards it won't support down the line. My board, for instance, is still PCIe 3.0. Also, earlier boards initially didn't support X3D.
-
Mind you, I'm not overly impacted, as I'm not playing all that many AAA games. I'm still far more outraged over GPU prices these days. Just ten years ago, the thought of ever spending 300 Euro on but a GPU never crossed my mind. Nowadays, that's the entry-level price of admission. *Fun fact: even the 60-bucks HD 6670 could play you some Skyrim (in reduced / medium details) and Dishonored. On the plus side: as there hasn't been a decent sub-200€ card in years, the used market even works for completely outdated GPUs almost a decade old -- and generally, GPUs don't lose half their worth in just a year or two anymore. E.g. you still get some money back eventually.
-
Indiana Jones just got an update. Not sure if I had this before... but does anybody else get a red CPU load shown in the game's performance analysis just from enabling Vsync? I noticed that this would tank my fps in some areas at the start of the game. Generally, there seems to be a ~20% performance hit with Vsync, be it with DLSS/Quality, TLAA... Vsync off: CPU load all green. Vsync on: the bar immediately goes up. This happens regardless of whether I activate Vsync in-game or via the Nvidia driver settings.
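For illustration of where a hit like that could come from (just my assumption about the mechanism, not a diagnosis of what the Indy update does): with plain double-buffered Vsync, a frame that misses the refresh window blocks until the next vertical blank, so frame times get rounded up to whole multiples of the refresh interval. A toy calculation in Python:

REFRESH_MS = 1000 / 60                   # 60 Hz display

def avg_fps(frame_times_ms, vsync):
    total = 0.0
    for t in frame_times_ms:
        if vsync:
            # present blocks until the next vblank: round up to a full interval
            t = -(-t // REFRESH_MS) * REFRESH_MS
        total += t
    return len(frame_times_ms) / total * 1000

frames = [18.0] * 100                    # 18 ms of raw work per frame (~55 fps)
print(avg_fps(frames, vsync=False))      # ~55.6 fps
print(avg_fps(frames, vsync=True))       # 30.0 fps: each frame waits for a 2nd vblank

That worst case is harsher than 20%, but it shows why the CPU can sit "busy" waiting on the swap chain -- which might be what the red CPU bar is actually measuring.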
-
Wanted to edit but accidentally quoted myself and double posted.
-
Well, that's general market perception by now. E.g. the go-to option for Average Joe is Nvidia at this point. I think RDNA 2 (the 6000 series) would have done better if it hadn't been for the Covid / mining crisis... AMD Vows to Resolve GPU Shortage: We Are Ramping Production | Tom's Hardware Though I think it's odd they positioned the RX 6600 directly against the RTX 3060/12GB; it could have done good business where it counts for market penetration... which is the entry level. And the rest was no slouch either. The current RX 7600 is just all-around meh. As is the 4060, mind. Curious what AMD is going to announce next month. Pretty sure that Nvidia is going to continue business as usual here, which is trying to upsell the heck out of you. "You want a card with more than 8GB? In 2025? With our new features eating up VRAM too? Well, we've got these bigger models here for ya..."
-
Not true though, is it? ATi at one point was even in the lead, and even after AMD bought them, there were times when they had a decent share. In the past 15 years, I've had two AMD cards myself (HD 6670 and HD 6850), plus two Nvidia ones (GTX 1050 Ti plus RTX 3060). The HD 6xxx series was really good value (the 6850/6870 offering some of the best price-to-performance ratios on the market -- both could game anything on high for 150-200 bucks). I think this is a long-term thing though. AMD have been trailing behind for so long (also technologically) that by now they're considered second tier. When was the last time AMD introduced something themselves? Everything Nvidia does, AMD follows (upscaling, RT, frame generation...). Compare that to Ryzen, bringing affordable 6-core/12-thread CPUs to the masses at a time when Intel were still releasing quad cores. Yeah, Ryzen is where all the money went, but still. By now, Nvidia is the "go-to" option. And ironically, the generally high prices even for entry-level cards may actually benefit them. If you spend a couple hundred bucks anyway, you're more likely to go with what's "safe", e.g. the market leader. GPUs are often kept for a few years now too. That naturally applies to Intel as well. And even they can't just massively undercut current market prices. Manufacturing has become more expensive as well. Arc B580 is something. But compare that to what, say, the Kyro II did WAY back then, a budget card sometimes performing on the level of cards that cost twice as much (the Kyro II did have its weaknesses though)... The Card - STMicroelectronics Kyro II 64MB
-
Everybody be like: "Intel is throwing the budget GPU market a lifeline." Then I see this and want to hang myself. PS: Even the 40-Euro HD 3650, whilst struggling with most games, still produced over 70fps on average at least in Unreal Tournament 3 (a 2007 game) at 1280x1024, a resolution still quite common in 2008. Even if you doubled those prices... Srsly, wot happened?
-
My GPU is up and running, the undervolt is in place. So I've started Indy. The beginning is pure fan service of course, CHILLS. Seems a bit QTE-ish at the climax, but the more open hub levels are yet to come. That aside, the game runs fine totally maxed out on a Ryzen 3 (2nd gen with 8 threads) and an RTX 3060/12GB. At least in Full HD. And obviously, without full path tracing. It's probably good that not every Steam Deck, smartphone and potato PC is capable quite yet. Considering the additional massive computing (and power) demands, WE'RE GONNA NEED A BIGGER NUCLEAR POWER PLANT.
-
AMD actually aren't that much better, funnily enough. At least over here, the 7600 XT is currently priced too high for what it is (basically a slightly improved RX 7600, just with more VRAM... an alternative RX 7600/12GB would have been a way better option). And the only reason older cards fall in that price range is because they're being sold off cheap... the 6750 XT, if still available, goes for about half the price it launched at. Which was well over 600 Euro -- just a good two years ago! Computerbase.de ran an analysis on Nvidia a while ago: between the GTX 1060 and RTX 2060 there was still a big jump in performance. After that, it was but ~20% on average between generations. What's more, until Arc, the RTX 3060 was the first and only entry-level card with 12GB -- oddly enough, given Nvidia's reputation for cheaping out on VRAM. AMD didn't make any big jumps between the 5600/XT, 6600 and 7600 anymore either. Meanwhile, performance levels at the higher end are through the roof. It's truly a GPU class war now -- and Intel have arrived to throw the beggars some coin.
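Just to put that ~20% per generation into numbers (my own back-of-the-envelope math, using Computerbase's rough average):

per_gen = 1.20                       # ~20% uplift per generation after the 2060
print(per_gen ** 2)                  # two generations compound to only ~1.44x
print(per_gen ** 3)                  # three generations: ~1.73x

So even three of those "meh" generations stacked together only add up to roughly what a single proper old-school jump used to deliver.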
-
One of the biggest entry-level hurdles when getting into The Witcher 3 -- even if you played the prior games. Up to that point, she was only ever much of a character in the books. And then the entire base-game narrative depends on chasing her. Every time you think you're close, well, you aren't. If they hadn't introduced the perspective changes (you playing Ciri briefly), that would have been an even tougher sell. It's a bit like Super Mario Bros.: "Sorry Geralt, but your princess is in another castle." Except probably even Mario had more characterization at the start to get you into the "mood" of actually chasing that pretty princess -- like two lines in the manual dedicated to her Peachness. And Mario also didn't need to chase her for dozens of hours. He could even use warp zones to make the process even shorter. Mario had it good. We had it good. If only we knew back then.
-
Reminds me... The reason I bought Elden Ring back then wasn't primarily the difficult combat either. It was finally being able to experience a triple-A open-world game again that DOES NOT FEAR I'D GET LOST IN MY OWN BATHROOM WITHOUT A TOUR GUIDE. Wait, that smiley is wrong. This bugs me way too much. It's great that games are made for everyone. That does not mean that every game has to offer something for everybody, though. And if your experience is about open exploration, YOU BETTER LET PEOPLE ACTUALLY EXPLORE. Else it will be a worse experience for everyone. Daily rant over.
-
STRONK WIMMIN DETECTED. I'm also worried about Indy as apparently there is a female going to address me without PRIOR ADMITTANCE!!!!1
-
As said, biggest cancer in gaming: Nintendo trusting its audience more than even M-rated games by other studios out there (not saying that Indy is M-rated). Clearly there is something very off in what happens during playtesting already -- or at least its consequences have become far too severe. I remember Josh Sawyer posting horror stories about that here too... even from back then. Like people not even buffing before fights in Icewind Dale, then complaining the fights were too hard. I mean, I was a total D&D newb back then. But no buffs, really???? The problem of course isn't the feedback, but what's then done with it. I'm going into Indy with the "blockbuster" mindset -- to be fair, Indy has always been popcorn, even at the cinema. Just the very best of it. It's ironic that back as a kid I used to play games that seemed to treat me like an adult -- whereas as an adult, I face many games that treat me like a kid! But yeah, in today's climate... I think this interview says a lot about it. “It was a long process to figure out exactly what puzzles should and shouldn’t be in the game,” Andersson continues. “For a big adventure game like this, there often aren’t many good references for how to integrate interesting and challenging puzzles. We wanted a bigger challenge than you normally see in big action-adventure games out there, something where you actually have to think, while still being accessible to a ton of people and making sure that the player never gets stuck.” Mechanisms Become Mechanics: Inventing Puzzles for Indiana Jones and the Great Circle - Xbox Wire "Bigger challenge than what you normally see" has become rather relative.
-
Oh yeah, lots of comparisons with Dishonored (and even Thief...) already out there in terms of maps / hub areas. Completely triggers me #fanboy. Do you guys still get retail releases of games in shops (PC)? Seems weird otherwise, as I still have all the Arkane games boxed. Have Bethesda stopped doing that? The RTX 3060/12GB is naturally already on the lower end. Plus, in spring the market will change for sure (Battlemage hitting next week with 12GB, new AMD + Nvidia GPUs). But I looked at what games are actually still coming out in 2025, and I figured I'd go with it. I'm fine with 1080p / medium details. Outside of Avowed, Stalker 2, Kingdom Come II and Indy, nothing in blockbuster gaming interests me much for now. Maybe Arkane's Blade. If Fallen Aces were already finished (there's a beefy first episode), that'd probably be my GOTY; indie and AA are eating good. I can still sell the card and upgrade to something better eventually. Btw, the Arc B580 is said to perform a cut above the RTX 4060, which, combined with 12GB and the $249 MSRP, may make it actually a decent pick. In Germany the first models are listed at 320 Euro and up though, and at that price point it would compete with the RX 7600 XT (16GB) and RX 6750 XT/12GB (what's left of it in shops), so it may not be a complete no-brainer to jump on Arc on a budget all of a sudden. But reviews are out today, we'll see.
-
Nvidia currently bundles the game with a GPU, only from the RTX 4070 upwards though. The RTX 4070s are, of course, "next". All of them have 12GB VRAM. On the plus side, if you ever need to swap, Nvidia cards seem mostly pretty stable in price, which influences the used market and the money you can sell for. The RTX 3050 and 3060 are still selling NEW very close to their MSRP. The RTX 3070, if still available, seems to sit above that, actually. Used RTX 3070s seem to be going for ~300 Euro on eBay on average (the MSRP way back in 2020, four years ago, was 499 Euro). Aside from Indy, that's another reason why I casually think about buying an RTX 4070. Fully aware that it's a scheme of Nvidia trying to upsell me, as below that there's nothing decent available with more than 8GB VRAM. An RTX 4060 Ti/16GB is far too expensive for what it is. An RTX 4060/12 they didn't even release, unlike back with the RTX 3060. AMD's RX 6750 XT/12GB is selling for half of what it was two years ago (from over 600 quid down to just over 300), but I don't like its size and power draw. The RX 7600 XT is also too expensive for what it is. edit: Just ordered the cheapest available RTX 3060/12GB. I'm on Full HD -- and Warhorse better not be lying about this. (Official requirements claim the card is still good for FHD / 60fps / medium, which is roughly how I've played KCD I.) Kingdom Come Deliverance 2 Runs 'Buttery Smooth' on RTX 3050, Won't Have Denuvo on PC
-
Yeah, it's in Euro, sorry! Prices have actually gone up a bit, just in time for Christmas. I was looking them up on German deal / price-comparison sites (the most common ones are probably Geizhals.de (that's literally "cheapskate") and Idealo.de). Oh, forgot to say: the Indy Nvidia deal is only for RTX 4070 cards and up! The RTX 3060 costs ~280 Euro, the game 70, so that's 350 TOTAL. The 540 for the RTX 4070 would be the regular price for the card, with the game bundled in. @Humanoid I was actually under the impression that the term "bucks" was used everywhere, e.g. just common English-language slang (British, American, whathaveyou). Can't tell, naturally, as I'm not a native speaker. Picked it up someplace and rolled with it. May actually have been here, dunno! And if it wasn't, it must have been from Al Bundy (Married with Children legend and TV's last ever American action hero).
-
RTX 3060/12GB + Indy = 350 bucks
RTX 4070/12GB + Indy = 540 bucks (Indy included for free in a special Nvidia offer)...
Then again, the last GPU I spent more than 400 bucks on (Deutsche Mark back then) was my first ever 3D accelerator card, a MiroHiscore 3dfx Voodoo1 with a whopping 6 megs of VRAM. That thing cut through Baldur's Gate as nicely as Sarevok cut through Gorion (not that BG shipped with any 3D features to show).