Everything posted by Sven_

  1. Well, that's what they're specifically advertising. AM5 (launched in 2022) is intended to see new CPUs until 2027. AM4 launched in 2016 and still saw new CPUs until 2024... it's quite something going into my mobo's compatibility list and scrolling through all those chips. Not sure if AM5 is going to be the same. But they obviously did this because they positioned AM4 as the entry-level Ryzen platform when AM5 launched -- AM5 still being a tad more expensive to this day. There hasn't been a socket with this much longevity since (Super) Socket 7. Of course, if a platform lasts that long, there are going to be standards it won't support down the line. My board, for instance, is still PCIe 3.0. Also, earlier boards initially didn't support X3D chips.
  2. Mind you, I'm not overly impacted, as I'm not playing all that many AAA games. I'm still far more outraged over GPU prices these days. Just ten years ago, the thought of ever spending 300 Euro on a GPU alone never crossed my mind. Nowadays, that's the entry-level price of admission. *Fun fact: Even the 60-bucks HD 6670 could play you some Skyrim (in reduced/medium details) and Dishonored. On the plus side: as there hasn't been a decent sub-200€ card in years, the used market even works for completely outdated GPUs almost a decade old -- and generally, GPUs don't lose half their worth in just a year or two. I.e., you still get some money back eventually.
  3. Well, I fixed my issue. I was dumb. I didn't have VRR (Variable Refresh Rate) enabled in Windows in all those years, lol. Also, I'm gonna be one of those who apparently spend ten hours in the Vatican alone.
  4. Indiana Jones just got an update. Not sure if I had this before... but does anybody else get a red CPU load shown in the game's performance analysis just from enabling Vsync? I noticed that this would tank my fps in some areas at the start of the game. Generally, there seems to be a ~20% performance hit with Vsync, be it with DLSS/Quality or TAA... Vsync off: CPU load all green. Vsync on: the bar immediately goes up. This happens regardless of whether I activate Vsync in-game or via the Nvidia driver settings. (A small frame-timing sketch after post 25 illustrates one possible mechanism.)
  5. Wanted to edit but accidentally quoted myself and double posted.
  6. Well, that's general market perception at this point, e.g. the go-to option for Average Joe is Nvidia. I think RDNA 2 (6000 series) would have done better if it hadn't been for the Covid / mining crisis... AMD Vows to Resolve GPU Shortage: We Are Ramping Production | Tom's Hardware Though I think it's odd they paired the RX 6600 directly against the RTX 3060/12GB; it could have done good business where it counts as to market penetration, which is the entry level. And the rest was no slouch either. The current RX 7600 is just all-around meh. As is the 4060, mind. Curious what AMD is going to announce next month. Pretty sure that Nvidia is going to continue as usual here, which is trying to upsell the heck out of you. "You want a card with more than 8GB? In 2025? With our new features eating up VRAM too? Well, we've got these bigger models here for ya..."
  7. Not true though, is it? ATI at one point was even in the lead, and even after AMD bought them, there were times when they had a decent share. In the past 15 years, I had two AMD cards myself (HD 6670 and HD 6850), plus two Nvidia ones (GTX 1050 Ti plus RTX 3060). The HD 6xxx series was really good value (the 6850/6870 offered some of the best price-to-performance ratios on the market -- both could game anything on high for 150-200 bucks). I think this is a long-term thing though. AMD have been trailing behind for so long (also technologically) that by now they're considered second tier. When was the last time AMD introduced something themselves? Everything Nvidia does, AMD follows (upscaling, RT, Frame Generation...). Compare that to Ryzen, bringing affordable 6-core/12-thread CPUs to the masses at a time when Intel were still releasing quad cores. Yeah, Ryzen is where all the money went, but still. By now, Nvidia is the "go-to" option. And ironically, the generally high prices even for entry-level cards may actually benefit them: if you spend a couple hundred bucks anyway, you're more likely to go with what's "safe", i.e. the market leader. GPUs are often kept for a few years now, too. That naturally applies to Intel as well. And even they can't just massively undercut current market prices; manufacturing has become more expensive as well. Arc B580 is something. But compare that to what, say, the Kyro II did WAY back then, sometimes performing on the level of cards that cost twice as much (the Kyro II did have its weaknesses though)... The Card - STMicroelectronics Kyro II 64MB
  8. Everybody be like: "Intel is throwing the budget GPU market a lifeline." Then I see this and want to hang myself. PS: Even the 40-Euro HD 3650, whilst struggling with most games, still produced over 70fps on average at least in Unreal Tournament 3 at 1280x1024 (a resolution still quite common in 2008, for a 2007 game). Even if you doubled those prices... Srsly, wot happened?
  9. My GPU is up and running, and the undervolt is in place (a small monitoring sketch for sanity-checking it under load sits after post 25). So I've started Indy. The beginning is pure fan service of course, CHILLS. Seems a bit QTE-ish at the climax, but the more open hub levels are yet to come. That aside, the game runs fine totally maxed out on a Ryzen 3 (2nd gen, with 8 threads) and an RTX 3060/12GB. At least in Full HD. And obviously, without full path tracing. It's probably good that not every Steam Deck, smartphone and potato PC is capable quite yet. Considering the additional massive computing (and power) demands, WE'RE GONNA NEED A BIGGER NUCLEAR POWER PLANT.
  10. AMD actually aren't that much better, funnily enough. At least over here, the 7600 XT is currently priced too high for what it is (basically a slightly improved RX 7600, just with more VRAM... an alternative RX 7600/12GB would have been a much better option). And the only reason that older cards fall into that price range is because they're being sold off cheap... the 6750 XT, if still available, goes for about half its launch price -- which was well over 600 Euro, just a good two years ago! Computerbase.de ran an analysis on Nvidia a while ago: between the GTX 1060 and RTX 2060 there was still a big jump in performance. After that, it was but ~20% on average between generations (a quick compounding sketch sits after post 25). What's more, the RTX 3060 was, until Arc, the first and only entry-level card with 12GB -- oddly enough, given Nvidia's reputation for cheaping out on VRAM. AMD didn't make any big jumps between the 5600 XT, 6600 and 7600 anymore either. Meanwhile, performance levels at the higher end are through the roof. It's truly a GPU class war now -- and Intel have arrived to throw the beggars some coin.
  11. One of the biggest entry-level hurdles when getting into The Witcher 3 -- even if you played the prior games. It's only in the books that she was much of a character until that point. And then the entire base-game narrative depends on chasing her. Every time you think you're close, well, you aren't. If they hadn't introduced the perspective changes (you playing Ciri briefly), it would have been an even tougher sell. It's a bit like Super Mario Bros: "Sorry Geralt, but your princess is in another castle." Except probably even Mario had more characterization at the start to get you into the "mood" of actually chasing that pretty princess -- like two lines in the manual dedicated to her Peachness. And Mario also didn't need to chase her for dozens of hours. He could even use warp zones to make the process even shorter. Mario had it good. We had it good. If only we'd known back then.
  12. Would be funny if it weren't so accurate. First thing in The Witcher 3 you're gonna see be like: "Push X to open door." Well, that and some female breasts laid bare. Truly a mature game.
  13. Reminds me... The reason I bought Elden Ring back then wasn't primarily the difficult combat either, but finally being able to experience a triple-A open-world game again that DOES NOT FEAR I'D GET LOST IN MY OWN BATHROOM WITHOUT A TOUR GUIDE. Wait, that smiley is wrong. This bugs me way too much. It's great that games are made for everyone. That does not mean that every game has to offer something for everybody, though. And if your experience is about open exploration, YOU BETTER LET PEOPLE ACTUALLY EXPLORE. Else it will be a worse experience for everyone. Daily rant over.
  14. @LadyCrimson Not sure how legit it is, but I'm gonna try with Indy -- my GPU should arrive today or tomorrow. https://steamcommunity.com/app/2677660/discussions/0/598512725180576448/
  15. STRONK WIMMIN DETECTED. I'm also worried about Indy as apparently there is a female going to address me without PRIOR ADMITTANCE!!!!1
  16. As said, the biggest cancer in gaming: Nintendo trusting its audience more than even M-rated games by other studios out there do (not saying that Indy is M-rated). Clearly, there is something very off in what happens during playtesting already -- or at least its consequences have become far too severe. I remember Josh Sawyer posting horror stories about that here too... even from back then. Like people not even buffing before fights in Icewind Dale, then complaining fights were too hard. I mean, I was a total D&D newb back then. But no buffs, really???? The problem of course isn't the feedback, but what's then done with it. I'm going into Indy with the "blockbuster" mindset -- to be fair, Indy has always been popcorn, even at the cinema. Just the very best of it. It's ironic that back as a kid I used to play games that seemed to treat me like an adult -- whereas as an adult, I face many games that treat me like a kid! But yeah, in today's climate... I think this interview says a lot about it: "It was a long process to figure out exactly what puzzles should and shouldn't be in the game," Andersson continues. "For a big adventure game like this, there often aren't many good references for how to integrate interesting and challenging puzzles. We wanted a bigger challenge than you normally see in big action-adventure games out there, something where you actually have to think, while still being accessible to a ton of people and making sure that the player never gets stuck." Mechanisms Become Mechanics: Inventing Puzzles for Indiana Jones and the Great Circle - Xbox Wire "Bigger challenge than what you normally see" has become rather relative.
  17. Oh yeah, lots of comparisons with Dishonored (and even Thief...) already out there in terms of maps / hub areas. Completely triggers me #fanboy. Do you guys still get a retail release of the game in shops (PC)? Seems weird, as I still have all the Arkane games boxed. Have Bethesda stopped doing that? The RTX 3060/12 is naturally already on the lower end. Plus, in spring the market will surely change (Battlemage hits next week with 12GB, plus new AMD and Nvidia GPUs). But I looked at what games are actually still coming out in 2025, and I figured I'd go with it. I'm fine with 1080p / medium details. Outside of Avowed, Stalker 2, Kingdom Come II and Indy, no blockbuster gaming interests me much for now. Maybe Arkane's Blade. If Fallen Aces were already finished (there's a beefy first episode), that'd probably be my GOTY; indie and AA is eating good. I can still sell the card and upgrade to something better eventually. Btw, the Arc B580 is said to perform a cut above the RTX 4060, which combined with 12GB and the $249 MSRP may make it actually a decent pick. In Germany, the first models are listed at 320 Euro and up though, and at that price point it would compete with the RX 7600 XT (16GB) and RX 6750 XT/12GB (what's left of it in shops), so jumping on Arc on a budget may not be a complete no-brainer all of a sudden. But reviews are out today, we'll see.
  18. Nvidia currently bundles the game with a GPU, only from the RTX 4070 upwards though. The RTX 4070s are of course "next". All of them have 12GB VRAM. On the plus side, if you ever need to swap, Nvidia cards seem mostly pretty stable in price, which influences the market for used cards and the money you can sell for. The RTX 3050 and 3060 are still selling NEW very close to their MSRP. The RTX 3070, if still available, actually seems to be above that. Used RTX 3070s seem to be going for ~300 Euro on Ebay on average (MSRP way back in 2020, four years ago, was 499 Euro). Aside from Indy, that's another reason why I casually think about buying an RTX 4070. Fully aware that it is a scheme of Nvidia trying to upsell me, as below that there's nothing decent available with more than 8GB VRAM. An RTX 4060 Ti/16GB is far too expensive for what it is. An RTX 4060/12 they didn't even release, unlike back with the RTX 3060. AMD's RX 6750 XT/12GB is selling for half of what it was two years ago (from over 600 Euro down to just over 300), but I don't like its size and power draw. The RX 7600 XT is also too expensive for what it is. edit: Just ordered the cheapest available RTX 3060/12GB. I'm on Full HD -- and Warhorse better not be lying with this (official requirements claim the card is still good for FHD / 60fps / medium, which is roughly how I've played KCD I). Kingdom Come Deliverance 2 Runs 'Buttery Smooth' on RTX 3050, Won't Have Denuvo on PC
  19. Yeah, it's in Euro, sorry! Prices have actually gone up a bit, just in time for Christmas. I was looking them up on German deal / price-comparison sites (the most common ones are probably Geizhals.de -- that's literally "cheapskate" -- and Idealo.de). Oh, forgot to say: the Indy Nvidia deal is only for RTX 4070 cards and up! The RTX 3060 costs ~280 Euro, the game 70, so that's 350 TOTAL. The 540 for the RTX 4070 would be the regular price for the card, with the game bundled in. @Humanoid I was actually under the impression that the term "bucks" was used everywhere, i.e. just common English-language slang (British, American, what have you). Can't tell, naturally, as I'm not a native speaker. Picked it up someplace and rolled with it. May actually have been here, dunno! And if it wasn't, it must have been from Al Bundy (Married With Children legend and TV's last ever American action hero).
  20. RTX 3060/12GB + Indy = 350 bucks. RTX 4070/12GB + Indy = 540 bucks (Indy included for free in a special Nvidia offer; the effective-price math is spelled out in a small sketch after post 25)... Then again, the last GPU I spent more than 400 bucks on (Deutsche Mark back then) was my first ever 3D accelerator card, a MiroHiscore 3dfx Voodoo1 with a whopping 6 megs of VRAM. That thing cut through Baldur's Gate as nicely as Sarevok cut through Gorion (not that BG shipped with any 3D features to show).
  21. NOT YET, but I've decided to buy my first new GPU since 2017... and play Indy as well. The Dishonored comparisons finally sold me on it... Also, the RT requirements may look steep -- until you realize that even a crap RTX 3050 qualifies (not gonna buy that, I'll go with a 12GB one). And even Ryzen 3 and Core i3 CPUs are fine. Now as to the puzzles, wish me luck. I'm a bit unsure, not expecting too much, just a few varied tasks. Games such as Kingdom Come, Stalker, Elden Ring, even Zelda, would never make it with one of the major North American industry players and publishers. They truly seem to have "perfected" the art of playtesting. I was watching Gamespot's "The redesign that saved Deathloop" feature on YT and afterwards wanted to hang myself. Not only because of the feedback they got (some testers found the very idea of figuring things out to be... TEDIOUS!), but because it apparently led Deathloop to become the numbing and repetitive "go here and do this" kind of game it is now -- including spoiling, step by step, the actual loop, which is basically the core idea of the game's entire concept... It's as if Hollywood were to screen-test every single Nolan movie way in advance -- and react by cutting, say, Memento into chronological order so that everybody in the room would finally "get it" (including the dude who's not interested and plays on his smartphone). Biggest cancer in the games biz, end of. How Playtesting can Make your Game Bland - Polydin
  22. Trying to catch a few previews and news of dat new Indy game, and thinking: you know, it's fine that apparently we get a WOKE GAME warning now on every single game years before release. (I fully expect Kingdom Come II to get flagged by at least some attention seeker and clickbaiter at some point as well.) However, I'd rather be warned about DORK GAME. God of War has puzzles your pet hamster won't get stuck on. The Witcher witcher-senses everything for you and never leaves its tutorial. Generally, Nintendo asks more of its audience than M-rated Western blockbusters do these days. The jury's still out on Indy, and Indy has always been brawns AND brains. But srsly, where are the warists when you actually need them? Maybe they're fine with themselves and their kids being conditioned to not THINK for a second before falling for all the ragebait on the internet, who knows. To end on something positive, I found something new about Kingdom Come II. And it's glorious.
  23. Still in Pripyat. Despite being on the 2nd big map already, I've only killed like 10 stalkers. All the other kills were mutants, mostly small dogs. The maps seem less detailed than in Shadow of Chernobyl, but this also makes the Zone more believable, as well as creepier, in a way. In particular if you're out at night. (I usually go to bed at night.) In SoC, you walk to the left from the starting area and already run into a camp of bandits. To the right is the military. And generally, there seem to be humans out to get you everywhere.
  24. Yeah, I did that (advanced settings as well, disc release). There are some micro-freezes in busy areas that seem to have more to do with A-Life though, that is, AI entering the fold and spawning. Plus there is one underground sequence so far where the fps drop (very briefly). So the game isn't completely without drops even on a 2019/2020-ish computer (that's roughly what my PC is spec'd at, not a high-end one though). Overall, the game plays fine. Also gave Call of Pripyat a shot for the first time ever. Seems they went in the Stalker 2-ish direction years ago already, with more open maps and all. There are also surprisingly few battles so far in that one, unless you actively engage every mutant roaming the zones. Considering the introduction states there'd be just 100-200 stalkers roaming the entire area, that seems more believable. Internet rumour has it that THQ pushed them to have more gunfights for Shadow of Chernobyl back in 2007. Considering that was also the year BioShock's devs were scared chicken of boring the FPS crowd if they didn't introduce more stuff to shoot in the face (sorry, German), I'm almost inclined to believe that rumour. Watching a couple of Chernobyl docs these days too. I still remember when we had to wash our hands after playing outside in kindergarten (I was in my last kindergarten year in Germany; April 1986 was a couple of months before I went to elementary school that summer). #allindazone
  25. There are too many containers in there. Then again: anybody looking into all of those containers has been conditioned too much by all the awful loot-trap kind of games -- starting with Diablo back then and its awful influence. Diablo is the gaming equivalent of a coin slot machine inside a dirty pub: get back to it, and you're rewarded with something. Your brain can't resist it. The overall philosophy though is more in line with Ultima, where containers exist for being containers. Like in a world, rather than in a thinly veiled coin slot machine. Thus in a kitchen you'd find nothing hugely useful, whereas in the hidden lab of a magician, you may. That kind of gets you thinking before brainlessly clicking on everything. At least, that's what Larian generally seems to be going for. Whether they succeed... The Outer Worlds was awful in that regard. In parts, the loot placement was logical, such as finding drinks etc. in an inn. But then, randomly, you'd find guns on the chair next to it, or on the toilet. It's as if the entire world had been populated by an AI. Generally, there's loot and containers ****ing EVERYWHERE. It also completely went against the game's fiction, as you got absolutely showered in ammo and resources. It's as if no thought was put into it, except keeping your inner packrat occupied. Generally, gaming needs less Diablo (cheap gratification trigger) and more Ultima (world sim). Good luck trying to hammer all that Diablo out of gamer brains after decades of conditioning though. Whilst this may sound as if I hated Diablo: I actually do not. In itself, it can be quite fun. I just think it had an influence on games it shouldn't have had. (Much like some popular MMOs later, but that's another topic.)
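A toy model for the Vsync question in post 4. This is an assumed illustration of plain double-buffered Vsync in general -- not of how this particular game or driver actually schedules frames -- and the render times and 60Hz refresh are made-up numbers.

```python
# Toy model (assumption): with double-buffered Vsync and no triple
# buffering, a frame that misses a refresh deadline waits for the next
# one, so the effective frame time rounds UP to a whole number of
# refresh periods. The pipeline idles while it waits, which can show
# up oddly in a load overlay.
import math

REFRESH_HZ = 60                  # assumed display refresh rate
period_ms = 1000 / REFRESH_HZ    # ~16.7 ms per refresh

for raw_ms in (12.0, 15.0, 17.0, 20.0):   # hypothetical render times
    vsynced_ms = math.ceil(raw_ms / period_ms) * period_ms
    print(f"render {raw_ms:4.1f} ms -> uncapped {1000 / raw_ms:5.1f} fps, "
          f"vsynced {1000 / vsynced_ms:5.1f} fps")
```

Under this model, a frame that takes 17 ms instead of 15 ms drops from 60 to 30 fps, so a ~20% average hit over a mixed run is quite plausible.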
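For the undervolt in post 9: a minimal monitoring sketch, assuming Python with the nvidia-ml-py package (imported as pynvml) and a single Nvidia GPU. It's just one way to watch power, temperature and clocks while a game or benchmark runs -- not the tool actually used in the post.

```python
# Poll GPU power draw, temperature and SM clock once per second.
# Run a load alongside and check whether the undervolted curve holds.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes GPU index 0

try:
    for _ in range(30):  # ~30 seconds of samples
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
        temp_c = pynvml.nvmlDeviceGetTemperature(
            handle, pynvml.NVML_TEMPERATURE_GPU)
        sm_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)
        print(f"{power_w:6.1f} W  {temp_c:3d} C  {sm_mhz:4d} MHz")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```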
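The compounding behind the "~20% per generation" figure from post 10 -- pure arithmetic, with the 20% taken from the Computerbase average quoted there; the generation labels are just for illustration.

```python
# Two ~20% generational steps compound to only ~1.44x overall --
# modest next to the single big 1060 -> 2060 jump the analysis noted.
per_gen_gain = 0.20
perf = 1.0
for step in ("RTX 2060 -> 3060", "RTX 3060 -> 4060"):
    perf *= 1 + per_gen_gain
    print(f"{step}: cumulative x{perf:.2f}")
```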
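And the bundle math from posts 19/20 in one place, using the Euro prices quoted there (the game is only bundled for free from the RTX 4070 up):

```python
GAME = 70                # Indy bought separately, in Euro
rtx_3060 = 280           # card alone; game not included
rtx_4070_bundle = 540    # card price with the game thrown in

print("RTX 3060 + game:", rtx_3060 + GAME)                       # 350
print("RTX 4070 + game:", rtx_4070_bundle)                       # 540
print("Effective RTX 4070 card price:", rtx_4070_bundle - GAME)  # 470
```

So the "free" game narrows the gap on paper, but the 4070 itself still effectively costs ~190 Euro more than the 3060.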