
Sven_

Members
  • Posts

    306
  • Joined

  • Last visited

  • Days Won

    4

Sven_ last won the day on December 4

Sven_ had the most liked content!

Reputation

280 Excellent

About Sven_

  • Rank
    (3) Conjurer

Profile Information

  • Location
    Germany

Recent Profile Visitors

1720 profile views
  1. Well, that's what they're specifically advertising: AM5 (launched in 2022) is intended to receive new CPUs until 2027. AM4 launched in 2016 and still saw new CPUs until 2024... it's quite something going into my mobo's compatibility list and scrolling through all those chips. Not sure if AM5 is going to manage the same. But they obviously did this because, once AM5 launched, they treated AM4 as the entry-level platform for Ryzen -- AM5 still being a tad more expensive to this day. There hasn't been a socket with this much longevity since (Super) Socket 7. Of course, if a platform lasts that long, there are going to be standards it won't support down the line. My board, for instance, is still PCIe 3.0. Also, earlier boards initially didn't support X3D.
  2. Mind you, I'm not overly impacted, as I don't play all that many AAA games. I'm still far more outraged over GPU prices these days. Just ten years ago, the thought of ever spending 300 Euro on a mere GPU never crossed my mind. Nowadays, that's the entry-level price of admission. *Fun fact: even the 60-buck HD 6670 could run you some Skyrim (at reduced/medium details) and Dishonored. On the plus side: since there hasn't been a decent sub-200€ card in years, the used market works even for completely outdated GPUs almost a decade old -- and generally, GPUs no longer lose half their worth in just a year or two. I.e. you still get some money back eventually.
  3. Well, I fixed my issue. I was dumb: I hadn't had VRR (Variable Refresh Rate) enabled in Windows in all those years, lol. Also, I'm apparently gonna be one of those who spend ten hours in the Vatican alone.
  4. Indiana Jones just got an update. Not sure if I had this before... but does anybody else get a red CPU load shown in the game's performance analysis as soon as Vsync is enabled? I noticed that this would tank my fps in some areas at the start of the game. In general, there seems to be a ~20% performance hit with Vsync, be it with DLSS/Quality, TLAA... Vsync off: CPU load all green. Vsync on: the bar immediately goes up. This happens regardless of whether I activate Vsync in-game or via the Nvidia driver settings.
  5. Wanted to edit but accidentally quoted myself and double posted.
  6. Well, that's the general market perception at this point: the go-to option for the average Joe is Nvidia. I think RDNA 2 (the 6000 series) would have done better if it hadn't been for Covid and the mining crisis... AMD Vows to Resolve GPU Shortage: We Are Ramping Production | Tom's Hardware Though I think it's odd they positioned the RX 6600 directly against the RTX 3060/12GB; it could have done good business where it counts for market penetration, which is the entry level. And the rest of the lineup was no slouch either. The current RX 7600 is just all-around meh. As is the 4060, mind. Curious what AMD is going to announce next month. Pretty sure Nvidia is going to continue as usual here, which is trying to upsell the heck out of you. "You want a card with more than 8GB? In 2025? With our new features eating up VRAM too? Well, we've got these bigger models here for ya..."
  7. Not true though, is it? ATi at one point was even in the lead, and even after AMD bought them, there were times when they had a decent share. In the past 15 years, I had two AMD cards myself (HD 6670 and HD 6850), plus two Nvidia ones (GTX 1050 Ti and RTX 3060). The HD 6xxx series was really good value (the 6850/6870 offered some of the best price-to-performance ratios on the market -- both could run anything on high for 150-200 bucks). I think this is a long-term thing, though. AMD have been trailing behind for so long (technologically, too) that by now they're considered second tier. When was the last time AMD introduced something themselves? Everything Nvidia does, AMD follows (upscaling, RT, frame generation...). Compare that to Ryzen, which brought affordable 6-core/12-thread CPUs to the masses at a time when Intel were still releasing quad cores. Yeah, Ryzen is where all the money went, but still. By now, Nvidia is the "go-to" option. And ironically, the generally high prices even for entry-level cards may actually benefit them: if you're spending a couple hundred bucks anyway, you're more likely to go with what's "safe", i.e. the market leader. GPUs are often kept for a few years now, too. That naturally applies to Intel as well. And even they can't just massively undercut current market prices; manufacturing has become more expensive as well. The Arc B580 is something. But compare that to what, say, the Kyro II did WAY back then -- a budget card that sometimes performed on the level of cards costing twice as much (the Kyro II did have its weaknesses, though)... The Card - STMicroelectronics Kyro II 64MB
  8. Everybody be like: "Intel is throwing the budget GPU market a lifeline." Then I see this and want to hang myself. PS: Even the 40-Euro HD 3650, whilst struggling with most games, still averaged over 70fps at least in Unreal Tournament 3 at 1280x1024 (a resolution still quite common in 2008, and a 2007 game). Even if you doubled those prices... Srsly, wot happened?
  9. My GPU is up and running, the undervolt is in place. So I've started Indy. The beginning is pure fan service of course, CHILLS. Seems a bit QTE-ish at the climax, but the more open hub levels are yet to come. That aside, the game runs fine totally maxed out on a Ryzen 3 (2nd gen, 8 threads) and an RTX 3060/12GB. At least at Full HD, and obviously without full path tracing. It's probably good that not every Steam Deck, smartphone and potato PC is capable quite yet. Considering the additional massive computing and power demands: WE'RE GONNA NEED A BIGGER NUCLEAR POWER PLANT.
  10. AMD actually aren't that much better, funnily enough. At least over here, the 7600 XT is currently priced too high for what it is (basically a slightly improved RX 7600, just with more VRAM... an alternative RX 7600/12GB would have been a far better option). And the only reason older cards fall into that price range is that they're being sold off cheap... the 6750 XT, if still available, goes for about half the price at which it launched -- which was well over 600 Euro, just a good two years ago! Computerbase.de ran an analysis on Nvidia a while ago: between the GTX 1060 and the RTX 2060 there was still a big jump in performance. After that, it was but ~20% on average between generations. What's more, until Arc, the RTX 3060 was the first and only entry-level card with 12GB -- oddly enough, given Nvidia's reputation for cheaping out on VRAM. AMD didn't make any big jumps between the 5600/XT, 6600 and 7600 anymore either. Meanwhile, performance levels at the higher end are through the roof. It's truly a GPU class war now -- and Intel have arrived to throw the beggars some coin.
  11. One of the biggest entry-level hurdles when getting into The Witcher 3 -- even if you played the prior games. Until that point, it's only in the books that she was much of a character. And yet the entire base game narrative depends on chasing her. Every time you think you're close, well, you aren't. If they hadn't introduced the perspective changes (you briefly playing Ciri), that would have been an even tougher sell. It's a bit like Super Mario Bros.: "Sorry Geralt, but your princess is in another castle." Except even Mario probably had more characterization at the start to get you into the "mood" of actually chasing that pretty princess -- like the two lines in the manual dedicated to her Peachness. And Mario also didn't need to chase her for dozens of hours. He could even use warp zones to make the process even shorter. Mario had it good. We had it good. If only we had known back then.
  12. Would be funny if it weren't so accurate. The first thing you're gonna see in Witcher 3 is something like: "Push X to open door." Well, that and some female breasts laid bare. Truly a mature game.
  13. Reminds me... The reason I bought Elden Ring back then wasn't primarily the difficult combat either, but finally being able to experience a triple-A open-world game again that DOES NOT FEAR I'D GET LOST IN MY OWN BATHROOM WITHOUT A TOUR GUIDE. Wait, that smiley is wrong. This bugs me way too much. It's great that games are made for everyone. That does not mean that every game has to offer something for everybody, though. And if your experience is about open exploration, YOU BETTER LET PEOPLE ACTUALLY EXPLORE. Otherwise it will be a worse experience for everyone. Daily rant over.
  14. @LadyCrimson Not sure how legit it is, but I'm gonna try with Indy -- my GPU should arrive today or tomorrow. https://steamcommunity.com/app/2677660/discussions/0/598512725180576448/
  15. STRONK WIMMIN DETECTED. I'm also worried about Indy, as apparently a female is going to address me without PRIOR ADMITTANCE!!!!1