Everything posted by majestic
-
Simon the Westie, April 1st, 2010 - November 3rd, 2024. Good night, sweet prince. You brought joy to everyone who knew you.

On the one hand I am heartbroken, on the other I am happy that he did not have to suffer. Just around midnight he got up for the last time, wagged his tail, and went back to sleep. This morning we noticed that he wasn't breathing any more. Being able to die peacefully in his sleep is a blessing we never expected him to have after he was diagnosed with lymphoma six months ago. The vet gave him three to six weeks without treatment, and he went on to enjoy his life for six more months. He put up a valiant fight, and now he gets to rest and hopefully join my mother-in-law, who used to watch over him while we were working when he was a puppy and during his adolescence.

Every now and then I catch myself wondering if we shouldn't have taken the vet up on the offer of chemotherapy, but then, he was 14 already, he hated going to the vet to the point where he tried to run away (which he never ever did otherwise), and it would have been a three month treatment with no chance of actually curing him. Ideally it would have put his cancer into remission for a while. Would we have traded three months of him being miserable for three months of extra time with him? No, and looking back, I can't imagine having made a different choice.

You will be missed, my sweet doggie. I know that every pet owner says this, but he was special and the cutest and the best.
-
The absolutely most important thing to change in Windows 11 is the stupid new context menu on right-clicks. Nothing else even comes close to being this essential. Whoever had that idea over at Microsoft should be shot to the moon without a return ticket, along with the other criminals deserving of such a fate, like Putin, Xi and Trump.
-
Also, for anyone interested in the Thirty Years' War, might I point you to SandRhoman, a duo of Swiss historians making pretty interesting videos on historical subjects? The only caveat I have is their English narrator, who is, well, not a real joy to listen to. Alas. No point in posting their videos in German.
-
Heh. "Es gibt nur einen Gott, das müsst ihr glauben, und wers nicht glauben kann, den zünd' ich einfach an." So, basically: "There is only one God, you have to believe that, and whoever can't believe it, I'll just set on fire." (obviously the German lyrics rhyme) to the tune of the Pippi Longstocking intro. Had to think of that video reading the thread now.
-
Does the first game have proper horse testicle physics as well?
-
I found some elves. There's a token black elf and two Asian elves, one of which is a companion. 'Tis so inclusive, it wokes me out of my immersion. To make matters worse, none of the women in the game are masturbatory material, at least so far, and Neve (one of the companions) almost sounds like that transsexual in Hogwarts: Wokacy, even though they have different voice actors.

On the technical side, turning on RT makes the game look and feel better, but it also introduces the typical glowy reflective surfaces of lesser implementations of ray traced reflections. They're not as bad as they were in Hogwarts Legacy though, so it is mostly fine. Initially, the game was excessively unstable, crashing to the desktop with an NVIDIA driver error. I changed a few things in Afterburner*, turned off the EA app overlay, and now use full screen mode and vsync with a 50% frame lock, i.e. the game now runs at 60 fps. I have been playing without crashes ever since. I don't know which one of these was the problem, but it is stable now. Otherwise the game runs fine with all the RT bells and whistles turned on, even without DLSS.

The gameplay is a hybrid between Dragon Age: Inquisition and Darksiders, which I guess is where a lot of the dislike for the combat comes from. I'd say it is not a good fit for a Dragon Age game, but then again, what is the gameplay of a Dragon Age game, really? DA:O had a more traditional RPG combat system where fiddling with the AI scripts for an hour or two was necessary to make most of the tedium go away, Dragon Age 2 had a silly click-to-win action combat system, and Dragon Age: Inquisition was Mass Effect with melee weapons and an optional tactical mode that was by and large completely useless.

So, is it actually bad so far? No. However, there are two key elements to the game that can put off players if they actively dislike such things (but such is the case for every game).
I mean, outside of the inclusive elements causing incel rage, like the companions not having Tali-style thigh gaps or there being a token minority character at every turn. The gameplay is really rather action oriented: you're dodging across the screen, you have to use minor combo attacks (unless playing on Story mode) and ideally perfectly block attacks, which you can use to counter-attack, all the while managing your own abilities with mana costs and cooldowns. It is about as far removed from Dragon Age: Origins as you can possibly get. The party banter and companions might as well have been written by James Gunn or Joss Whedon. If you don't like quippy banter or actively dislike character interactions like those in Guardians of the Galaxy or Firefly, then don't bother. Just early impressions though.

There was one sad moment for long time fans of the series, and it comes really early. Bianca dies.

*I had a custom voltage/frequency curve set up. It was within spec; it just turned off the boost by capping the voltage and frequency at the stock spec clock of 2600 MHz. That should not have caused the instability, but who knows. Every other game, including Cyberpunk 2077, has been rock solid with that curve setting so far, but eh, yeah. I can just switch back to it once I am done playing Veilguard.
-
Especially since you're playing at 4K, where the CPU is rarely ever the bottleneck. Even with the fastest gaming CPUs (i.e. the X3D CPUs from AMD) you'd benefit more from a GPU upgrade. It's a time-honored tradition in PC building: without an unlimited budget, save on the CPU so you can buy the next tier up of GPU. Even with a 4090, you'll see zero difference in frame rates at 4K in most 3D games between the last three generations of CPUs (unless you have something like a really cheap four core CPU that can't keep up). That is why these videos do not test 4K resolutions anyway, and most also dropped 1440p testing, which was a thing in the past. Slap something on the level of a 7700 (non-X) into your rig and you're probably good for years to come. It is different for productivity workloads, and you might run into other bottlenecks (PCIe 5.0 support, more PCIe lanes, memory compatibility and availability, etc.) that might make you want to upgrade, but, yeah, well...
-
Changed my mind, I'll play The Veilguard once it unlocks this evening. Name the last Bioware game you can come up with that had really strong gameplay and interesting combat (or even game) mechanics, where the good parts were not the story, the characters, or the presentation of the gameplay rather than the gameplay itself. The presentation simply added a layer of fun because it is just too cool to shoot your enemies with lightning all the time, while the actual gameplay is really just "click on enemy until it dies" and "click on mine to disarm it". Not being facetious, really. Genuinely curious, honestly.
-
Huh. 10-20% performance gains in gaming using CU-DIMM DDR5 and not changing anything else. Also probably some e-core task scheduling issues at work here and there, with another couple of percent added by an e-core and cache overclock (which makes a lot of sense when the e-cores are working on game threads, as the e-cores have comparatively little L2 cache per core, sharing 4MB per cluster of 4). Except for Horizon Zero Dawn, that game has always been really rough on both Intel and nVidia hardware.

That might also point towards why some German YouTube channels that I have watched, including der8auer's, showed such drastically different gaming performance (especially in Cyberpunk) from GN or HUB and Jay's: der8auer used CU-DIMMs for his testing. Still doesn't make Arrow Lake interesting, but it does show that Intel is having some teething issues with the new platform and their new tile designs. That is a rough launch, and for optimal performance you need different memory kits that are currently at a premium, and it makes little sense. Using CU-DIMMs shouldn't even make that much of a difference. Sure, they theoretically allow for higher memory bandwidth, but really, such a drastic performance increase with everything else equal? Something's wrong here.

And since Jay also mentions it, the 9000X3D CPUs are going to top the charts soon anyway. Given his remarks, I suspect the rumors are true and AMD really did double the amount of cache for the 9900X3D and 9950X3D by slapping it on both CCDs. Otherwise, given the otherwise small performance gain of Zen 5 over Zen 4, his insinuations make no sense, unless the regular Zen 5 CPUs have a severe bottleneck and the 9800X3D also shows massive performance gains simply by having more cache to work with. Well, the specs say there's less of a clock speed regression on the 9800X3D compared to the 9700X than there was between the 7800X3D and the 7700X, but that should not be enough to make all these charts look small.
-
You could buy some Bud Light after the incel backlash against it (with a lot of them hate-buying the stuff to destroy it on video, which is what I think @Bartimaeus was getting at), but it is hard to recommend doing so, even to make a point. Buying a Bioware game might still get you a product that is somewhat good, particularly if you're like me and neither hated Andromeda nor Inquisition, but there's not much that Bud Light has going for it. Well, maybe for a teetotaler it could be interesting, as it can't really be called beer.
-
Did Bioware turn everyone into transsexuals again? I mean that clearly was what made Inquisition bad. Male skull shapes on female characters. You know what's hilarious though? Angry orange haired pick-me girls and blue haired anime character channels complaining about Veilguard make me want to preorder the Deluxe Edition. On the EA app. *clicks* And done! Probably not even going to play it. *snort*
-
That reminds me of the good old times when my mother told me she bought seeds for hot peppers, because I like them. When they were all nice and grown, I bit into one to see how hot they really are, because normally what's being sold as hot peppers here tops out at Bird's Eye. Yeah, so it turns out she bought Naga Morich seeds. That was certainly an experience. One I don't necessarily recommend.
-
What are you Playing Now? - Right Now at the moment edition
majestic replied to melkathi's topic in Computer and Console
https://www.smbgames.be/super-mario-crossover.php Pretty good idea, worth a playthrough just for the heck of it.
-
Completed all of the season journey objectives and the Dark Citadel on Torment IV, which means I am done for the season, I guess, unless Blizzard adds something. They are adding "Meat or Treat" with The Butcher for Halloween, so I'll check that out. I'll probably also occasionally check whether Blizzard managed to fix the Tenets of Akarat, silly little puzzles strewn through the expansion area that often bug out and cannot be completed.

The first wing of the Dark Citadel was by far the hardest, as the wing's final boss is both a gear check (given the damage Spiritborn do, more like a defensive stat check, as I found out to my detriment when the unavoidable damage instantly killed me) and requires some coordination between party members in different zones: one group needs to activate a portal for the other once they're done. All the other bosses have more involved mechanics, but you're always on the same screen as your party members. Well, maybe not at the last wing's final boss, where you can miss a portal, but that respawns really quickly.

The in-game timer tells me I have played for 86 hours, which is fine. It felt a little more grindy than the other seasons thanks to the Infernal Hordes being basically the best at any farming activity outside of levelling your glyphs (and that is just because you can't level glyphs in Hordes) or getting runes, but some of that time was playing the expansion "story", which I won't have to do any more from this point forward. Luckily. The Diablo games never had intricate storylines, but they used to be interesting and well written, and ever since Diablo 3 they're just neither. Although that can be said for every game Blizzard made since 2009. Wrath of the Lich King was the last time a story told in a Blizzard game was interesting and they actually cared about world building, and it is such a pity that the story of Warcraft 3 was completed in an MMORPG.

The wording implies that Blizzard will be making more of these group-only dungeons.
Can't say I am looking forward to that, but maybe the next one is going to be more interesting. I also hope Blizzard finds some way of making them challenging without overly relying on either timed instant death mechanics, where you need to complete an objective before the timer runs out, or spamming the room with instant death effects.

Torment IV could also use a wee bit of rebalancing. Having enemies deal so much damage that you die almost instantly is fine when that damage can be reasonably avoided. It is not fine when the entire screen is full of player effects that overlap and make it impossible to see the attacks happening, or cause so much lag that you can't see the enemies perform their actions. World bosses are particularly egregious: The Wandering Death does a giant instant death laser that plows through the screen, and you're supposed to dodge it, but it was invisible due to massive server side lag. Most of the players near the boss died. I had a good laugh, but that can't be fun for Hardcore players. I guess it is understandable though, Blizzard is a newcomer at this online play business, they only have like three decades of experience, things like that can happen, right?
-
So, what did I do today? I got up, grabbed a coffee, and checked my YT feed. This was in it: Now, let me translate this simple math problem for you. It reads: A child weighs 16kg (feel free to use half the weight of King George's belt, or whatever pounds are defined as, it is not relevant to the problem anyway) plus one quarter of its weight. How much does it weigh? Now, as the thumbnail suggests it is not a trick question, hence 16 is crossed out, and it is also not 20, because clearly it cannot be twenty. A quarter of 20 is 5, and 20 - 5 is very clearly not 16. Any of you wanna weigh in here? Yeah, I'm getting my coat and showing myself out. That alone would not be enough to post about it, but there's an untold number of people in the comments insisting that 20 is the correct answer. Let it never be said that the German education system is any better than the US'.
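For anyone who wants to check the arithmetic: the weight w has to satisfy w = 16 + w/4, which solves to w = 64/3, roughly 21.33 kg. A quick sketch with Python's exact rational arithmetic (just my own verification, not anything from the video):

```python
from fractions import Fraction

# The puzzle: the child weighs 16 kg plus one quarter of its own weight.
#   w = 16 + w/4  =>  w - w/4 = 16  =>  (3/4) * w = 16  =>  w = 16 / (3/4)
w = Fraction(16) / (1 - Fraction(1, 4))
print(w)         # 64/3
print(float(w))  # 21.333... kg

# Sanity checks: the two popular wrong answers do not satisfy the equation.
assert 16 + 16 / 4 != 16   # 16 + 4 = 20, not 16
assert 16 + 20 / 4 != 20   # 16 + 5 = 21, not 20
assert w == 16 + w / 4     # 64/3 = 16 + 16/3 holds exactly
```

Using `Fraction` instead of floats keeps the check exact, so the final assertion is a true equality rather than a rounding coincidence.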
-
Intel has two primary motivations here: one is fighting off ARM's encroachment on their laptop market with efficiency gains, and the other is being able to slap more cores onto workstation CPUs to take the wind out of AMD's Epyc. Still, it's a massive disappointment that the Lion Cove p-cores of Arrow Lake apparently can't even match the Raptor Cove p-cores. At this point in time it would have been better to slap Raptor Cove p-cores with a node shrink onto Arrow Lake and call it a day.

Case in point: the upcoming 288 core Sierra Forest (the 144 core variants are already available), although those still use the older Crestmont e-core architecture. Arrow Lake has the new Skymont e-cores, which are largely the reason why the 285K can compete with the 9950X in heavily multithreaded workloads even though it has 8 fewer threads and the p-cores are, well, let's say, clearly not doing so well compared to the old Raptor Cove architecture. Skymont e-cores have roughly the same performance as Raptor Cove p-cores at the same clock speeds (they just clock lower, obviously), i.e. Intel's IPC gains on their e-cores are massive. I wouldn't be entirely surprised if future Intel CPUs just ditch the p-cores because the e-cores are going to pass them in performance. The team developing the e-cores is clearly doing something right, and the other teams aren't. We might be looking at another Core 2 moment in the not too distant future.
-
Warning, language.
-
I don't think I've ever seen such a steep divide between synthetic benchmarks and real world application and gaming performance*. Cinebench, 3DMark, Geekbench, whatever you pick, the Core Ultra 285K is either on par with AMD's best or dominates the charts, and when it comes to actually performing, it falls way short. Except for productivity workloads, but even there, how can so much single thread benchmark performance lead to such terrible Photoshop real world performance? Guess Intel does struggle a bit with gluing their CPUs together. Going to be interesting to see how Zen 6 will shape up, as AMD is also switching their way of gluing CPUs together.

According to rumors, if all goes well, Nova Lake will come out with an additional cache tile. Intel will call it LLC (Last-Level Cache), and it is planned to basically be Intel's version of 3D V-Cache. Roadmapped for late 2026/early 2027. It'll be a while before prices for the X3D CPUs drop, there's just no incentive for AMD (or retailers) to do so.

*nVidia cheating with driver side optimizations when synthetic benchmarks were detected notwithstanding. That can't really apply here. I mean, I hope there's nothing in Arrow Lake's microcode that detects if 3DMark is running just to produce better performance. That would be weird, even for Intel.
-
With testing this time: well, that is underwhelming. Guess that's one generation I'll be sitting out then. Especially since there are rumors now that the other LGA 1851 CPU generations have been scrapped. Pity, I was looking forward to Arrow Lake, but that gaming performance is just, uhm... not good, and I really don't need the productivity gains. As Steve puts it, even with gaming as a full time job, which I obviously don't do, it would take years for the lower power draw to make up the price difference.

edit: Weird though, looking at the released benchmark scores in other reviews, the single thread and multithread performance of Arrow Lake in e.g. Cinebench outclasses everything by a more than decent margin, it just translates into no gains or even worse performance in gaming. Guess that makes Cinebench and other synthetic benchmarks either worthless, or something else is not quite right. Bizarre, at any rate. Especially that performance drop in Cyberpunk 2077, where it is slower than a freaking 12600K.

edit2: In der8auer's German video he gets completely different results from Hardware Unboxed in Cyberpunk: the 285K is behind the 14900K, but still ahead of the 9950X (and obviously behind the 7800X3D, but that much was to be expected anyway).