I can see how he may have gotten there... I mean, Oblivion's art design is obviously inspired by your typical European Middle Ages kind of thing. Still, KCD is kinda the polar opposite of Oblivion in pretty much everything else. Not gonna read/watch reviews though, wanna go in "blind".
Technical reports are interesting though. GameStar claims an RTX 2060 Super gets ~50ish fps averages on the high preset in WQHD (no upscaling). With upscaling on DLSS/Quality, over 70ish. PC Games Hardware took the most demanding scenes they could find for GPU and CPU. In the GPU sequence, at 1080p native on the ultra preset, an RTX 2060 (non-Super) averaged 30 fps, a 1080 Ti 40. For the CPU, they took on downtown Kuttenberg. An 8700K still came close to a 60 fps average. An i7-6700K, released in 2015 when Pillars of Eternity was still hot outta the oven, managed over 40 -- on the experimental settings, which are supposed to be just that: experimental, and reportedly even more taxing than ultra on average. They still recommend at least a 6C/12T CPU for a smoother experience. The review copies also still had Denuvo protection, which is why they're going to update the CPU benches.
This sounds fantastic. Why can't all games be this optimized? Or at least somewhat? How did we get here? (Mind you, Indy also ran really well more recently; the 2060 Super minimum sure wasn't for something like 30 fps on low, ditto the CPUs listed.) Nvidia doesn't like this!
And Lord British is envious.