Based on some forum posts and user reviews, I think the Indy game is mostly going to come down to the resolution and settings one personally tolerates - outside of the "needs an RTX-capable card" requirement, ofc. It sounds like the visual differences between settings aren't super extreme - a trend I've noticed in a lot of games over recent years, actually - and one has to remember that while actually playing, most people aren't staring at every pixel to notice the fine differences that all these comparison videos emphasize.
In terms of upgrading GPUs - I'm not much of a conspiracy theorist, but it does make one feel like AAA devs and Nvidia are working together to push people toward the top-tier cards, e.g. by gimping the 60/70/80-series on VRAM. >.> I've said it before and I'll say it again - it may be a "slower" generation, but the 11 GB of VRAM in my 2080 Ti (and sometimes DLSS, of course) is probably what has allowed me to marginally still run some of these games at 4K within my personal tolerance of 45-60 fps at mid-high settings. I refuse to go below DLSS Quality tho. I've tried. "Balanced" DLSS still looks like crap, imo. I might as well just drop the native resolution to 1440p at that point.