
Watch Dogs Legion - Xbox Series X/S w/RT vs PC RTX


LadyCrimson


Not really gaming news, so I'll put it here?
As I've said before, I don't yet care about ray tracing - hyper-real reflections aren't the graphical enhancements I want - but it makes for an interesting performance comparison here, especially the details re: the console optimizations used to achieve what they do. This largely applies to this game only, I think - other console games, especially ones not heavy on RT, might land closer to a higher-tier 20xx card. Maybe. (edit: I watched this at YT's 2160p setting on a 1440p monitor)

Anyway, I take away a few things from this video:

--it's too bad devs don't expose more of these "hidden" console optimization settings as selectable options on PC for people with lesser systems
--for Watch Dogs, the new Xbox is maybe on par with a 2060 or 2060 Super for "4K"/30fps-ish (forget 60fps)
--for $500 or whatever, the new consoles are (as expected/known) leaps and bounds above the previous gen and imo a worthy buy for anyone who doesn't want to put the time/money into a PC. Just not quite as uber as some were maybe hoping for. Could improve even more with newer revisions ofc. I may still buy a PS5 a year or two down the road.

 

 

Edited by LadyCrimson

I'd expect better performance as knowledge of the system increases; and it has to be said that the other technical aspects of WD: Legion don't exactly fill one with confidence in Ubisoft's technical aptitude. They simply won't know what the best bang for buck raytrace settings are yet. The PS5 reviewers seem to be a lot happier with the Spiderman: MM RT solution (and game overall), on a theoretically weaker system.

It also doesn't mean much for Radeon performance. AMD's approach there is clearly to use lots of culling to reduce unnecessary work, hence the massive cache, and the Series X doesn't have much cache at all - I don't know how much, since it doesn't seem to be listed anywhere, but there definitely isn't much space for it with 60 (52 active) CUs, 8 CPU cores and a 320-bit bus all on a 360 mm² die. The cache alone on Navi 21 is approaching half that area.
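A rough back-of-envelope sketch of that area argument, assuming a Navi 21-style cache costs on the order of 1 mm² per MB (that per-MB figure is a loose guess just to get an order of magnitude; the die sizes are the commonly cited ones):

```python
# Back-of-envelope: would a Navi 21-style 128 MB Infinity Cache even fit on
# the Series X SoC? Die sizes are commonly cited figures; the area-per-MB
# number is an ASSUMPTION, only meant to show the order of magnitude.

navi21_die_mm2 = 520.0      # Navi 21 (RX 6800/6900) die, commonly cited
series_x_die_mm2 = 360.0    # Series X SoC, as quoted above
cache_mb = 128.0            # Infinity Cache capacity on Navi 21
mm2_per_mb = 1.0            # ASSUMPTION: rough SRAM + control area per MB

cache_area = cache_mb * mm2_per_mb
print(f"~{cache_area:.0f} mm^2 of cache = "
      f"{cache_area / series_x_die_mm2:.0%} of the entire Series X die")
print(f"(and ~{cache_area / navi21_die_mm2:.0%} of Navi 21's own "
      f"{navi21_die_mm2:.0f} mm^2 die)")
# That area would have to squeeze in alongside 52+ CUs, 8 Zen 2 cores and a
# 320-bit GDDR6 interface - which is why there's clearly no big cache there.
```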


  • 1 year later...

Rather than starting a new topic I'll post my thoughts on the xXxBOxXx here, after owning one for a little while.

Going from 1080p (PS4 Pro) to 4K (xXxBOxXx) is honestly not much of an upgrade. I mean, I can see the difference, but it's not very meaningful, at least not on a 55" TV. Maybe the difference is more meaningful if you have a giant 85" TV? 1080p still looks good at 55"; I suppose blown up to 85" it could look a bit pixelated, but I don't have an 85" TV, so that's just speculation.

I find Ray Tracing thoroughly underwhelming. Ray Tracing has been this mythical holy grail of gaming for decades. Now that I've seen this grail, my reaction is "meh". I'm not just talking about this console, either. I've seen RT running on a high-end PC rocking an RTX 3080, where the performance hit is less pronounced than on an AMD-powered machine, and it didn't do much for me. I didn't even get a quarter chub from RT; I can barely tell the difference. Volumetric fog had a bigger effect on me (half chub!) when I first saw it in games.

The biggest difference compared to the PS4 Pro is load times. The xXxBOxXx is very snappy and you rarely have to wait much longer than a handful of seconds for a game to load up. Supposedly, the PS5 is even better in this regard, but I don't have a PS5, so that's hearsay. Going from a 5400 RPM hard drive to an SSD is such a massive difference.
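To put rough numbers on that HDD-to-SSD jump, here's a quick sketch using ballpark sequential throughput figures (about 100 MB/s for a 5400 RPM drive, 2.4 GB/s raw for the Series X NVMe drive per Microsoft's published spec); the 2 GB asset size is purely an illustrative assumption:

```python
# Why load screens collapse from tens of seconds to a blink: sequential
# throughput alone. Numbers are ballpark; the asset size is an assumption.

assets_gb = 2.0      # ASSUMPTION: data a game reads in during a load screen
hdd_mb_s = 100.0     # typical sequential read for a 5400 RPM laptop-class HDD
ssd_gb_s = 2.4       # Series X raw NVMe throughput (published spec)

print(f"HDD: ~{assets_gb * 1024 / hdd_mb_s:.0f} s   "
      f"SSD: ~{assets_gb / ssd_gb_s:.1f} s")
# And the HDD number is optimistic: scattered assets mean seeks, which make
# real-world load times on spinning disks considerably worse than this.
```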

Overall, I'm pretty happy with the refrigerator-looking console. While the graphical difference over the last gen of consoles is quite small, I've been getting rock solid 60 FPS at 4K on the small selection of games I've played and that's nothing to sneeze at. In the games that have a quality mode (30 FPS) I'll check that out for a short while just to see the tiny, and I mean really tiny, difference in amount of shiny, before switching to performance mode (60 FPS) to actually play the game. In fairness, the true showcase games for a console generally don't arrive until year 3, so we haven't seen the full extent of what these MSony consoles can do yet. Maybe when Avowed comes out it will look so spectacular that I will instantly soil my briefs?

On paper, the xXxBOxXx is a more powerful machine than the PS5, though we haven't seen that in practice yet. I didn't choose Microsoft over Sony for the difference in CUs, though. I made the choice because Microsoft bought up a bunch of studios, most notably Obsidian. I'm pretty sure we won't be seeing Avowed on PS5.

Final thoughts: Good console, happy with it, didn't change my life.

As an aside, having seen the difference between 1080p and 4K firsthand (not much), I don't see the point of 8K. I suppose that if you are filthy rich and live in a mansion with a 40' by 40' living room and you have a 200" TV (do they make them that big?) then it makes some kind of sense, but for the other 98% of us, that's an absurd premium to pay (both for the TV itself and the machine to power it) for insanely diminishing returns.
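For what it's worth, those diminishing returns can be put into rough numbers. Assuming roughly 20/20 vision (which resolves detail of about 1 arcminute), here's a sketch of the distance beyond which a given resolution stops being distinguishable on a 16:9 screen; the 55" size is just the example from above:

```python
import math

# A pixel stops being individually resolvable once it subtends less than
# about 1 arcminute (roughly 20/20 acuity). Distance beyond which a given
# resolution is "maxed out" on a 16:9 panel:

ARCMIN = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in, horizontal_pixels):
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 panel width
    pixel_pitch_in = width_in / horizontal_pixels
    return pixel_pitch_in / math.tan(ARCMIN) / 12     # inches -> feet

for label, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f'55" {label}: extra detail gone beyond '
          f'~{max_useful_distance_ft(55, px):.1f} ft')
# At a typical 8-10 ft couch distance, even 4K pixels on a 55" panel are
# already below the eye's threshold, and 8K only matters within ~2 ft.
```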




When I upgraded from last-gen (PS4 Pro + XBONEX) to PS5 + XB Series X, the difference in visuals wasn't clear to me either, and that's on a Samsung 85" QLED. Granted, those old consoles were already running most games at 1440p, so bumping that to 4K only gets a bit of improvement. What stood out to me more is the higher framerates and faster loading. I've been underwhelmed with the Series X raytracing implementation in games so far (Watch Dogs and Cyberpunk w/ Xbox RT are pretty bad, actually). However, on PS5 there's Ratchet and Clank and Horizon Forbidden West (and sometimes Gran Turismo 7), which have been a good showcase of what raytracing can do, especially when it's baked into the look of the game from the start and not an afterthought. Ratchet is probably the only game not possible on last-gen, due to the way it uses the PS5 NVMe drive to insta-load complex levels on top of levels.


Monitors: I don't know what other people are like, but at my own desk, I just used a tape measure to see that my eyes are 30-31 inches away from my monitors. That's about where I'm comfortable for general usage - sometimes I might get a little closer if I'm trying to look at something only a pixel or two in size, sometimes I sit back an entire additional foot if I'm playing something with a controller. I'm not sure of the exact real-life size that I want each pixel to effectively be, but it seems like you need a pretty good-sized monitor to take advantage of 2160p as it is, unless you like sitting very close to your screens... but conversely, I don't really enjoy a screen enveloping my entire field of view like other people seem to, so I'm pretty loath to even consider 2160p - the panels get too big and the pixel density too extreme for it to be comfortable for me, and messing around with resolution and scaling settings for different uses also kind of sucks.
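A rough sketch of that pixel-size intuition, assuming about 1 arcminute of resolvable detail (20/20 acuity) and the ~30 inch viewing distance mentioned above; the panel sizes and resolutions below are just illustrative examples:

```python
import math

# How big a pixel looks (in arcminutes) from ~30 inches away, versus the
# ~1 arcminute that 20/20 vision can resolve. Panels below are just examples.

def pixel_arcmin(diagonal_in, horizontal_pixels, distance_in=30.0):
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 panel width
    pitch_in = width_in / horizontal_pixels
    return math.degrees(math.atan(pitch_in / distance_in)) * 60

for label, diag, px in [('27" 1440p', 27, 2560),
                        ('27" 2160p', 27, 3840),
                        ('32" 2160p', 32, 3840)]:
    print(f"{label}: ~{pixel_arcmin(diag, px):.2f} arcmin per pixel at 30 in")
# 27" 1440p sits right around the ~1 arcmin threshold from 30 inches; 2160p
# panels drop below it, so the extra density is hard to appreciate without a
# bigger screen or a shorter viewing distance.
```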

TVs: Here, I can perhaps see a better application, but gaming tends to be less than stellar on modern TVs due to bad input delay anyway, so it's not really a key focus for me... and yeah, you either need a very large TV or to be sitting uncomfortably close to it to really appreciate the extra detail. My own TV is 4K, but it's the same size as Keyrock's, so unless I'm sitting right in front of it and directly flipping between 1080p and 2160p sources, the difference is practically unobservable. For cinema, there's honestly a much bigger difference in quality between cable vs. streaming vs. Blu-rays than there is between 1080p and 2160p IMO. I haven't had cable for a few years now, since I now get all of my live sports via streams and that was literally the only thing that ever made cable even remotely worthwhile, but I remember being pretty stunned when I actually stopped to do some proper image quality comparisons between the different sources - cable was by far the worst and looked like complete junk compared to even just streaming, never mind BDs.

Edited by Bartimaeus



Yep. To notice much of a difference you need to have a gigantic screen or sit really close to a regular-sized one - the human eye has only so much resolution, and we're pretty close to that with 4K already. The only thing it does is create more screen real estate for working, if you have a screen to match and the inclination to use it that way. My colleague at work has one gigantic screen rather than two or more separate ones, and he's constantly moving windows to and fro and whatnot, and it just makes me dizzy. One giant screen where my eyes have to dart from one corner to the next just so I can have four Full HD-sized windows open next to each other at the same time? Yeah, no thanks. That's definitely not for me.

8K is going to be the same sort of voodoo we now have with super expensive (and super not doing anything) speaker cables, outside of a very few special-interest applications or theater-sized silver screens. Pretty sure you'll also get the same crowd yelling I CAN SEE A DIFFERENCE, just like they can HEAR the difference between high quality speaker cables and cables for $4,000 per yard. Sure, you "can" hear the difference. I'd hear a difference too if I'd been had like that. Doesn't mean it's there. :yes:

Edited by majestic



When I first saw movies and shows in 4K it felt really... off, for lack of a better term. It almost made me feel dizzy. Like when you go to the eye doctor and they shoot the puff of air into your eye and your vision is weird for a couple of hours. It felt sort of like that. I got used to it and now it's great. For whatever reason, I didn't have that issue with games, only shows and movies. :shrugz:



1 hour ago, Keyrock said:

When I first saw movies and shows in 4K it felt really... off, for lack of a better term. It almost made me feel dizzy. Like when you go to the eye doctor and they shoot the puff of air into your eye and your vision is weird for a couple of hours. It felt sort of like that. I got used to it and now it's great. For whatever reason, I didn't have that issue with games, only shows and movies. :shrugz:

TV settings? You need to fiddle with the image settings something fierce before it stops being weird, unless it comes with a usable film mode, and even those require... adjustments. My least favorite part of modern TVs: telling them that they can stuff all their image-enhancing doodads up theirs. :yes:



I don't even remember making this topic.  :lol:

18 hours ago, majestic said:

TV settings? You need to fiddle with the image settings something fierce before it stops being weird,

Yeah, it's probably not a 4K thing, it's all those settings TV software likes to default to: TruMotion, RealCinema, "Motion Eye Care" (or whatever other brand names they give such things), smoothing/noise/other "improvements". Back when I first saw early 4K sets in stores, I thought the floor models looked bizarre from the illusion of too much depth, where everything looks hyper-focused but with this odd dimensional flatness. Get rid of all those kinds of settings and it goes back to... not looking like that. All those settings seem to emulate a certain type of look - one you could get decades ago, before 4K (some episodes of The Twilight Zone remind me of the effect a little; can't recall the reason for the change there, filming frame rate or something).

Also, I still say our old plasma TV has all the LED/QLED/OLED/whateverLED sets beat in terms of contrast/brightness while not crushing dark vs. light in scenes. There are plenty of aspects of image quality/perception and visual "pop" that have nothing to do with screen resolution.

Edit: doesn't mean I'd give up my OLED at the moment tho. At least not vs. a non-OLED LED.  :p

Edited by LadyCrimson

Yeah, I recently got the Gigabyte FO48U and it's been cool so far: 4K 120 Hz, OLED, HDR, G-Sync, and it acts like a gaming monitor (not a Smart TV). Colors and brightness really pop. Infinite blacks. I guess I only need to watch out for the screen burn-in that OLED is famous for.

