DPI Scaling for PE, much more important than resolution



For reference, all the possible factors in how the game will look in the hands of the player:

 


Render resolution - the size, in pixels, of the 2D rendered area graphics in the game (of which there will be 2 sets, the larger being 4x the size of the smaller one). We will hopefully be able to manually switch to whichever Render resolution fits our monitor best.

 

Game resolution - the resolution, in pixels, at which you choose to play the game. Cannot be higher than your native Screen resolution. If set lower than the target render resolution you will see less of the area; if set higher than the target render resolution you will see more of the area.

 

Screen resolution - the native resolution of your screen, and also the highest resolution at which you can experience the game, assuming the game doesn't lock you into one of its two target resolutions. If your screen resolution is higher than 2560*1440 you should be able to see a larger playable area than the developers "intend" you to. If you play with the 1280*720 target renders you should be able to see a lot more of the area. It'll also be very tiny.


Screen size - the physical size of your monitor; two monitors of different physical sizes can have the same Screen resolution. This is why it's important to allow the user to set the game to any resolution they like (and why we should be able to choose which target renders are used). Imagine a 27" and a 13" screen, both with 2560*1440 resolution: the game should not force both monitors to play with the same amount of area visible. It makes sense to give the player manual control over this (see the sketch below).
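
To make the relationship concrete, here's a rough sketch (my own illustration, nothing confirmed by Obsidian) of how the game resolution you pick interacts with whichever render target is active, using the 2560*1440 and 1280*720 targets mentioned above:

```python
# Hypothetical illustration only: how much of the "intended" view you get
# depends on your game resolution relative to the active render target.

RENDER_TARGETS = {
    "large": (2560, 1440),  # the two target resolutions discussed above
    "small": (1280, 720),
}

def view_scale(game_res, render_set="large"):
    """Return (x, y) scale of the visible area relative to the developers'
    intended view: 1.0 is exactly what they target, below 1.0 you see less
    of the area, above 1.0 you see more (and everything looks smaller)."""
    target_w, target_h = RENDER_TARGETS[render_set]
    game_w, game_h = game_res
    return game_w / target_w, game_h / target_h

print(view_scale((1920, 1080), "large"))  # (0.75, 0.75): less of the area
print(view_scale((1920, 1080), "small"))  # (1.5, 1.5): more area, tiny sprites
```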

 

By the way, I've not gone back to check, but I'm almost entirely sure Josh confirmed that we will not be forced into playing at one of the two target resolutions. We'll freely be able to set the resolution of our game, within the limitations of our hardware, with the result of the area renders appearing either smaller or larger on screen.

"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"
Link to comment
Share on other sites

oops.


Elerond was arguing that we might not be able to change it (and that it would be unnecessary), so I summarized the points for why we should be able to, just in case :).


  • 3 weeks later...

That's like the Apple MacBook Pro 13": no one will be playing games at 1440p on it, since the GPU is the Intel HD 4000. It's completely pointless; no one is going to be designing games for high-PPI 13" screens apart from tablets. Designing games for a 13" screen and for a 27" screen are very different prospects: you want things roughly the same physical size. You can use scalable graphics for the GUI, or just make differently scaled variants for the same resolution but different PPI. In 3D games you can just zoom in; the only thing that would let you down is texture quality, and perhaps there would be a set of ultra-resolution textures, but such games couldn't hope to run on such low-performance hardware anyway.

 

I can't see 1440p becoming the new standard for laptops, at least not for the majority of budget and mid-tier ones; it's probably going to be 1600 x 900, though it really should be 1920 x 1080. 2560 x 1440 is going to be growing on the desktop, but it's going to take time to gain market share.


"no one", well, at the moment, of course not. They barely exist yet. I'm talking about the future, the very near future (next 3 years or so). If you want to believe that low PPI screens will remain the norm for the next few years, you're entitled to. From where I'm sitting, there's far too much that points toward an unprecedentedly rapid adoption of Hi-PPI screens across all sectors. Lower res screens will certainly remain the norm until popular software properly supports Hi-PPI screens, which is why I was arguing for support in Project Eternity. 

 

I'm not going to keep arguing for rendering PE at higher resolutions than they're currently doing - the renders are high-res enough to be decently playable on the 4k 15" screens due this fall.

 

A few days after Samsung showed their prototype 13" 4k display, this laptop was announced: http://www.theverge.com/2013/5/23/4357696/hp-envy-pavilion-laptops-3200-1800-touchsmart-ultrabook-display-2013 This is practically my dream monitor, but I wouldn't dream of buying it until Windows/Ubuntu has proper Hi-DPI modes. I doubt sales for it will be very high, but that's not an indication that the market won't eventually adopt Hi-DPI screens.

 

I agree that people aren't going to be hardcore gaming on such screens... yet. Haswell integrated GPUs (Intel HD 5000) are powerful enough to play modern 3D games at low settings on a 4k screen, though. Haswell is released in late June, IIRC. When PE is released, HD 6000 will be just around the corner. It's just up to the games to provide UI/texture scaling options so that such high-res screens don't make them unplayable.

 

Windows 8 looks laughably ridiculous on a 13" 4k screen (there are photos circulating), but that's only until Microsoft releases full Hi-DPI support (like the 'retina' version of OSX, with all UI graphics rendered at 2x the size, similar to how PE has its graphics rendered at 2 different sizes).
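
As a rough illustration of that '2x' idea (the names and numbers below are my own, nothing confirmed for PE or Windows), Hi-DPI asset selection boils down to something like this:

```python
# Hypothetical sketch of 'retina'-style asset selection: pick the asset set
# whose scale is closest to the display's scale factor and let the engine
# scale the small remainder. PE's two render sizes map naturally onto 1x/2x.

AVAILABLE_SCALES = [1.0, 2.0]

def pick_asset_scale(display_scale):
    """Choose the closest available asset scale for a given display scale
    factor (e.g. ~2.0 on a 13" 4k panel, 1.0 on a typical 24" 1080p desktop)."""
    return min(AVAILABLE_SCALES, key=lambda s: abs(s - display_scale))

print(pick_asset_scale(1.0))   # 1.0: standard assets
print(pick_asset_scale(2.25))  # 2.0: '2x' assets, slightly scaled up by the UI
```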

 

I personally believe 2560x1440 among gamers will be skipped in favor of 4k 24"/27" screens. There's simply no real market for a 1440p screen, and no specific content for it. The only reason 27" screens currently have that resolution is because it keeps the operating system usable at that particular PPI. The term 4k, or UHD, will be marketed to death by every major TV/media manufacturer in the world over the next few years. There will be more interest in it among the general public, and it will be easier to offset the cost of new manufacturing lines for 4k than for 1440p.

 

My guess is that over the next few years 1080p will become pretty much standard for 13" screens (once software supports it), with 4k for 15" and above.


Haswell integrated GPUs (Intel HD 5000) are powerful enough to play modern 3D games at low settings on a 4k screen

Play them badly, maybe, with frame drops. No one wants to play at 4K with low polygons, blurry textures and no AF. In terms of graphical fidelity it makes no sense; it would be better to render at 1080p and upscale, and the game would look a million times better. With bad frame rates, Haswell is nowhere near powerful enough.


I completely agree with what you say; if I'm looking at the way current gen games are scaled it won't make sense at all to play anything at a native 4k resolution on a small-ish screen, but future gen games will more than likely have options to optimize for it. Even PE does, to some extent, with its (assumed) scalable UI and dual area renders.

 

HD 4000 kind of plays some recent 3D games (not Crysis, CoD, or BF; think DOTA, LOL, D3...) at 4k resolution and it's *almost* playable (~15-25 FPS, with bad drops). Tested with dual 1440p monitors, which is pretty close to the pixel count of a 4k screen. HD 5000 performance shown so far looks quite promising compared to that :). HD 4000 plays full bitrate (30Mbps) 4k video like a charm, too. No, HD 5000 is not for hardcore gamers, but if you like to play a game now and then and don't care about the highest settings you'll be able to play most games, probably even at 4k, though doing so doesn't make much sense with current gen games (and their UI).

 

Low quality textures on a higher pixel density screen would look much better than playing with higher quality textures at a lowered, non-native resolution. This can easily be proven by creating JPGs at various compression levels and displaying them either upscaled or at 1:1 pixels. A highly compressed JPG at 1:1 will look much better; the PPI density more than makes up for the artifacts created (this is a big thing in web design at the moment, since a low quality, but hugely sized, JPG can actually be smaller in file size and still look better on Hi-PPI screens).
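
If you want to try that comparison yourself, here is a quick sketch with Pillow (the file names and quality values are placeholders, just to show the idea):

```python
# Reproduce the comparison described above: a heavily compressed image shown
# at 1:1 pixels versus a lightly compressed image at half resolution that
# would have to be upscaled to fill the same physical size.

from PIL import Image

src = Image.open("area_render.png")          # hypothetical hi-res source
w, h = src.size

# 1) Full resolution, heavy JPEG compression (view 1:1 on a hi-PPI screen).
src.convert("RGB").save("full_res_low_quality.jpg", quality=25)

# 2) Half resolution, light compression (what upscaling would start from).
half = src.convert("RGB").resize((w // 2, h // 2), Image.LANCZOS)
half.save("half_res_high_quality.jpg", quality=90)

# Viewed at the same physical size, the first file is often smaller on disk
# yet looks sharper on a high pixel density display.
```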

 

It's all speculation, and both the low-ppi and hi-ppi markets will co-exist for quite some time, but I firmly believe the move towards hi-ppi will be relatively fast.

"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"
Link to comment
Share on other sites

Low quality textures on a higher pixel density screen would look much better than playing with higher quality textures at a lowered, non-native resolution. This can easily be proven by creating JPGs at various compression levels and displaying them either upscaled or at 1:1 pixels. A highly compressed JPG at 1:1 will look much better; the PPI density more than makes up for the artifacts created (this is a big thing in web design at the moment, since a low quality, but hugely sized, JPG can actually be smaller in file size and still look better on Hi-PPI screens).

a) Compression levels are a different thing; we're not talking about artefacts.

 

b) Displaying jpegs is different to a 3D game.

 

c) You can test this yourself by playing a game on 720p with the highest resolution textures and playing a game on 1080p with the lowest.


I can't see 1440p becoming the new standard for laptops, at least not for the majority of budget and mid-tier ones; it's probably going to be 1600 x 900, though it really should be 1920 x 1080. 2560 x 1440 is going to be growing on the desktop, but it's going to take time to gain market share.

Like it or not, higher resolution displays are coming, and so far they are pretty much on schedule with what Intel previously said: http://www.tomshardware.com/news/Intel-Higher-Resolution-Displays-Coming,15329.html

Though I am rather sure that you are right and these Intel Haswell chipsets aren't meant for gaming at that resolution (who plays big gaming titles on a 13" laptop anyway?).

A lot of tech blogs got that wrong; what Intel was advertising is 4K video playback and encoding. This one, for example, got it right: http://www.pcworld.com/article/2037038/intels-haswell-gets-massive-graphics-performance-boost.html

 

The Haswell chip will enable laptops to play 4K video, in which images are displayed at a resolution of 3840 x 2160 pixels, which is four times that of traditional 1080p high-definition video. The graphics processor is also faster at rendering video via a feature called QuickSync, which was slower on previous Intel chips. Some new QuickSync features include faster MPEG video encode and decode.

At the moment even high-end graphics cards like the HD 7970, GTX 680 or GTX TITAN struggle to keep up with some of the most graphics-intensive games at 4K (only some dual-GPU cards like the HD 7990 or GTX 690 seem somewhat up to the task).

PC Perspective ran benchmarks for Battlefield 3, Crysis 3, DiRT 3, Far Cry 3, Skyrim and Sleeping Dogs at 4K: http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-High-End-GPUs-Benchmarked-4K-Resolutions

 

I'm personally going to upgrade soon to one of the upcoming GTX 770 or 780 Kepler refresh cards that are rolling out over the next few weeks: http://wccftech.com/nvidia-geforce-gtx-770-performance-unveiled-10-faster-radeon-hd-7970-ghz/

 

But NVIDIA's upcoming Maxwell chip architecture in 2014 will likely start running even these games acceptably at 4K, and they will market that heavily.


 

Ironically, graphics-unintensive games like Project: Eternity would be a great showcase for 4K: being largely pre-rendered, they wouldn't require that much power and would likely even run fine on the aforementioned Intel chipsets. Realistically, I'm not counting on it coming out before 2015 anyway.


Ironically, graphics-unintensive games like Project: Eternity would be a great showcase for 4K: being largely pre-rendered, they wouldn't require that much power and would likely even run fine on the aforementioned Intel chipsets. Realistically, I'm not counting on it coming out before 2015 anyway.

 

This is the very reason I started the whole thread about proper high DPI support for PE textures! It'd be an amazing showcase, and would attract quite a bit of attention as such. If you've ever looked at high resolution images on a retina MBP beside the same image displayed on a "regular" laptop screen, you get an idea of just how good the game could look, without requiring insane hardware configurations.

 

Anyway, we're well beyond the stage where it may have made sense to consider it for the game, but given enough popularity, maybe we'll see an ultra-high texture set released at some point in the future. Post-release :). Maybe they could count the man-hours required and kickstart it :biggrin:.


You're totally right, OP; this matter should already be taken more seriously by developers. It can help their work stand the test of time and give the user a better presentation, displayed just as intended.

 

I'm afraid most gamers don't appreciate it, but it's really a pity to play a game like Metro Last Light at 1440p, a wonderful game with great graphics, art and textures, only to get 720p 2D UI elements. It doesn't ruin the experience, but it keeps it from feeling polished. And it's like this with most contemporary games.


Can't wait for Computex to start; some early 4k announcements are starting to leak out. 4k monitors to be available before the end of June, it seems :). That lines up nicely with the release of Haswell, which will prompt new motherboards with DisplayPort/HDMI outputs that can drive a 4k signal. Sweet.

 

http://www.engadget.com/2013/05/31/asus-unveils-31-5-inch-4k-monitor-ahead-of-computex/

"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"
Link to comment
Share on other sites

  • 1 month later...

I know :D.

 

These are professional grade monitors with 10bit colour depth, though, meeting various specifications & standards, which pushes the price up. There's a 4k laptop on the market, with 3 more landing before the end of the summer. WQHD is getting popular, too, on 13" devices (I think I could name 4 or 5 off the top of my head that'll be out before the end of the summer). To make such devices usable, Windows 8.1, due in August, will come with improved DPI scaling.

 

The mere fact that there are entire 4k laptops on the market that cost less than half as much as a pro grade 30" 4k monitor should be a sign that consumer grade ones will soon hit the market, too. When they don't have to meet high industry standards the price comes down a lot. Most 27" pro grade WQHD monitors cost between $3-5k today; compared to those, the new 30" 4k ones are a bargain.

 

No, these aren't going to be "gamer" laptops, but then, not all of us are hardcore gamers with dedicated machines; this is the stuff that will be owned by the general public :).

 

Since my last post another 2 pro grade 4k screens have been announced, they will likely be in the same price range. Waiting eagerly for consumer grade eIPS/PLS ones :).


Obviously you'd also want to skip HDMI-based displays until you can get one with HDMI 2.0. You could also get a monitor that uses two dual-link DVI cables right now - I'd wait for DisplayPort 1.2 though.

You'll be fine with HDMI 1.4b and a high speed HDMI cable; that's the HDMI standard introduced for 4k and it should start showing up on motherboards soon enough! It's the equivalent of the DP1.2 controller.

"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"
Link to comment
Share on other sites

If they come up with an IPS-equivalent 120Hz panel, I'm game. ;)

I think TN screens will remain the standard for people worried about monitor response times for quite some time :(. I'd love to see a high-refresh IPS screen with low response times (though I'm more worried about colour accuracy and pixel density :p). Dell's new eIPS panels compare well to even overdriven 120Hz TN panels in tests: http://www.tftcentral.co.uk/reviews/dell_u2713hm.htm

"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"
Link to comment
Share on other sites

Obviously you'd also want to skip HDMI-based displays until you can get one with HDMI 2.0. You could also get a monitor that uses two dual-link DVI cables right now - I'd wait for DisplayPort 1.2 though.

You'll be fine with HDMI 1.4b and a high speed HDMI cable; that's the HDMI standard introduced for 4k and it should start showing up on motherboards soon enough! It's the equivalent of the DP1.2 controller.

If you're satisfied with a 24Hz or max. 30Hz refresh rate, then yes. That's the highest you'll get from HDMI 1.4a/b though. HDMI 2.0 support is what you'd want from an HDMI-based 4K monitor.
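
The arithmetic behind that limit is straightforward. A back-of-the-envelope check (my own numbers, ignoring blanking intervals, using the commonly quoted usable data rates for each link):

```python
# Uncompressed video bandwidth at 24 bits per pixel, blanking ignored, so
# real-world requirements are somewhat higher than these figures.

def video_data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_1_4 = 8.16    # Gbit/s of video data (10.2 Gbit/s TMDS minus 8b/10b overhead)
HDMI_2_0 = 14.4    # Gbit/s of video data (18 Gbit/s TMDS)
DP_1_2   = 17.28   # Gbit/s of video data (4 lanes of HBR2)

print(video_data_rate_gbps(3840, 2160, 30))  # ~5.97  -> fits within HDMI 1.4
print(video_data_rate_gbps(3840, 2160, 60))  # ~11.94 -> needs HDMI 2.0 or DP 1.2
```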


If they come up with an IPS-equivalent 120Hz panel, I'm game. ;)

There are overclockable ones. I have a QNIX QX2710 PLS panel that overclocks to 1440p 120Hz, unfortunately it's damaged so I'm getting a warranty replacement. The screens are a bit hit and miss though (with backlight bleed and such).

 

The Catleap 2B extreme ones are a safer buy but they are expensive.


 


If you're satisfied with a 24Hz or max. 30Hz refresh rate, then yes. That's the highest you'll get from HDMI 1.4a/b though. HDMI 2.0 support is what you'd want from an HDMI-based 4K monitor.

 

Looks like you're right, if I understand this table correctly: http://en.wikipedia.org/wiki/HDMI#Version_comparison

 

I was convinced that 1.4b (the double-link, high-bandwidth version) controllers were *the* 4k HDMI standard, and that they simply hadn't made it to market yet (afaik, they haven't been included in any consumer products so far).

"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"
Link to comment
Share on other sites
