
DPI Scaling for PE, much more important than resolution



First of all, it's not as bad as some here have suggested. Yes, it will blur a bit when running at a lower resolution than the screen's native one, and yes, the difference will be noticeable. It will still be perfectly playable, though; just try it yourself by turning down the resolution in game or in Windows, and it remains acceptable. It will look worse than it otherwise could, but it is far from unplayable at lower resolutions. So while a DPI scaling setting would definitely be preferred if included, it is certainly not a must-have; running the game at a lower resolution would also be perfectly fine.

One other thing to keep in mind: some players still play on 27-inch or even 30-inch screens with the same resolution as a 21/23-inch screen (1920x1200 or 1920x1080). Those people could also benefit from DPI scaling; instead of the UI being blown up way too much, they could bring it back down to a more reasonable size.

 

Another thing to keep in mind: the DPI of a laptop should generally be higher than the DPI of a desktop monitor, which in turn should be higher than the DPI of a television. As viewing distance increases, elements should scale up accordingly.

 

The last factor is personal preference: some people simply like UI elements smaller than others do.

 

All of these things point in favor of a user setting rather than a system that detects it automatically.

Edited by qlum

While I agree that pre-rendered artwork should be done at as high a resolution as possible, focusing on DPI exclusively would be a mistake - it would result in scenery elements being rendered at the same (real) size onscreen regardless of the device (e.g. a statue appearing 5cm tall, regardless of whether the device used is a smartphone or a 30-inch monitor).

 

Instead, I would argue in favour of the system used by many games currently, where you specify a resolution and the game engine downsamples its images to best suit that resolution (if zoom is available, that downsampling will need to be at the highest zoomable resolution).

 

Mstark's point about upscaling resulting in a blurry picture (in most cases) is valid - but the cause of this is not the upscaling itself, but the filtering applied afterwards by the monitor or graphics card driver. Such filtering is often beneficial with non-integer upscaling (to smooth out otherwise uneven pixel distribution) but should be optional for integer scaling.

 

Unfortunately, most monitors and graphics drivers do not provide an option to disable or adjust such filtering.

 

A workaround for this is for the game engine to always render at the display's native resolution and take care of upscaling itself (DOSBox is a good example of this: set its window to the display's native resolution and its output to openglnb, and no filtering is applied, resulting in sharp, clear rendering of older games). It can then offer a choice of scaling/filtering algorithm to suit users' tastes (some may prefer a little smoothing over blocky 2x2 or 3x3 pixel rendering).
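To illustrate the difference between the two approaches, here is a minimal sketch using Pillow in Python. The file names are placeholders, and it only demonstrates unfiltered (nearest-neighbour) integer scaling versus filtered (bilinear) upscaling, not how any particular engine or driver does it:

# Sketch only: contrast unfiltered 2x integer scaling with filtered upscaling.
# "background.png" is a placeholder for a pre-rendered 2D asset.
from PIL import Image

src = Image.open("background.png")
w, h = src.size

# Nearest-neighbour: every source pixel becomes a crisp 2x2 block (no blur),
# the equivalent of DOSBox's unfiltered openglnb output.
sharp = src.resize((w * 2, h * 2), Image.NEAREST)

# Bilinear: smoother edges, but the visible blur discussed above.
smooth = src.resize((w * 2, h * 2), Image.BILINEAR)

sharp.save("background_2x_nearest.png")
smooth.save("background_2x_bilinear.png")

With an integer factor the nearest-neighbour result stays pixel-perfect; at non-integer factors some filtering is usually the lesser evil.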

 

Such scaling could affect performance, but the impact should be marginal, and it would seem to be the best method of "future proofing" a game for higher resolutions.

I'm just thankful that the wave of smartphones finally brought high-DPI screens to the attention of a broader market; that way, monitor manufacturers can't continue to ignore this issue as they have been doing for ages (to great profit for them).

 

I'm sorry to have to dredge this up from so many posts back, but to suggest a monitor manufacturer "conspiracy" against higher resolution devices is highly speculative. A more likely reason is technical - DVI-D reached its bandwidth maximum at 2560x1600x60fps and a new video connection standard has to be agreed between monitor and graphics card manufacturers before higher resolutions can be offered (the current anointed successor to DVI-D, DisplayPort, only offers a modest boost - even the 4-lane version maxes out at 2560x1600x120fps).

 

Cross-sector co-operation on new standards (sadly) tends to lag high-end consumer demand - previous examples include removable media (no standard agreed after 1.44MB floppies), expansion buses (ISA lasting 5+ years longer than it should have before VL-Bus and PCI displaced it) and, going back to displays again, SuperVGA standards (where games spent almost a decade unable to progress beyond MCGA's 320x200 256-colour graphics due to proprietary implementations of 640x480 and higher resolutions).

 

So this is more fairly described as a screwup rather than a conspiracy - on the other hand, it has meant high-end monitors having a useful lifespan longer than any other computer peripheral.

Edited by AstralWanderer

Calling it a conspiracy is indeed a bit extreme, but given how long it usually takes to agree on new standards, and to manufacture parts that support them, I can't help believing there's been some form of stalling going on (as there always is... heck, we're still stuck with disc-format media when solid state is both cheaper and higher-capacity!). It's true that the market's barely ready for high DPI screens as it is: software will have to catch up, and more programs will need DPI scaling options - imagine using, say, Office on a 400 DPI screen.

 

With DPI scaling I'm not suggesting making it auto-detected (e.g. a statue always appears 5 cm tall on every screen and there's nothing you can do about it), but user-selectable. A user can select any DPI scaling they want, out of the available options (or unlimited options, if the engine can scale down the original high-res artwork in real time without lag), making the UI appear larger or smaller, and backgrounds closer or farther, at will, while keeping the game at native resolution.
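As a purely hypothetical sketch of what such a user-selectable setting could look like under the hood - the 96 DPI reference value, the option list, and the function name are my own assumptions for illustration, not anything Obsidian has described:

# Hypothetical sketch: the game stays at native resolution and only multiplies
# UI/asset sizes by a user-chosen scale factor. Reference DPI and options assumed.
REFERENCE_DPI = 96
SCALE_OPTIONS = [1.0, 1.25, 1.5, 2.0]  # what the player would pick from

def scaled_size(base_px: int, scale: float) -> int:
    """Physical pixel size of a UI element authored at base_px for ~96 DPI."""
    return round(base_px * scale)

# A 32 px portrait frame at 2x occupies 64 physical pixels, so it keeps roughly
# the same physical size on a ~192 PPI screen as on a ~96 PPI one.
print(scaled_size(32, 2.0))  # 64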


The OP has a point. While it might be reasonable to concentrate on resolutions of the current generation, given the limited budget and timeframe, Obsidian should at least plan for higher DPI. The DPI options available at release will depend, of course, on how much they affect the art budget. In any case, I'm sure Obsidian will either provide higher-DPI options as an optional download themselves or release high resolution assets for modders to finalize. They seem to care about these kinds of things. :) (I'm thinking of KOTOR2 and the cinematic+sound upgrades.)

 

EDIT: Better scaling could make the point somewhat less important down the road, but still.

Edited by Arhiippa

...heck, we're still stuck with disc-format media when solid state is both cheaper and higher-capacity!

 

Uhhh solid state isn't as long-lasting and reusable as disc-format.


It's true that the market's barely ready for high DPI screens as it is...

In some ways, you've answered your question - market barely there and not worth the risk (especially with the current economic climate). I have a Dell 30" monitor (3007WFP-HC) with 2560x1600 resolution, so could be considered a likely candidate for a large super hi-res monitor. However I'm more than happy with the image quality and the combination of ClearType (for text) and AA (for games) deals with any visible pixellation. I'd need more than a modest boost in resolution/DPI to consider a monitor/graphics card upgrade (two cards since I use SLI) likely to cost several hundred pounds.

 

The other side of the coin is that display innovation always starts small - OLEDs came to phones/MP3 players first before migrating upwards to laptops and we're seeing the same with "retina" displays. Now fair do's to Apple, they've implemented a new piece of hardware before anyone else (with their top-to-bottom control of software and hardware they're in the best position to) as they did with USB connectivity on the original iMac - and that will likely percolate through to the PC market in time. But the cost will be high and the benefits small, save for those involved in pro image processing. And even Apple have had to start small with this - I doubt we'll be seeing "retina" on an Apple desktop for a year or so yet.

 

As for the comparison between hard discs and solid state storage, I don't know where you got your figures from, but the cost of a 4TB hard drive is currently about £0.046/GB while the largest SSD I could find (1TB) costs £1.70/GB - roughly 37 times the cost.

A user can select any DPI scaling they want... making the UI appear larger or smaller, and backgrounds closer or farther, at will, while keeping the game at native resolution.

 

In which case, how does this differ from selectable resolution which most games offer currently? It would indeed be better to stick with resolution as that is a term more familiar to most gamers than DPI - or if in-game scaling is to be offered, just choosing the scaling factor/algorithm.

 

What this seems to come down to is asking for the highest quality renders and using the display's native resolution so the game engine can control upscaling/filtering. That seems like the best approach for current and future PC systems.


As for the comparison between hard discs and solid state storage, I don't know where you got your figures from but the current cost for a 4TB hard drive is currently about £0.046/GB while the largest SSD I could find (1TB) costs £1.70/GB - nearly 50 times the cost.

Oops, I was a bit unclear there. I meant solid state as compared to disc media like DVDs and BluRay. The price difference is much smaller there, at ~£6 for a 25GB BD-RE vs a 32GB memory stick for ~£10. Difference in production/delivery costs would be even smaller. The problem, again, is that the market isn't ready for it.

 

Much of what I say here you might already know/realize (TL;DR: it explains why the term "resolution" is dying and DPI scaling is living. Changing resolution allows a screen to be experienced sub-optimally; a cheap way of "zooming", if you will. DPI scaling has the same effect, but doesn't lose screen real estate or quality by lowering the number of rendered pixels).

 

In which case, how does this differ from selectable resolution which most games offer currently? It would indeed be better to stick with resolution as that is a term more familiar to most gamers than DPI
It'd differ, because you could choose for that statue to appear any size you'd like, while being rendered at your screen's native resolution (with an upper limit, of course). Apple decided to completely remove the familiar resolution switch because it's actually kind of idiotic to allow a user to switch away from the native resolution of a screen, since it will give that user an inferior experience. So what do you do, instead of allowing a user to choose a non-native resolution, to keep the UI from appearing tiny? Apple handles this by providing a few (2?) different sets of graphical assets that the user can toggle between (which automatically adjusts font sizes, too); that way, a user can choose the option that fits them (usually depending on eyesight), without losing screen real estate and quality by using a lower, non-native resolution. This is also the solution the web is using for presenting imagery on high DPI screens.

 

- or if in-game scaling is to be offered, just choosing the scaling factor
If I correctly understand what you mean here, this is exactly what I'm suggesting they should do. Get rid of the possibility of playing the game at non-native resolutions, and offer DPI toggles that let a user choose the zoom/UI size. (Resolution toggling is a must for 3D games that are limited in polygon count by your graphics card, but for a top-down 2D game it's sub-optimal.) As Apple understood when removing resolution options from its Retina MBP, it's just pure bad business to allow a customer to experience the screen at anything less than its full capability. It's to protect users from themselves; the average user wouldn't realize that using a screen at a non-native resolution is what makes it look ****. DPI scaling is going to become a thing, unfamiliar as it is, even if it means more work for software devs and graphical artists. (Today, apps that want to make it in the Apple App Store are pretty much required to provide two sets of assets, one normal and one high DPI, or they'll look bad.)
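A rough sketch of that "two asset sets" idea in Python; the file-naming convention and the 1.5 cut-off are illustrative assumptions of mine, mirroring how @2x assets are commonly selected on the web and in iOS apps:

# Sketch: pick the high-DPI asset set when the effective scale factor is high
# enough, otherwise the normal set. Naming and threshold are assumptions.
def pick_asset(name: str, scale_factor: float) -> str:
    return f"{name}@2x.png" if scale_factor >= 1.5 else f"{name}.png"

print(pick_asset("statue", 1.0))  # statue.png     (~96 PPI desktop)
print(pick_asset("statue", 2.0))  # statue@2x.png  (high-DPI / "retina" screen)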

 

As I'm sure you understand, just offering different resolution options would not solve the issue with the statue becoming tiny on high resolution monitors, you'd have to choose to play the game at a non-native resolution. Something I just wouldn't do with a 2D game. If there's no scaling, a single set of assets is going to appear large on low-res monitors, normal on our monitors, and tiny on newer monitors.

 

I believe it indeed comes down to what you're saying in your final paragraph. I would, however, like to keep manual control over how "zoomed in" I want my game to appear. If the engine can't handle downsampling of high-res artwork in real time (the ideal), that'd mean they'd have to provide us with a way of toggling between 2-3 different pre-rendered sets, which is what Apple is doing on its Retina screens, and what I am doing, day to day, making websites whose logos and imagery need to appear sharp on every kind of screen without appearing too small or too large. People and businesses pay a lot for this; it's becoming a big thing.

 

I hate mentioning Apple so much, really not a fan, but they are very much helping to push for a paradigm shift in display technology that's long overdue. Their user-friendly solutions (e.g. DPI scaling) make the potential problems that high DPI screens bring seem trivial, and like something we should have been doing from the start. We're now so stuck with games and programs highly optimized for 100-120 DPI that it's hard to move away from it.

Edited by mstark

Apple handles this by providing a few (2?) different sets of graphical assets that the user can toggle between (which automatically adjusts font sizes, too)... This is also the solution the web is using for presenting imagery on high DPI screens.

Here is your problem: this is how Apple handles it. Apple also says to the consumer, "this is your hardware and it isn't negotiable". Their solutions simply don't work outside of their own hardware. There could be people who want to play this game on five-year-old (maybe older) PCs with who knows what kind of monitor, people who will play it on multiple monitors, people like me who use a 40+ inch HDTV as my "monitor"; some people may even run their PC through a high-end projector for all we know. It is not as simple as you are trying to make it sound. And that's before we even get into considerations like OS compatibility (people still use XP, yeah I know), graphics solutions, and everything else.

 

Myself, for example... until online video streaming reaches the point where we can go beyond 1080p (this isn't happening ANY time soon), there is literally no point in moving to a retina monitor, a higher resolution monitor, or anything else. You have to realize that, as of right now, even video capture devices - save for insanely expensive, top-end, professional-grade hardware - can't capture 1080p at 60 frames to begin with.

Edited by Karkarov

If I correctly understand what you mean here, this is exactly what I'm suggesting they should do. Get rid of the possibility to play the game at non-native resolutions, and offer DPI toggles to allow a user to choose zoom/UI size with a DPI toggle.

We do seem to be talking about a similar thing now - but rather than specifying DPI (which most people won't know - how many monitors mention DPI in place of resolution?) I'd suggest listing scaling options (with sample graphics, like a statue, drawn to different scales to show the difference).

 

Defaulting to native resolution will make sense most of the time but there should be an override to allow users to specify a preferred resolution instead (to cover situations like non-EDID displays where native resolution figures can't be determined or emulators/terminal servers where a "virtual" monitor may be in use).

Edited by AstralWanderer

Too much apple fanboi, web-dev, not enough actual sense being spoken here. I ain't even going to bother; I'd sooner argue religion with an extremist muslim.

 

Edit: This thread is pointless.

Edited by Nightshape

We do seem to be talking about a similar thing now - but rather than specifying DPI (which most people won't know - how many monitors mention DPI in place of resolution?) I'd suggest listing scaling options (with sample graphics, like a statue, drawn to different scales to show the difference).

 

Defaulting to native resolution will make sense most of the time but there should be an override to allow users to specify a preferred resolution instead (to cover situations like non-EDID displays where native resolution figures can't be determined or emulators/terminal servers where a "virtual" monitor may be in use).

Many high DPI screens nowadays do mention their DPI (often as PPI) along with resolution (or, in the case of Amazon's Kindles, just PPI). (Not saying they shouldn't also mention the resolution.)

 

These scaling options you mention, if we are talking about the same thing, directly relate to scaling the assets to be appropriately sized for a monitor's DPI, hence calling it DPI scaling (the term exists, I didn't invent it, there just isn't much awareness of it yet). The in-game option could be named anything, with a user-friendly explanation to go with it, but what I think you are talking about is scaling assets so they remain readable/usable at that particular screen's DPI. DPI scaling, or "to make the game look as good as possible, use this option to scale the game assets to look good at your screen's native resolution".

 

The term DPI scaling, which I agree may sound stupid, only had to be invented because we're so used to ~120 DPI. Today, everything (font sizes, user interfaces, images) is optimized to be readable at this DPI. Now that we're seeing screens twice this DPI, assets need to be "scaled" to twice the size to look similar, and remain readable, on these screens.
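For what it's worth, the "twice the size" figure falls straight out of the numbers; a quick back-of-the-envelope sketch (the panel sizes here are just examples I picked, not anything specific to PE):

# PPI from resolution and diagonal size, compared against the ~96 PPI norm.
from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    return hypot(width_px, height_px) / diagonal_inches

desktop = ppi(1920, 1080, 23)    # ~96 PPI, the long-standing desktop norm
retina = ppi(2880, 1800, 15.4)   # ~220 PPI, a "retina"-class laptop panel
print(retina / desktop)          # ~2.3, i.e. assets want roughly 2x scaling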

 

In the future we might see 4x DPI scaling; if someone invents a better name for it, I'm just as happy with that. The term only exists because we're stuck with a DPI "norm" that should never have been there to begin with. We'll likely see more DPI-agnostic design approaches in the near future (hello Windows 8) that will work on any screen, not just ones with a "normal" DPI. Instead of changing resolutions, we'll be changing the "scale" of screens, never moving away from native resolution.

 

The shift will certainly be confusing for consumers. Apple (as much as it pains me to say it) gallantly sidestepped this by calling their screens "Retina", popularizing high DPI while avoiding the potentially confusing discussion about DPI, and turning it into a massive marketing term. They took the (long needed) step of removing the ability to change the resolution to anything but native while in the desktop environment, since the only thing that really should matter is scaling the screen's fonts and assets to be readable to you, while remaining at native resolution. If you're not playing a 3D game, changing your resolution away from native is nothing but a cheap way of "zooming", one that saves the company from having to prepare optimized high- and low-DPI assets.

 

Yeah, I suppose for some situations they'd have to keep a resolution option. We'll also always need resolution options for as long as cards and cables have upper resolution limits, and for 3D games' FPS performance.

Edited by mstark
"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"
Link to comment
Share on other sites

Predicting the future is...difficult. The next big thing in computer monitors may be OLED monitors. Personally I'm kind of waiting for that to upgrade my monitor. In that case resolution and/or PPI may not increase dramatically any time soon. Of course, eventually it is safe to assume that both PPI and screen size will increase dramatically. But this "eventually" may be more than just a single decade. Not everyone believes that the jump from CRTs to LCDs was progress. Particularly for gaming. Originally LCDs were adopted by most people just because they looked better on their desks and they were lighter to carry. But the market spoke and that was the end of CRTs. If I move my mouse pointer very quickly across my screen I see several of them. On a CRT I would just see one. There are LCD monitors that are better at fast motion, but few if any of those are S-IPS panels. Most are TFTs with their limiting viewing angles and other faults.

 

I would certainly be in favor of Obsidian trying to guess at what the most likely future resolution will be (maybe 3840x2400 (2x 24") or 3840x2160 (4K/2160p) or 5120x3200 (2x 30") or 7680x4320 (8K/4320p)). Any of these choices would mean a lot more pixels than 1920x1200. If you assume 32 bits per pixel, then you end up with 9,216,000 bytes, or around 8.8 MB of storage, for each screen of pixels you draw. 4K monitors are probably the most likely type, so 3840 x 2160 x 32 bpp would give you 31.64 MB of storage per screen, or 3.6x the amount of storage. If they are worried about needing to deliver Blu-rays instead of DVDs, then planning for 4K monitors will need more than 3 times as much disc space. Of course the real unanswered question is whether planning for 4K displays would create more work for the artists. Presumably the point of all those pixels is to display something more detailed than you could otherwise. I've always wanted to see the kind of detailed drawings you see in concept art drawn in the actual game world, but that implies a certain photorealistic LoD that is supposed to be very expensive.
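The per-screen storage arithmetic above checks out; reproducing it as a quick sketch (32 bits = 4 bytes per pixel):

# Uncompressed size of one screen's worth of pixels at 4 bytes per pixel.
def frame_mib(width, height, bytes_per_px=4):
    return width * height * bytes_per_px / 2**20

print(frame_mib(1920, 1200))  # ~8.79 MiB
print(frame_mib(3840, 2160))  # ~31.64 MiB, about 3.6x the 1920x1200 figure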


Now fair do's to Apple, they've implemented a new piece of hardware before anyone else (with their top-to-bottom control of software and hardware they're in the best position to) as they did with USB connectivity on the original iMac - and that will likely percolate through to the PC market in time.

No, no they didn't, just like they didn't "invent" smartphones, tablets, MP3 players, multitouch, or any number of other things they are being lauded for - and claiming so makes the people who say it look hilarious. They just take something that has existed for half a decade, make it all shiny, triple the price, put their logo on it, and sell it like it's the newest gospel...

 

http://www-03.ibm.com/press/us/en/pressrelease/1180.wss

http://en.wikipedia.org/wiki/4K_resolution#List_of_4K_monitors_and_projectors <-- see Apple anywhere on there?


Predicting the future is...difficult. The next big thing in computer monitors may be OLED monitors. Personally I'm kind of waiting for that to upgrade my monitor. In that case resolution and/or PPI may not increase dramatically any time soon. Of course, eventually it is safe to assume that both PPI and screen size will increase dramatically. But this "eventually" may be more than just a single decade.

If larger OLED screens were to be stuck at current pixel density levels for that long, it seems likely that many users would opt for high-res LCDs instead. Looking at the mobile space, where OLEDs are relatively common now, OLEDs do trail LCDs a bit in terms of pixel density, but the Galaxy Note II still comes with a highly respectable 267 PPI 5.5" screen.

 

There are LCD monitors that are better at fast motion, but few if any of those are S-IPS panels. Most are TFTs with their limiting viewing angles and other faults.

Pet peeve: You probably mean TN (twisted nematic). Practically all active matrix LCDs and OLEDs are TFTs (the thin film transistor layer is the active matrix).


I agree with the OP whole-heartedly.

 

People calling this thread useless clearly have no understanding of how upscaling works, or of what it is altogether. What the OP is suggesting makes perfect sense, and he has explained it in every possible way (even with graphical examples). There really is no reason to not make P:E future-proof, and account for higher DPI screens.


https://plus.google.com/+LinusTorvalds/posts/ByVPmsSeSEG

 

A must read for this thread and the devs. When people like Linus Torvalds are arguing for a change in resolution (I know this is DPI not resolution but they are very related), then this is something that the devs should take very seriously.


https://plus.google....sts/ByVPmsSeSEG

 

A must read for this thread and the devs. When people like Linus Torvalds are arguing for a change in resolution (I know this is DPI not resolution but they are very related), then this is something that the devs should take very seriously.

Haha, I love that man and his eloquence.

 

Higher resolution on monitors across the board does indeed equate to higher DPI, and the whole problem with "small fonts" is exactly what the scaling discussed in this thread aims to solve, along with graphical asset quality.

"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"
Link to comment
Share on other sites

I very much agree with the OP. I find it really hard to play old RPGs because everything in games designed around 640x480 becomes either incredibly tiny or incredibly blurry on my 20" 1280x1024 monitor (let alone my 40" HDTV). This will only become more of a problem on the higher-DPI screens he is talking about.


  • 4 weeks later...

For anyone interested, the first few consumer grade 4k monitors are starting to ship soon: http://www.engadget....zo-lcd-monitor/

 

While this model is aimed at professionals, at a steep price point, I'm fully expecting CES in 2013 to sport a number of 4k monitors, and graphics cards that support them.

 

While high DPI desktop monitors are premium now (at $5,500 they're priced similarly to what 1080p monitors/TVs cost when they were new), mid-2014 will certainly see them entering a more mainstream market :).

Edited by mstark

  • 2 weeks later...

I'm all for making the game look good in the future, if it doesn't cost extra, but is 2x DPI free in computer power?

 

I might be ignorant on the topic, but with Ultra HD/4K at what? 3840 x 2160? That'll be fun to pair with a PC, when the CPU speed benefits have close to halted and new development is more about power consumption than speed. An overclocked last-gen Intel I7 paired with a Radeon HD 4890 can hardly run ArmA2, a game from 2011, at 1280 x 1024 in half-assed quality, with a very low ~35 frames per second (no, the GPU is not the bottleneck).

 

So if this cost any extra load on the CPU or extra money in development, I'd rather see other areas prioritized instead. Again, unless it is free in both money, time and PC specification requirements, but I have no idea if it is? Feel free to enlighten me though.

Edited by TheForumTroll

I'm all for making the game look good in the future, if it doesn't cost extra, but is 2x DPI free in computer power?

 

I might be ignorant on the topic, but with Ultra HD/4K at what? 3840 x 2160? That'll be fun to pair with a PC, when the CPU speed benefits have close to halted and new development is more about power consumption than speed. An overclocked last-gen Intel I7 paired with a Radeon HD 4890 can hardly run ArmA2, a game from 2011, at 1280 x 1024 in half-assed quality, with a very low ~35 frames per second (no, the GPU is not the bottleneck).

 

So if this cost any extra load on the CPU or extra money in development, I'd rather see other areas prioritized instead. Again, unless it is free in both money, time and PC specification requirements, but I have no idea if it is? Feel free to enlighten me though.

 

1. Of course it's not free in terms of fill rate, memory, and cycles.

 

2. CPU speed improvements have not come close to halting.

 

3. ArmA 2 is not a common case: it's a sim, the view distance is large, and the number of units is high. It's not the most optimized game, and it doesn't matter when the game was made (the image posted says the benchmark was done in 2009). Project Eternity is a very different kind of game.

 

4. You say the quality is half-assed, but the AF is 16:1, which needs a lot of memory bandwidth.

 

5. The CPU is not last-gen; there are two newer generations.

 

6. It's not going to be free in time and money, but you don't have to use it, so the specification requirements should stay the same.


Since we're talking about outputting mainly 2D graphics, there won't be any problems whatsoever with 4K performance - one of the huge benefits of PE compared to the rest of the market is that it'll look fantastic in 4K and still be playable, as compared to 3D games where polygons and textures matter. Comparing PE to a badly optimized 3D game (hello ArmA or Crysis) isn't going to tell you anything - you can't compare rendering polygons plus applied textures against rendering just textures (2D assets). You might experience some lag (in the game it will show up as loading times) if you open a 20k by 20k JPEG, and a 2x one would be 40k by 40k pixels. The only thing you'd really have to worry about is the 3D assets: simply scaling them to twice the size will not increase their polygon count and won't affect performance, but you might run into problems on graphics cards with little memory if you increase the texture quality of the 3D models.
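To put some rough numbers on that "20k by 20k" example (my own back-of-the-envelope figures, assuming the whole image is decoded at 4 bytes per pixel rather than streamed in tiles):

# Decoded (uncompressed) memory for a single huge background image.
def decoded_gib(width, height, bytes_per_px=4):
    return width * height * bytes_per_px / 2**30

print(decoded_gib(20_000, 20_000))  # ~1.5 GiB
print(decoded_gib(40_000, 40_000))  # ~6.0 GiB - hence tiling/streaming, not one texture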

 

Many current-gen graphics cards are capable of outputting 4K (the most common "4K" resolutions being 3840 x 2160 and 4096 x 2160), including Intel HD 4000 and current-gen AMD/nVidia cards. The problem is that motherboards, cables, and ports are behind (understandably so; there are exceptionally few 4K screens on the market). The HDMI 1.4 specification supports 4K video over a single cable, but as far as I know no motherboard or card has implemented it to its full capacity yet (and why should they? They'd much rather sell new motherboards when 4K screens become available than include it now).

 

Just wait for CES in January, and we should start seeing the standard being more adopted by manufacturers, now that the screens are on their way :)

Edited by mstark
"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"
Link to comment
Share on other sites

  • 3 weeks later...

For anyone not understanding what this thread is about, this post by Sawyer shows it very clearly:

 

http://forums.obsidian.net/topic/63020-project-eternity-update-36-off-to-our-elfhomes-but-first/page__st__160#entry1293137

 

I'm glad the team is taking exactly the route I outlined in the first post of producing high/low resolution assets that'll scale better to different DPI screens :).

"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"
Link to comment
Share on other sites

  • 3 months later...

I'm glad the team is taking exactly the route I outlined in the first post of producing high/low resolution assets that'll scale better to different DPI screens :).

But they will reach a limit at 2560x1440 monitors. If you have a higher-res monitor, you will see more of the map, but everything will look very tiny on a 24" 4K display, for instance. If PE had even higher resolution assets, we could have one more level to use for future "retina" PC displays.

 

http://techreport.com/news/24675/toshiba-new-13-inch-ultrabook-sports-2560x1440-display

And they are coming already! Wouldn't this portable be useless for PE at native resolution? You would have to set it at 1920x1080 to play PE and still be able to read the text.

 

Yes, PE has a 2560x1440 mode, but that mode is for 27" displays. That is why we need three DPI levels instead of just a "high res" and a "low res" mode. People really need to get used to the DPI option of OSes and games as retina PC displays arrive.

Edited by aka.mecha
