Solviulnir the Soulbinder Posted October 29, 2012
Am I the only one who doesn't mind playing IE games in their native resolutions? Ad rem: times change; these days games that weigh in at 7-10GB don't raise any eyebrows, and some pack more than 15GB... Hell, even patches today are bigger than BG1 and BG2 taken together. Terabyte hard drives are nothing unusual and their capacity will grow every year. Having said that, and bearing in mind that compression algorithms for both 2D images and 3D asset files are somewhat more efficient than those used circa 10 years ago, I'd say that demanding that developers create a *huge* game and fit it on one CD is... well... I'm cool with PE being distributed on 2 Blu-rays. That way at least I'm sure the developers aren't forced to cut any content from the game. Don't get me wrong, I'm all for space-efficient development, and it would be awesome if Obsidian made PE as lightweight as .kkrieger, for example. But let's face it, it will most probably not be a small game (which is good). As for future-proofing the game, I just hope Obsidian's backups of the PE source code/assets are better secured in the future than Black Isle's were for their BG games.
Adhin Posted October 29, 2012
They've said they plan to handle the image size based on 1080p, 16:9, going off a ratio close to IWD2 and BG2. The image they released is just a section of that map, which is probably a good 3 times the width and height, or roughly around that. It should be fine, plus compression of massive images scales extremely well for installation purposes. Either way, think of that image as a screenshot (because it is), not a full map image.
Def Con: kills owls dead
AndreaColombo Posted October 29, 2012
"I can't think of an isometric RPG that has come on 6 CDs. Baldur's Gate comes on 5 CDs, Baldur's Gate II on 4 CDs, Diablo II on 4 CDs, Icewind Dale on 2 CDs, Icewind Dale II on 2 CDs, Arcanum on 2 CDs, Planescape: Torment on 1 CD, Diablo on 1 CD."
Baldur's Gate (5 CDs) + Tales of the Sword Coast (1 CD) = 6 CDs
"Time is not your enemy. Forever is." — Fall-From-Grace, Planescape: Torment
"It's the questions we can't answer that teach us the most. They teach us how to think. If you give a man an answer, all he gains is a little fact. But give him a question, and he'll look for his own answers." — Kvothe, The Wise Man's Fear
My Deadfire mods: Brilliant Mod | Faster Deadfire | Deadfire Unnerfed | Helwalker Rekke | Permanent Per-Rest Bonuses | PoE Items for Deadfire | No Recyled Icons | Soul Charged Nautilus
Starglider Posted October 30, 2012
Support for ultra-high-resolution displays does not have to be anything more than 'upscale the map & GUI textures using a high-quality interpolation algorithm'. That is already better than simply running at a lower (non-native) resolution on your monitor, which is frankly already fine for a game like this. I say this as an owner of two 2560x1600 screens; for old Infinity Engine games I run at 1280x800 and let the monitor scaler double the pixels. Sure, if the devs are supersampling the backgrounds anyway then keeping the high-res masters around for a possible future ultra-HD release seems sensible, but a separate dedicated render sounds like another unwanted boat-anchor in a very packed dev schedule.
Windhaven : fantasy flight adventure : now on Steam Greenlight
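The pixel-doubling that Starglider describes (running at exactly half the native resolution and letting the scaler repeat each pixel) is the simplest upscale there is. A minimal, illustrative sketch, not anything from the Infinity Engine or a real monitor scaler:

```python
def upscale_2x(pixels):
    """Nearest-neighbour 2x upscale: each source pixel becomes a 2x2 block.

    `pixels` is a list of rows, each row a list of pixel values
    (e.g. (r, g, b) tuples). Returns a new image twice as wide and tall.
    Because every output pixel is an exact copy of a source pixel,
    nothing is blurred -- which is why 1280x800 doubled to 2560x1600
    looks crisper than asking an LCD to interpolate an odd ratio.
    """
    out = []
    for row in pixels:
        doubled_row = [p for p in row for _ in (0, 1)]  # duplicate each column
        out.append(doubled_row)
        out.append(list(doubled_row))                   # duplicate the row
    return out

# A 2x2 test image becomes 4x4, every pixel repeated as a 2x2 block.
img = [[1, 2],
       [3, 4]]
print(upscale_2x(img))  # [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```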
Adhin Posted October 30, 2012
Not to mention they're not 'just' rendering out. They use a mix of pre-rendering and hand-painted details. They showed a making-of example of that screenshot: the 3D scene untextured, then textured and rendered out, then how it looks after they did some 2D painting passes to liven it up a bit more. All that moss and some other details just didn't exist in the 3D rendering... they'd have to render in ultra-HD to do that and then shrink it down for the majority of people out there, and that can lose some of what they were going for if they were working off the higher-end version. Either way, it's not as simple as just rendering out a larger image and keeping 2 versions around.
Def Con: kills owls dead
D3xter Posted December 19, 2012 Author
I guess we got a definitive confirmation/answer on this (1280x720 render for low-res screens and 2560x1440 for high-res screens): http://www.kickstarter.com/projects/obsidian/project-eternity/posts/371907
And while I'm somewhat disappointed and was hoping for 4K on the high side, I guess it's understandable. It's just so damned sad with the "4K revolution" finally in full swing. For TVs they finally decided on a "marketing term": http://www.engadget.com/2012/10/20/cea-officially-brands-4k-as-ultra-hd-gets-ready-for-a-flood-of/?m=false
Sony is releasing its first 10 movies in 4K along with their new TV: http://blog.sony.com/sony-4k-tv-content
And on the PC side we'll soon see such wondrous things as 16-17" laptops with 3840x2160 screens at CES in but 3 weeks, and other display vendors will start selling monitors at similar resolutions very soon: http://www.engadget.com/2012/11/27/sharp-pn-k321-4k-igzo-lcd-monitor/
This was a nice program on 4K/8K and the future: http://news.bbc.co.uk/2/hi/programmes/click_online/9774380.stm
LadyCrimson Posted December 19, 2012
Such monitors might be available before too long, but I think the time when the majority of people will afford/have them (and thus create an actual serious need for games to accommodate them) is still a while off.
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
D3xter Posted December 19, 2012 Author
Such monitors might be available before too long, but I think the time when the majority of people will afford/have them (and thus create an actual serious need for games to accommodate them) is still a while off.
Well, I believe differently, since most manufacturers are gearing up to "Hype Mode" as if it's the "next thing" and starting to really push for it. By this time next year we will likely see some affordable models, and people buying (higher-end) laptops won't be able to avoid such resolutions for long either. By the time Project Eternity comes out (realistically some time in late 2014 or even into 2015), I'm pretty sure that kind of resolution won't be seen as such a rarity anymore.
LadyCrimson Posted December 19, 2012
I think such changes don't become the "norm" until you can buy one for about $300 (the monitor price, I mean, not the laptop). Either that, or the economy is in a big boom era, which it's not. So... I'll make a wild prediction that it'll be more like 5 years before it even comes close to being as common as 1080p is now. Most people I know still don't even have 1080p laptops... because they don't want to buy one.
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Falkon Swiftblade Posted December 19, 2012
I was referring to the size of the viewport/screen space, of course, and not the size of the entire area. If you take that Icewind Dale map, for instance, at 3840x2880 for the entirety of an area and try to split it into single "screens", assuming the standard intended size was 640x480, you get 6x the width and 6x the height = 36 screens. Translating that to an intended standard size of 2560x1440 per screen for Project Eternity, while keeping the aspect ratio and size of a screen segment the same for gameplay reasons, the entire map would be 15360x8640 in size. Rendered at a 4K resolution, or 3840x2160 per screen, the entire map would be 23040x12960. It's obviously a bit harder to work at those resolutions, and I wouldn't know what tools or techniques they are using, but I assume they either segment the "map" into many different parts and put it together after those are finished, or they work directly at its full size. To get a sense of the image sizes we are talking about, here, for instance, is an 8000x8000 image of Earth: http://www.flickr.co...in/photostream/ And here's an image at 14331x8061: http://www.velopelot...2--high-res.jpg Also a Mars panorama at 23123x7034: http://photojournal....eg/PIA10230.jpg
for the record, one dev mentioned about a week ago they're aiming for 20k px maps.
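The arithmetic in that post is easy to check. A quick, illustrative script; the 6x6-screen layout is the post's own assumption, not anything Obsidian confirmed:

```python
def map_size(screen_w, screen_h, screens_x=6, screens_y=6):
    """Total map size if an area spans a grid of screen-sized segments."""
    return screen_w * screens_x, screen_h * screens_y

# Icewind Dale: 640x480 screens -> the 3840x2880 area cited above
print(map_size(640, 480))    # (3840, 2880)
# Same 6x6 layout at 2560x1440 per screen
print(map_size(2560, 1440))  # (15360, 8640)
# ...and at 3840x2160 (4K) per screen
print(map_size(3840, 2160))  # (23040, 12960)
```

All three totals match the figures quoted in the post, so the numbers are internally consistent.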
AwesomeOcelot Posted December 19, 2012
Going by the Steam Hardware Survey, a majority of gamers will be able to support 1280x720 in 2014. It's almost 2013, and adoption of 1920x1080/1920x1200 is ~33%. I do not believe that will be the case for 3840x2160 in 2014. 2560x1440 and 2560x1600 might reach ~33% by 2015, but only if mid-range laptop lines adopt those resolutions.
mstark Posted December 19, 2012
For those worrying about this game not supporting 4K resolution, I just posted something in the update thread that should make it fairly clear what I believe Sawyer was trying to say in his post. It's assumptions for now; hoping to get it confirmed. Either way: it'll all depend on the DPI of said 4K screen, but it does sound like the second set of assets that Sawyer mentioned, rendered at twice the resolution, will indeed be tailored to scale well to 24" screens at 4K (~185 DPI).
"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"
D3xter Posted December 19, 2012 Author (edited)
I think such changes don't become the "norm" until you can buy one for about $300 (the monitor price, I mean, not the laptop). Either that, or the economy is in a big boom era, which it's not. So... I'll make a wild prediction that it'll be more like 5 years before it even comes close to being as common as 1080p is now. Most people I know still don't even have 1080p laptops... because they don't want to buy one.
Except that, if everything goes according to plan, most low-resolution displays will have been discarded by then. In 2013 they want to introduce ~4K to the high-end market segment, in 2014 to the middle of the market, and by 2015/2016 there should barely be any screens of normal monitor size on the market selling below 4K, only old ones that haven't been replaced yet. They've been sitting on the sidelines on higher resolutions (being ambivalent about them) for a long while, but now they really want to push, since manufacturers also expect to increase their sales as people adopt the new standards.
For those worrying about this game not supporting 4K resolution, I just posted something in the update thread that should make it fairly clear what I believe Sawyer tried to say in his post. It's assumptions for now; hoping to get it confirmed. Either way: it'll all depend on the DPI of said 4K screen, but it does sound like the second set of assets that Sawyer mentioned, rendered at twice the resolution, will indeed be tailored to scale well to 24" screens at 4K (~185 DPI).
Except you are still not making much sense: "Honestly, looking at resolution really doesn't make much sense. DPI/PPI is what matters." Seriously? Again, "DPI" doesn't exist as a thing with monitors, and resolution is absolutely what matters xD Downscaling the 2560x1440 art set for lower resolutions will always produce acceptable, if not even better, results.
Here, for instance, are examples of using that approach in 3D games, where it leads to a clearer image and significantly fewer aliasing artifacts even on lower-resolution screens, i.e. better IQ: http://forums.anandt...d.php?t=2252699 http://www.pcgamesha...r-29551-817462/ http://forums.guru3d...ad.php?t=346325 That's probably why they will use 2560x1440 to scale down to 1920x1080, 1680x1050 and similar, and will use the 720p art set for 1280x720 and everything below, possibly as well for netbooks and similar machines using 1366x768. What they have announced is that they will have a mode that allows upscaling the 2560x1440 art sets to future resolutions "into space", as he put it, but upscaling will always blur the image and make it look worse on higher-resolution screens, since there isn't enough pixel data to put together a proper (detailed) representation of the high-resolution image. If anything, there are only interpolation filters, which help (or make it even worse) depending on how you look at it. Regarding the UI, they will likely work with something like relative positioning, which will allow it to display properly even on screens of a different aspect ratio, with more or fewer pixels than the intended target. Edited December 19, 2012 by D3xter
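The downscaling argument above (render the art at 2560x1440, average it down for 1080p and below) is supersampling in miniature: several rendered samples contribute to every output pixel. A toy 2x2 box filter on a grayscale image stored as nested lists, purely illustrative and not the pipeline Obsidian actually uses:

```python
def downscale_2x(pixels):
    """Box-filter 2x downscale: average each 2x2 block into one pixel.

    `pixels` is a list of rows of grayscale values; width and height
    must be even. Averaging is what folds extra detail into smooth
    edges instead of aliasing.
    """
    out = []
    for y in range(0, len(pixels), 2):
        row = []
        for x in range(0, len(pixels[y]), 2):
            block = (pixels[y][x] + pixels[y][x + 1] +
                     pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(block // 4)
        out.append(row)
    return out

# A 4x4 checkerboard of 0s and 100s averages to a uniform 50: the
# high-frequency detail becomes smooth intermediate values rather
# than the harsh stair-stepping a naive point-sample would give.
checker = [[0, 100, 0, 100],
           [100, 0, 100, 0],
           [0, 100, 0, 100],
           [100, 0, 100, 0]]
print(downscale_2x(checker))  # [[50, 50], [50, 50]]
```

Upscaling has no such luck: there is no extra data to average, which is why the post expects upscaled 1440p art to blur on future 4K screens.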
LadyCrimson Posted December 19, 2012
They've been sitting on the sidelines on higher resolutions (being ambivalent to them) for a long while, but now they really want to do a "push" for that, since manufacturers also expect to increase their sales as people adopt the new standards.
That's fine, but it all depends on what the consumer ends up doing, doesn't it? They can plan and push all they want, but it doesn't mean people will buy. Think of all the people who refused to switch from XP to Vista (or even Win7) forever and ever and ever. I know that's not the same, but the point is... you might be surprised how many consumers are tired of buying (when they still have something that works "fine"), especially when their wallets are pinched. Anyway, I'm not against it happening faster or anything. But those tech journals/articles/conventions/innovators/companies etc. can predict and push all the time... it doesn't mean it'll happen that fast in real-world application. You'll have some tech geeks who buy fast, but the "masses" are typically a lot slower to make changes like this. That's all I was saying. It's not like buying the latest iPad or iPhone for cool points.
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
D3xter Posted April 23, 2013 Author (edited)
First affordable UHD/4K display, at 50", being sold for between $1200 and $1500: http://hdguru.com/first-look-at-the-seiki-se50uy04-affordable-ultra-hdtv/
One of the first tests of the display, by PC Perspective, is done using games like Battlefield 3, Crysis 3, Skyrim and Tomb Raider, which can already render at that resolution: http://www.amazon.com/Seiki-Digital-SE50UY04-50-Inch-120Hz/dp/B00BXF7I9M/ http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=7674736
If the Chinese are also going to get into the PC monitor market this aggressively, with even lower prices, a mass-market push for higher resolution might happen even earlier than I thought. Edited April 23, 2013 by D3xter
AwesomeOcelot Posted April 23, 2013 (edited)
It's a TV; it would make a terrible PC monitor, as it's only 30Hz. If I was spending that much, 120Hz would be more important to me. Also, in terms of PPI it's not going to be different from a 24" monitor at 1080p; higher resolution in PC monitors doesn't mean 4K like in TVs, it means higher pixel density than what we have now. For a PC monitor, a 27" 1440p 60Hz display for half the price would make much more sense. It's going to take a while for mid-range GPUs to render 4K at 60 frames, so there's not enough demand for 4K PC monitors to be made. Edited April 23, 2013 by AwesomeOcelot
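The PPI point checks out: pixel density is the diagonal resolution in pixels divided by the diagonal size in inches. A quick illustrative calculation, using the sizes and resolutions mentioned in the posts above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The 50" Seiki at 4K vs. a common 24" desktop monitor at 1080p:
print(round(ppi(3840, 2160, 50)))  # 88 PPI
print(round(ppi(1920, 1080, 24)))  # 92 PPI
# ...while a 24" monitor at 4K would roughly double that:
print(round(ppi(3840, 2160, 24)))  # 184 PPI
```

So the 50" 4K TV is actually slightly *less* dense than a 24" 1080p monitor, which is AwesomeOcelot's point; the ~184 figure also matches the "~185 DPI" mstark cited earlier for a 24" screen at 4K.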
D3xter Posted April 24, 2013 Author (edited)
It's a TV, it would make a terrible PC monitor, it's only 30hz, if I was spending that much 120hz would be more important to me. Also in terms of PPI it's not going to be different to a 24" monitor at 1080p, higher resolution in PC monitors doesn't mean 4K like in TV, it means higher pixel density than what we have now. For a PC monitor a 27" 1440p 60hz for half the price would make much more sense.
Well, obviously it's a TV; it's marketed as a TV and isn't meant as a monitor per se. I was just saying they were testing it with games, since gaming provides some of the largest amounts of content out there for higher-resolution displays, aside from the available test clips, a bunch of YouTube videos and some test broadcasts right now. I also use a TV as a computer monitor from time to time to play games with a controller (latest being Tomb Raider and Brütal Legend), so I wouldn't presume too much, though not for keyboard-and-mouse games like Project Eternity is obviously going to be. UHD content is at about the stage normal HD content was at in ~2004/2005, with lots of demos, TVs being shown off and the first few movies being released. The difference this time around is that the PC monitor market seems to be behind the curve while the mobile sector is a mile ahead, and I believe there'll be a quicker adoption/transition to the new standards over time. And it is only 30Hz at 3840x2160 because it is a first barebones model designed to be as cheap as possible and HDMI 1.4 isn't specified for that kind of bandwidth, not because of any inherent hardware limitation of the TV.
HDMI 2.0 should allow for 60Hz and will be standard in other models by the end of this year: http://en.wikipedia.org/wiki/HDMI#Version_2.0 Other solutions in the meantime, which some of the more expensive displays are using, involve DisplayPort 1.2: http://en.wikipedia.org/wiki/DisplayPort#1.2 http://www.techspot.com/news/51519-vesa-updates-displayport-dual-mode-pushes-4k-uhd-over-hdmi.html
It's going to take a while for mid range GPUs to render 4K at 60 frames, thus there's not enough demand for 4K PC monitors to be made.
You'd be surprised. Additional PC monitors, like the already-mentioned Sharp and several ViewSonic models, will be coming out throughout 2013 with the possibility of Dual-Link DVI or DisplayPort 1.2: http://www.brightsideofnews.com/news/2013/3/4/the-state-of-4k-and-4k-gaming-early-2013.aspx but they would obviously still cost quite a sum and are only meant for the high-end market. I was just theorizing about the possibility of 4K PC monitors from Chinese manufacturers like Seiki coming out later this year at competitive pricing of possibly ~$500-600, pushing for even earlier market adoption in the mid- and low-end segments, not only for TVs but for PC monitors too. Growing content distribution methods, some already available and more launching later this year and early 2014, especially the PS4 and the first official broadcasts, will provide for increased demand.
http://www.theverge.com/2013/1/7/3848924/sony-launching-4k-video-distribution-service http://www.theverge.com/2013/2/28/4040932/sony-4k-movie-service-will-work-with-ps4-require-100gb-plus-downloads http://www.gizmag.com/red-odemax-4k-distribution-network-redray/25267/ http://www.reuters.com/article/2013/01/27/us-japan-hdtv-idUSBRE90Q02520130127
They'll likely come along with the increasing implementations of H.265/HEVC for video: http://en.wikipedia.org/wiki/High_Efficiency_Video_Coding#Implementations which is supposed to improve image quality and roughly double the compression ratio compared with H.264/AVC, replacing it within the next few years. Edited April 24, 2013 by D3xter
D3xter Posted July 28, 2013 Author (edited)
Here we go again: another Chinese company (TCL) is offering a 50" 4K TV for $999: http://www.engadget.com/2013/07/25/tcl-4k-50-inch-999/
Seiki is also launching a 39" 4K TV at $699: http://www.engadget.com/2013/06/25/seiki-launches-39-inch-4k-tv-for-699/ and is expecting a 65" model by fall.
According to this video (about 11 minutes in), the panel of the TV can apparently also manage up to 120Hz @ 4K, but the electronics board (and obviously the connection used) can't manage it. I hope they'll also enter the PC monitor market with a few models later this year, but the moment when a lot of new models will sprout is likely the end of the year, when HDMI 2.0 is finally out and can be used for the upcoming 4K TVs and monitors: http://www.audioholics.com/education/display-formats-technology/hdmi-2.0-specification Edited July 28, 2013 by D3xter
alanschu Posted July 28, 2013
Given that I'm already pretty indifferent to the difference between 720p and 1080p for basic television, I'm curious how much I'll care. On a computer, my love of resolution is typically associated with the amount of desktop space I get out of it. However, that also requires a large enough monitor to prevent things from getting too small. I actually stayed at 1024x768, even though my monitor could support 1600x1200, for a loooooooong time, until I finally got a 19" CRT. I was happy with my 19" CRT for a long time, until work gave me a 24" flatscreen, at which point I had to upgrade. Though even at 1024x768, aliasing was rarely something that bothered me. It might now (now that I am so used to 1920x1200), but I actually stuck with my CRT for as long as I did because I wanted the flexible resolutions for gaming (since frame rate was significantly more important to me), and I *do* notice obvious image degradation when playing at an LCD's non-native resolution. I imagine non-native resolution becomes less of an issue once the pixels get small enough, but I wouldn't be surprised if I experience some level of pushback, personally, until I have hardware that can support those resolutions at high frame rates and detail levels. Since I just bought a new computer, this is likely at least 3-4 years out for me.
Sensuki Posted July 28, 2013 (edited)
I think the way they're doing it is fine. As long as they don't lose the art assets they can always go back and release a UHD maps pack at some point in the future or whatever. Edited July 28, 2013 by Sensuki
ravenshrike Posted July 29, 2013
Considering the best 4K monitors on the market have yet to dip below 8ms grey-to-grey response time, and none of them are over 60Hz and probably won't be for the next 10 years, I think there's still time. And anybody who says the human eye can't distinguish between 60fps and 80+ is woefully ill-informed. The brain can sense changes between 20 and 300fps depending on how much 'processing power' is available. Now, for low- to mid-intensity gaming, anything over 60fps isn't needed because your brain doesn't need to tell the difference. However, in any game that relies on 'twitch gaming', such as QIIIA or high-level Tetris, 100+fps is distinctly beneficial.
"You know, there's more to being an evil despot than getting cake whenever you want it" "If that's what you think, you're DOING IT WRONG."
Elvin Rath Posted August 2, 2013
There's still time, sure. But if their backgrounds are in larger resolutions... why not? It would be great if they thought about this right now. Yeah, it may take up to 15 years before most gamers have 4K/8K screens, but hey, it's been 15 years since Baldur's Gate's release and there are still people playing it. And original source assets are usually lost after some time, so I think it would be great to do it now. In fact I hope it takes less than 10 years; I'm a bit tired of 1080p on 4.x-inch phone screens while most monitors are stuck below 100 PPI.
D3xter Posted September 6, 2013 Author
HDMI 2.0 has finally been officially revealed, in time for IFA 2013: http://www.engadget.com/2013/09/04/hdmi-2-0-official-4k-60fps-32-channel-audio/
"Only just after it leaked out, the folks at HDMI Licensing are announcing HDMI 2.0 officially. Arriving just in time for the wide rollout of a new generation of Ultra HDTVs, it adds a few key capabilities to the connection standard. With a bandwidth capacity of up to 18Gbps, it has enough room to carry 3,840 x 2,160 resolution video at up to 60fps. It also has support for up to 32 audio channels, "dynamic auto lipsync" and additional CEC extensions. The connector itself is unchanged, which is good for backwards compatibility but may disappoint anyone hoping for something sturdier to support all of those suddenly-popular dongles. The cables won't change either, as the group claims current high-speed Category 2 wires can handle the increased bandwidth. Some companies have suggested upgrade paths for their UHDTVs already on the market -- hopefully we'll find out more about those plans this week at IFA 2013."
Many other hardware manufacturers were ready with showings of new 4K TVs and higher-resolution PC hardware: http://recombu.com/digital/news/4k--ifa-2013-samsung-sony-lg-panasonic-philips-and-toshiba-show-off_M12097.html http://www.pcadvisor.co.uk/new-product/tablets/3466930/panasonic-4k-tablet-specs-release-date/
Cool commercial.
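A rough sanity check of the quoted 18Gbps figure: uncompressed 4K60 at 24 bits per pixel needs about 12Gbps of pixel data, and HDMI's 8b/10b TMDS line coding adds 25% on the wire. A back-of-the-envelope sketch (blanking intervals, which also consume link bandwidth, are deliberately ignored, so a real link needs somewhat more than this):

```python
def hdmi_payload_gbps(width, height, fps, bits_per_pixel=24,
                      encoding_overhead=10 / 8):
    """Approximate link rate for uncompressed video.

    bits_per_pixel=24 assumes 8-bit RGB; encoding_overhead=10/8
    models 8b/10b TMDS line coding. Blanking is ignored.
    """
    raw = width * height * fps * bits_per_pixel  # pixel bits per second
    return raw * encoding_overhead / 1e9         # Gbps on the wire

gbps = hdmi_payload_gbps(3840, 2160, 60)
print(f"{gbps:.1f} Gbps")  # ~14.9 Gbps, under HDMI 2.0's 18 Gbps ceiling
```

So 4K at 60fps with 8-bit color fits the new spec with headroom to spare, while HDMI 1.4's roughly half-sized budget explains the 30Hz limit of the early Seiki sets discussed above.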
samm Posted September 6, 2013
And again, the concept of HDMI is complete crap, even in its 2.0 version. Supporting 32 audio streams but only 60Hz @ 4K? Probably to save bandwidth/processing power for HDCP... and for idiotic new features like auto lipsync; wtf do these things technically have to do with a *display connection* standard? Nothing. And all this comes with license fees. I'd say the consortium needs more techies and fewer marketing and lobby persons. Improving DP further would be the better route to take.
Citizen of a country with a racist, hypocritical majority
mstark Posted September 7, 2013 (edited)
And again, the concept of HDMI is complete crap even in its 2.0 version. Supporting 32 audio streams but only 60Hz @4k? [...] Improving DP further would be the better route to take.
Well spoken. Unfortunately, TVs and surround sound come first, monitors second. Edited September 7, 2013 by mstark
"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"