

Posted

I can see why people are stressing this when looking back at the old games we all love, stuck at 640x480 and so on, especially given the tech coming out this year, and VR in particular. There is still talk of the Oculus Rift having a 4K screen for each eye, which sounds ridiculous but could honestly be the new standard in just a couple of years. So now think about having to render two 4K screens (potentially one for each eye) for the 3D aspect.
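For a sense of scale, here is a quick back-of-the-envelope sketch of the pixel throughput such a headset would demand (all figures are illustrative assumptions, not specs of any announced hardware):

```python
# Rough pixel-throughput estimate for a hypothetical 4K-per-eye headset.
EYE_W, EYE_H = 3840, 2160   # one 4K panel per eye (assumed)
EYES = 2
FPS = 60                    # a common target frame rate (assumed)

pixels_per_frame = EYE_W * EYE_H * EYES
pixels_per_second = pixels_per_frame * FPS

print(f"{pixels_per_frame:,} pixels per frame")    # 16,588,800
print(f"{pixels_per_second:,} pixels per second")  # 995,328,000
```

That is roughly a billion shaded pixels per second before any supersampling, which gives an idea of why people doubt it will be mainstream soon.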

 

I think this will all be incredibly cool, and it will be a shame that we likely won't be able to play these classic games in this way.

Posted

Virtual reality headsets aren't well suited in the first place to games that don't use a first-person perspective, since their whole point is to give the player a strong sense of spatial immersion (the feeling that you are in the game). In terms of immersion, games like PoE focus more on the tactical (experienced when performing skilled actions; players feel "in the zone" while perfecting moves that lead to success), strategic (more cerebral, and associated with mental challenge; chess players experience strategic immersion when choosing the correct move from a broad array of possibilities) and narrative (occurs when players become invested in a story, similar to what is experienced while reading a book or watching a movie) aspects of immersion.

Posted

I can see why people are stressing this when looking back at the old games we all love, stuck at 640x480 and so on, especially given the tech coming out this year, and VR in particular. There is still talk of the Oculus Rift having a 4K screen for each eye, which sounds ridiculous but could honestly be the new standard in just a couple of years. So now think about having to render two 4K screens (potentially one for each eye) for the 3D aspect.

 

I think this will all be incredibly cool, and it will be a shame that we likely won't be able to play these classic games in this way.

 

a) That's highly optimistic given the state of GPU performance and of the Oculus Rift itself: the 1080p version isn't even out yet. It's possible the highest-end SLI/CrossFire GPU setups could render games in 3D at 4K in two years, but it's not going to be the new standard.

 

b) This is a cavalier-oblique-perspective game with 2D backgrounds. It's not a shame that there won't be VR support; it's not something that would benefit from it at all, and given the 2D backgrounds it's infeasible. They have five maps for the background, they're already large, and they'd have to double that to cover each eye for 3D.

 

c) I predict VR sets will support viewing 2D games; they'll just look bad, and it won't make sense.

Posted

I get that it's a 2D game, but that doesn't mean that displaying it in 3D wouldn't at least add some depth to the image. Not so much for being able to look around in the environment or anything. Think of what Avatar did in 3D in theatres: it wasn't about things popping out at you so much as it was about the depth of the image (imo).

 

And I wouldn't be so sure about VR not becoming the norm in just a few short years; look at how fast smartphones and tablets took off.

 

I think VR and 4K are good reasons for Intel, Nvidia and AMD to really start pushing tech again. Thanks to all the multiplatform support in games over the past 7+ years, it's been hard to find a reason to upgrade our systems like in the old days. Hell, I think I've had my CPU/motherboard/RAM for almost 7 years now, which is crazy for me. But when you can still play everything at max settings, why upgrade?

Posted

I get that it's a 2D game, but that doesn't mean that displaying it in 3D wouldn't at least add some depth to the image. Not so much for being able to look around in the environment or anything. Think of what Avatar did in 3D in theatres: it wasn't about things popping out at you so much as it was about the depth of the image (imo).

 

And I wouldn't be so sure about VR not becoming the norm in just a few short years; look at how fast smartphones and tablets took off.

 

I think VR and 4K are good reasons for Intel, Nvidia and AMD to really start pushing tech again. Thanks to all the multiplatform support in games over the past 7+ years, it's been hard to find a reason to upgrade our systems like in the old days. Hell, I think I've had my CPU/motherboard/RAM for almost 7 years now, which is crazy for me. But when you can still play everything at max settings, why upgrade?

a) You weren't talking about 3D, you were talking about VR. There are 3D screens, and you can add depth to games like StarCraft 2 with 3D, but it takes more effort and resources with a game like Pillars of Eternity.

 

b) I didn't even think you were talking about VR becoming the norm in two years; I thought you meant 4K would become the norm in VR, which is an even more far-fetched prediction.

 

c) Intel, Nvidia, and AMD never stopped progressing. Year on year, generation on generation, the growth in FLOPS and bandwidth never stalled, and there have also been some great advances in shader technology. You can't play games from 2011 at 1080p over 60 fps on max settings with a 7-year-old PC, let alone games from 2013 at 1440p and above. Just under 7 years ago the top GPU was the GeForce 8800 Ultra; by 2012 a budget GPU, the lowest non-OEM Radeon, had almost double its floating-point performance.

Posted (edited)

I get that it's a 2D game, but that doesn't mean that displaying it in 3D wouldn't at least add some depth to the image.

but that's exactly what it means in this case.

 

Whichever axonometric projection P:E will be using (and the IE games and any other '2D' top-down games have used) can be imagined as looking at objects from infinitely far away: there is no difference in the angle at which each of your eyes looks at the scene, so there is no difference in the picture that each of your eyes sees, and thus there is nothing to create the illusion of depth.
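A toy calculation makes this concrete (a sketch of the projection geometry, not P:E's actual renderer): under a parallel (orthographic) projection, the disparity between the two eyes' images is the same constant for every point regardless of depth, i.e. just a uniform shift of the whole picture, while under a perspective projection nearer points show larger disparity, which is exactly the cue that creates depth.

```python
# Stereo disparity under orthographic vs perspective projection.
EYE_SEPARATION = 0.06  # metres, roughly human interpupillary distance

def ortho_x(px, eye_x):
    # Parallel projection: depth is ignored entirely.
    return px - eye_x

def persp_x(px, pz, eye_x, focal=1.0):
    # Simple pinhole projection: image x depends on depth pz.
    return focal * (px - eye_x) / pz

points = [(0.5, 2.0), (0.5, 10.0)]  # (x, depth) of a near and a far object

for px, pz in points:
    left, right = -EYE_SEPARATION / 2, EYE_SEPARATION / 2
    d_ortho = ortho_x(px, left) - ortho_x(px, right)
    d_persp = persp_x(px, pz, left) - persp_x(px, pz, right)
    print(f"depth {pz}: ortho disparity {d_ortho:.4f}, perspective disparity {d_persp:.4f}")
```

Orthographic disparity comes out as 0.06 for every point, near or far (a uniform image shift carrying no depth information), while perspective disparity shrinks from 0.03 at depth 2 to 0.006 at depth 10.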

Edited by sesobebo
Posted

 

a) You weren't talking about 3D, you were talking about VR. There are 3D screens, and you can add depth to games like StarCraft 2 with 3D, but it takes more effort and resources with a game like Pillars of Eternity.

 

b) I didn't even think you were talking about VR becoming the norm in two years; I thought you meant 4K would become the norm in VR, which is an even more far-fetched prediction.

 

c) Intel, Nvidia, and AMD never stopped progressing. Year on year, generation on generation, the growth in FLOPS and bandwidth never stalled, and there have also been some great advances in shader technology. You can't play games from 2011 at 1080p over 60 fps on max settings with a 7-year-old PC, let alone games from 2013 at 1440p and above. Just under 7 years ago the top GPU was the GeForce 8800 Ultra; by 2012 a budget GPU, the lowest non-OEM Radeon, had almost double its floating-point performance.

 

 

a) VR offers some of the best 3D available; that's why I'm using it for this discussion, and specifically because of the possibility of having a 4K screen per eye, which is supposedly something OculusVR has been working on.

 

b) If something really takes off (e.g. tablets and smartphones), it doesn't take long for everyone to get one, especially if it's cheap. They said they wanted to release the base version for about $300. I am referring to the norm for gamers, not all PC users, of course. And, of course, this is pure speculation.

 

c) I have upgraded GPUs and moved to an SSD in the past 7-ish years, but that's it. And I can still run everything (that I've played) fine at 1080p on ultra settings. I haven't played the latest Crysis, but I did play BioShock Infinite recently with no problems. I haven't run benchmarks or anything, because I didn't visually notice any frame-rate drops. I was just suggesting that the leaps haven't been nearly as big recently as they were years ago when Moore's law still applied, and that I mainly blame games with multiplatform support for holding back the PC.

 

Intel Core 2 Extreme QX9650 @3.0 GHz

8 GB RAM @ 800 MHz

EVGA 780i 3x SLI mobo

 

It might be a little over 6 years, but it's been a long time, and that stuff was pretty cutting-edge when it came out.

Posted

 

I get that it's a 2D game, but that doesn't mean that displaying it in 3D wouldn't at least add some depth to the image.

but that's exactly what it means in this case.

 

Whichever axonometric projection P:E will be using (and the IE games and any other '2D' top-down games have used) can be imagined as looking at objects from infinitely far away: there is no difference in the angle at which each of your eyes looks at the scene, so there is no difference in the picture that each of your eyes sees, and thus there is nothing to create the illusion of depth.

 

 

Yeah, that makes sense. I was thinking that if they can make Civ 5 display in 3D, it would work here too, but the way everything is rendered is quite different, as you've pointed out. They'd have to render a second background at a slightly different angle. Of course, that'd be cool too. :biggrin:

Posted

a) VR offers some of the best 3D available; that's why I'm using it for this discussion, and specifically because of the possibility of having a 4K screen per eye, which is supposedly something OculusVR has been working on.

 

b) If something really takes off (e.g. tablets and smartphones), it doesn't take long for everyone to get one, especially if it's cheap. They said they wanted to release the base version for about $300. I am referring to the norm for gamers, not all PC users, of course. And, of course, this is pure speculation.

 

c) I have upgraded GPUs and moved to an SSD in the past 7-ish years, but that's it. And I can still run everything (that I've played) fine at 1080p on ultra settings. I haven't played the latest Crysis, but I did play BioShock Infinite recently with no problems. I haven't run benchmarks or anything, because I didn't visually notice any frame-rate drops. I was just suggesting that the leaps haven't been nearly as big recently as they were years ago when Moore's law still applied, and that I mainly blame games with multiplatform support for holding back the PC.

 

Intel Core 2 Extreme QX9650 @3.0 GHz

8 GB RAM @ 800 MHz

EVGA 780i 3x SLI mobo

 

It might be a little over 6 years, but it's been a long time, and that stuff was pretty cutting-edge when it came out.

a) VR doesn't offer the best 3D around; for third-person-perspective games, e.g. Civ 5 or StarCraft 2, I'd argue it's worse than a 3D screen. Oculus VR has software demos; they haven't been working on a 4K Oculus Rift, given that they have yet to deliver a consumer version, or even 1080p dev kits.

 

b) $300 for a 1080p version in late 2014 or 2015 at the earliest. They're not going to release a 4K version in two years, and there wouldn't be a market for it even if they could: computers couldn't render 4K twice for the high-fidelity games of 2016. Tablets and smartphones replaced existing devices, laptops and older phones; VR is not a screen replacement, it's a niche peripheral that gives a unique experience.

 

c) Your claim was a lie; your computer isn't even 7 years old, it's 6. The most important component for rendering games is the GPU: you can't upgrade it and still claim your computer is 7 years old, since it's effectively as old as the GPU, and the CPU isn't that important for performance in most games. Even granting all that, you may still not get a stable 60 fps in games from 2013, even if your GPU is quadruple the speed of the top GPU from 7 years ago.

 

The lack of "leaps" you claim is one of perception: you don't appreciate the equally large jumps in polygon counts, for instance. Blame software, the poor games that don't push PC boundaries or are poorly optimised, and the lack of API development, not a lack of hardware advancement. Multi-platform titles are partly to blame, along with the existence of consoles.

Posted

I haven't personally tried the Rift yet, so I really can't say for sure, but having a dedicated screen for each eye seems like it would make for the best 3D you could get (imo), at least better than polarized or active-shutter glasses (which I have tried). Since I haven't tried it personally, though, it's just optimism on my part.

 

While looking for a link to the 4K development article, I found another one stating that the founder was misquoted in those articles. So you are correct: there is currently no 4K Rift in development.

 

And yeah, I described the advancement issue somewhat wrongly. As you've stated, it's not so much the chip makers' fault as it is the APIs' and software's limitations. But I do feel that software hasn't been pushing the hardware to make larger leaps and bounds as much as it could, due to the consoles. Luckily, games like Star Citizen are (hopefully) helping to change that.

Posted

 

Having said that, does Win 8 already handle different resolutions with grace?

I gather the "square blocks" new interface does, but how about the "normal" desktop? And programs?

Desktop mode has "limited" scaling, the same scaling that Windows 7 employs. You can scale elements, but it causes fonts and graphics in most/all apps to turn blurry because they don't handle the scaling well. This is similar to what happened on the retina MBP when it was released: it simply scaled everything by 2x and used higher-resolution assets when available, e.g. when apps included them. I believe there's currently no way for x86 apps to detect what scaling Windows has been set to and load/scale assets accordingly; 8.1 should introduce this. For example, even though it's vector-based, a 12px font won't magically be turned into a 24px font; it'll still be a 12px font scaled to twice the size unless an app supports scaling and intelligently detects this. It gets even more complicated once you involve web browsers and their scaling behaviour.
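The blurriness described above can be sketched in a few lines (a toy illustration of nearest-neighbour pixel duplication, not how Windows' scaler is actually implemented):

```python
# Naive nearest-neighbour upscaling: existing pixels are duplicated rather than
# re-rendered, which is why a 12px glyph scaled to 2x is not a crisp 24px glyph.
def upscale_row(row, factor):
    return [row[int(i / factor)] for i in range(int(len(row) * factor))]

glyph_row = [0, 1, 1, 0]           # a 4px-wide slice of a glyph bitmap
print(upscale_row(glyph_row, 2))   # -> [0, 0, 1, 1, 1, 1, 0, 0]
```

A DPI-aware app would instead re-rasterise the glyph from its vector outline at the larger size, producing new detail rather than fatter pixels.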

 

Windows 8.1 is scheduled to include "enhanced DPI scaling", but few details have been released. It's probably something very similar to what the current retina MBP employs, though based on the information released so far it'll allow more user customization.

 

 

..it's not /that/ complicated. Generally we're talking about making sure background elements and foreground elements aren't scaled asynchronously, so you don't get one hilariously ugly shader effect next to very pretty water and so on, and that the UI and text are rendered in separate layers from the rest of the graphics context.

 

That's usually what kills older games, frankly, much more than simply rendering the entire thing at half the resolution of the screen you have. When you have complex composites with lots of drawing and differently sized text, scaling up is always horrible. Whether it's the method Apple uses for the MBP or the semi-composite manager system in Windows, once you start to scale elements that are supposed to be rendered at the screen's native resolution, things become hard to look at and they break.

 

Avoid that, and you can still play the game on a 4k screen, no problem.

 

(I do wholeheartedly support the idea of having a render pass 20 times as large as the normal one as optional downloadable material, though :D)


  • 2 weeks later...
Posted (edited)

a) VR doesn't offer the best 3D around; for third-person-perspective games, e.g. Civ 5 or StarCraft 2, I'd argue it's worse than a 3D screen. Oculus VR has software demos; they haven't been working on a 4K Oculus Rift, given that they have yet to deliver a consumer version, or even 1080p dev kits.

b) $300 for a 1080p version in late 2014 or 2015 at the earliest. They're not going to release a 4K version in two years, and there wouldn't be a market for it even if they could: computers couldn't render 4K twice for the high-fidelity games of 2016. Tablets and smartphones replaced existing devices, laptops and older phones; VR is not a screen replacement, it's a niche peripheral that gives a unique experience.

 

c) Your claim was a lie; your computer isn't even 7 years old, it's 6. The most important component for rendering games is the GPU: you can't upgrade it and still claim your computer is 7 years old, since it's effectively as old as the GPU, and the CPU isn't that important for performance in most games. Even granting all that, you may still not get a stable 60 fps in games from 2013, even if your GPU is quadruple the speed of the top GPU from 7 years ago.

 

The lack of "leaps" you claim is one of perception: you don't appreciate the equally large jumps in polygon counts, for instance. Blame software, the poor games that don't push PC boundaries or are poorly optimised, and the lack of API development, not a lack of hardware advancement. Multi-platform titles are partly to blame, along with the existence of consoles.

a) VR does offer the best 3D around; it seems painfully obvious that you haven't tried any of the Oculus Rift prototypes yet. It's true stereoscopic vision, with the scene rendered twice from the perspective of your left and right eyes, and it's probably one of the most "natural" ways to do 3D.

 

He is wrong though: it's not "4K twice", it's 4K split in two. The first DevKit has a 1280x800 screen, rendering 640x800 for each eye (minus the black space around your field of view that your eyes won't be able to perceive through the lenses): http://i0.wp.com/www.roadtovr.com/wp-content/uploads/2013/07/crashland-oculus-rift-demo.jpg

The second DevKit will have a 1920x1080 OLED panel and thus render 960x1080 for each eye, and they've already announced that the consumer version will have a higher resolution, likely the 2560x1440 panel Samsung announced it is working on. Samsung, LG and Japan Display are going to release 1440p mobile-phone screens throughout 2014: http://www.digitaltrends.com/mobile/smartphone-pixel-screen-tech-guide/

For the Rift that would be 1280x1440 for each eye. It is a little more complicated than that, though: rendering the same scene, with all its vertices, twice from two distinct points of view (even at a lower resolution for each) will still put quite a strain on GPUs, so it's not as simple as just splitting a 1440p screen in two.
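The per-eye figures above are just the panel width halved; as a quick sanity check (note the 2560x1440 consumer panel is this post's speculation, not a confirmed spec):

```python
# Per-eye render resolution for a single panel split down the middle.
panels = {
    "DK1": (1280, 800),
    "DK2": (1920, 1080),
    "rumoured consumer": (2560, 1440),  # speculation, not confirmed
}

per_eye = {name: (w // 2, h) for name, (w, h) in panels.items()}
for name, (ew, eh) in per_eye.items():
    print(f"{name}: {ew}x{eh} per eye")
```

This matches the 640x800, 960x1080 and 1280x1440 figures quoted in the post.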

 

There are ways to lower the rendering requirements, for instance by using the depth buffer to "fake" the 3D, but the results will be ****tier than rendering the scene out twice: http://www.vorpx.com/more-headtracking-z-buffer-vs-geometry-3d/
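To illustrate the idea, here is a deliberately tiny, hypothetical sketch of z-buffer reprojection on one scanline (not vorpX's actual algorithm): each pixel is shifted by a disparity inversely proportional to its depth, which is cheap but leaves holes where nothing maps, one reason it looks worse than truly rendering the scene twice.

```python
# Tiny sketch of depth-buffer reprojection ("fake 3D"): shift each pixel
# horizontally by a disparity inversely proportional to its depth, instead
# of rendering the whole scene a second time.
def reproject_row(colors, depths, eye_offset, focal=1.0):
    """Build one eye's view of a single scanline from color + depth buffers."""
    out = [None] * len(colors)
    # Paint far pixels first so nearer pixels correctly occlude them.
    for x in sorted(range(len(colors)), key=lambda i: -depths[i]):
        disparity = int(round(focal * eye_offset / depths[x]))  # near shifts more
        nx = x + disparity
        if 0 <= nx < len(out):
            out[nx] = colors[x]
    return out

colors = ["A", "B", "C", "D"]
depths = [1.0, 1.0, 2.0, 2.0]          # A, B are near; C, D are far
print(reproject_row(colors, depths, eye_offset=1.0))
# -> [None, 'A', 'B', 'D']  (a hole appears at x=0, and B now occludes C)
```

Real implementations have to fill those holes with guessed data, which is where the artefacts come from.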

 

b) The 1080p DK2 has already been unveiled at the last GDC and is due to ship in July; the final product (likely with a 1440p screen) will most likely show up in Q4 2014 or Q1 2015.

Of course, these were the plans before ****ing Facebook bought the company for $2 billion... http://www.engadget.com/2014/03/25/facebook-oculus-vr-2-billion/

 

VR is going to come really fast. I think he is a little optimistic with two years, since the consumer version of the Rift is still about a year out, but in 4-5 years it'll be largely established. Don't forget Sony is also doing VR: http://www.engadget.com/2014/03/18/playstation-virtual-reality/ and there will be many more big players after that buyout. The wonderful thing about it is that it just takes experiencing it to make one a believer, and the naysayers won't have much to stand on once it hits the mass market.

 

It doesn't make any sense to talk about VR in the case of Project: Eternity, though, since whoever does so probably doesn't understand what "pre-rendered" means.

You could render something like Diablo III or Torchlight II for VR and make the player look down from above, like they've demonstrated with "Couch Knights", for instance; but a pre-rendered background is a still image that wouldn't change at all when someone shifts perspective, looks left or right, or leans back or forward, etc. It's impossible to do something like that in VR unless you do it as a gimmick, like the devs of "City Quest".

 

And the next generation of graphics cards will be able to render 4K just fine; the current one already manages it for all but the most resource-hungry games: http://www.eurogamer.net/articles/digitalfoundry-nvidia-geforce-gtx-780-ti-review

 

c) His claim wasn't exactly a lie, just imprecise. Unfortunately, not much has changed in the past few years that would require an upgrade to the CPU, mainboard or RAM, to a large part probably due to the popularity of gaming consoles.

We can only hope that Intel reverses course posthaste, like they've said they will: http://techreport.com/review/26189/intel-to-renew-commitment-to-desktop-pcs-with-a-slew-of-new-cpus

 

The China-made 4K monitors I was talking about have finally arrived, by the way. They're all sporting the same TN panel, though, with different features and backlights.

 

The ones announced so far, some already purchasable:

Samsung UD590 $699 @60Hz: http://www.pcworld.com/article/2137477/samsung-lowballs-the-4k-competition-with-700-display.html

Lenovo Thinkvision Pro2840m $799 @60Hz: http://www.cnet.com/products/lenovo-thinkvision-pro2840m-4k-display/

Dell P2815Q $699 @30Hz: http://www.forbes.com/sites/jasonevangelho/2014/01/07/dell-wasnt-joking-about-that-28-inch-sub-1000-4k-monitor-its-only-699/

Asus PB287Q $799 @60Hz: http://www.engadget.com/2014/01/06/asus-28-inch-4k-display/

Philips 288P6 $1199: http://www.digitalavmagazine.com/en/2014/01/10/Philips-288p6-monitor-at-ces-2014-ultraclear-panel-28-4-k/

 

The sad thing about all of this is that, since Project: Eternity is still about a year away at best, it'll only take another 1-2 years after release until the rendering resolution is considered "old" and "insufficient".

Edited by D3xter
Posted (edited)

VR does offer the best 3D around, it seems painfully obvious that you haven't tried any of the Oculus Rift prototypes yet.

Are there any "top-down" third-person-perspective demos for the prototype? There's probably a reason for that. Also, it requires you to wear headgear.

 

You could render something like Diablo III or Torchlight II for VR and make the player look from above like they've demonstrated for instance for "Couch Knights":

I doubt it, and that game does not look fun to play that way.

 

And the next generation of graphics cards will be able to render 4K just fine, the current one already manages for the largest amount of games but the very resource hungry ones:

I was specifically responding to the claim of rendering 4K TWICE. Also, current-gen GPUs render 4K at sub-60 fps according to your own source, and they're quite expensive GPUs.

Edited by AwesomeOcelot
  • 1 month later...
Posted

I would personally like to be able to see more of the area when using higher-resolution monitors. It's a pet peeve of mine when I use higher resolutions but everything stays the same size. But having the option to do both is great. Maybe slight DPI scaling of 1.10-1.25x while playing at 4K or 8K resolution would be awesome.

