Everything posted by AstralWanderer

  1. In some ways, you've answered your own question - the market is barely there and not worth the risk (especially in the current economic climate). I have a Dell 30" monitor (3007WFP-HC) with 2560x1600 resolution, so I could be considered a likely candidate for a large super hi-res monitor. However, I'm more than happy with the image quality, and the combination of ClearType (for text) and AA (for games) deals with any visible pixellation. I'd need more than a modest boost in resolution/DPI to consider a monitor/graphics card upgrade (two cards, since I use SLI) worthwhile - an upgrade likely to cost several hundred pounds.

     The other side of the coin is that display innovation always starts small - OLEDs came to phones/MP3 players first before migrating upwards to laptops, and we're seeing the same with "retina" displays. Now fair do's to Apple, they've implemented a new piece of hardware before anyone else (with their top-to-bottom control of software and hardware, they're in the best position to), as they did with USB connectivity on the original iMac - and that will likely percolate through to the PC market in time. But the cost will be high and the benefits small, save for those involved in pro image processing. And even Apple have had to start small with this - I doubt we'll be seeing "retina" on an Apple desktop for a year or so yet.

     As for the comparison between hard discs and solid state storage, I don't know where you got your figures from, but the cost for a 4TB hard drive is currently about £0.046/GB, while the largest SSD I could find (1TB) costs £1.70/GB - roughly 37 times the cost per gigabyte (a quick check of these figures is sketched below the posts).

     In which case, how does this differ from the selectable resolution which most games offer currently? It would indeed be better to stick with resolution, as that is a term more familiar to most gamers than DPI - or, if in-game scaling is to be offered, just choosing the scaling factor/algorithm. What this seems to come down to is asking for the highest quality renders and using the display's native resolution so the game engine can control upscaling/filtering. That seems like the best approach for current and future PC systems.
  2. Getting back to the OP, for those disinclined to wade through 220 pages of Feargus' comments, the one in question can be found here. Some nice analysis, and while it saddens me to see a publisher sinking so low (effectively panhandling), it doesn't surprise me either.
  3. I'd support "BG-style" VO also. The only improvement I would suggest is having more generic phrases recorded for each NPC, for greater variety in "non-voiced" dialogue. Another benefit of partial-VO is that it makes expansions and mods easier - the first NWN2 expansion (Mask of the Betrayer) brought in 2 NPCs from the original NWN2 campaign (Jerro and Bishop) and neither worked well IMHO (Jerro was completely silent while Bishop had a different voice).
  4. While I agree that pre-rendered artwork should be done at as high a resolution as possible, focusing on DPI exclusively would be a mistake - it would result in scenery elements being rendered at the same (real) size onscreen regardless of the device (e.g. a statue appearing 5cm tall, whether the device used is a smartphone or a 30-inch monitor - a sketch of this is given below the posts). Instead, I would argue in favour of the system used by many games currently, where you specify a resolution and the game engine downsamples its images to best suit that resolution (if zoom is available, that downsampling will need to be done at the highest zoomable resolution).

     Mstark's point about upscaling resulting in a blurry picture (in most cases) is valid - but the cause of this is not the upscaling itself, but the filtering applied afterwards by the monitor or graphics card driver. Such filtering is often beneficial with non-integer upscaling (to smooth out an otherwise uneven pixel distribution) but should be optional for integer scaling. Unfortunately, most monitors and graphics drivers do not provide an option to disable or adjust such filtering. A workaround for this is for the game engine to always render at the display's native resolution and take care of upscaling itself (DOSBox is a good example of this - set its window display to native resolution and the renderer to openglnb and no filtering is applied, resulting in sharp, clear displays of older games). It can then offer a choice of scaling/filtering algorithm to suit users' tastes (some may prefer some smoothing to 2x2 or 3x3 pixel rendering) - a rough sketch of this decision is also included below the posts. Such scaling could affect performance, but the impact should be marginal, and this would seem to be the best method of "future proofing" a game for higher resolutions.

     I'm sorry to have to dredge this up from so many posts back, but to suggest a monitor manufacturer "conspiracy" against higher resolution devices is highly speculative. A more likely reason is technical - DVI-D reached its bandwidth maximum at 2560x1600x60fps, and a new video connection standard has to be agreed between monitor and graphics card manufacturers before higher resolutions can be offered (the current anointed successor to DVI-D, DisplayPort, only offers a modest boost - even the 4-lane version maxes out at 2560x1600x120fps; a back-of-the-envelope bandwidth calculation follows below). Cross-sector co-operation on new standards (sadly) tends to lag high-end consumer demand - previous examples include removable media (no standard agreed after 1.44MB floppies), expansion buses (ISA lasting 5+ years longer than it should have before VL-Bus and PCI displaced it) and, going back to displays again, SuperVGA standards (where games spent almost a decade unable to progress beyond MCGA's 320x200 256-colour graphics due to proprietary implementations of 640x480 and higher resolutions). So this is more fairly described as a screwup rather than a conspiracy - on the other hand, it has meant high-end monitors having a useful lifespan longer than any other computer peripheral.
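
A quick check of the storage cost figures quoted in the first post - a minimal Python sketch, using only the per-GB prices given there (not current prices):

    # Storage cost comparison, per-GB prices as quoted in the post above.
    hdd_price_per_gb = 0.046   # £/GB for a 4TB hard drive
    ssd_price_per_gb = 1.70    # £/GB for the largest SSD found (1TB)

    ratio = ssd_price_per_gb / hdd_price_per_gb
    print(f"SSD is roughly {ratio:.0f}x the HDD cost per GB")  # prints ~37x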
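
To illustrate the DPI point in the fourth post: if scenery is scaled for a fixed physical size rather than for the display's resolution, it occupies the same real-world height everywhere but a very different share of the screen. The pixel densities and resolutions below are illustrative assumptions, not measurements:

    # A 5 cm statue drawn at fixed physical size on two very different displays.
    def pixels_for_physical_height(height_cm, display_ppi):
        """Pixels needed to cover `height_cm` of screen at a given pixel density."""
        return round((height_cm / 2.54) * display_ppi)

    statue_cm = 5.0
    displays = [
        # (description, pixels per inch, vertical resolution) - illustrative values
        ("smartphone, ~326 PPI, 1136 px tall", 326, 1136),
        ('30" monitor, ~100 PPI, 1600 px tall', 100, 1600),
    ]

    for name, ppi, v_res in displays:
        px = pixels_for_physical_height(statue_cm, ppi)
        print(f"{name}: statue uses {px} px ({px / v_res:.0%} of screen height)")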
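
The engine-side scaling approach described in the fourth post could look something like the sketch below: use the largest whole-number factor that fits the native resolution (skipping filtering so source pixels stay as crisp 2x2 or 3x3 blocks), and fall back to a smoothing filter only when a non-integer scale is unavoidable. This is a hypothetical illustration, not code from any particular engine:

    def choose_scale(src_w, src_h, native_w, native_h, fill_screen=False):
        """Decide how to upscale a src_w x src_h render to the display's native resolution.

        Returns (scale_x, scale_y, filter_name). By default the largest integer
        factor that fits is used with nearest-neighbour (no smoothing), leaving
        borders if necessary; fill_screen=True forces a non-integer scale, where
        a smoothing filter evens out the uneven pixel distribution.
        """
        max_fit = min(native_w / src_w, native_h / src_h)
        if not fill_screen and max_fit >= 1:
            factor = int(max_fit)              # e.g. 2x2 or 3x3 pixel blocks
            return factor, factor, "nearest"
        return max_fit, max_fit, "bilinear"    # non-integer: smooth the result

    # 1280x800 art on a 2560x1600 monitor scales cleanly at 2x with no filtering;
    # stretching it to fill a 1920x1200 monitor needs a 1.5x scale plus filtering.
    print(choose_scale(1280, 800, 2560, 1600))                     # (2, 2, 'nearest')
    print(choose_scale(1280, 800, 1920, 1200, fill_screen=True))   # (1.5, 1.5, 'bilinear')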
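
And a back-of-the-envelope check of the DVI-D ceiling mentioned in the same post. Dual-link DVI-D tops out at an effective pixel clock of about 330 MHz (2 x 165 MHz); the blanking overhead below is an assumed ~9% (roughly reduced-blanking timings), so the figures are approximate:

    DUAL_LINK_DVI_MAX_MHZ = 2 * 165     # maximum effective TMDS pixel clock
    BLANKING_OVERHEAD = 1.09            # assumed reduced-blanking overhead (~9%)

    def pixel_clock_mhz(width, height, refresh_hz):
        """Approximate pixel clock (MHz) needed, including blanking intervals."""
        return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

    for hz in (60, 120):
        clock = pixel_clock_mhz(2560, 1600, hz)
        verdict = "fits" if clock <= DUAL_LINK_DVI_MAX_MHZ else "exceeds"
        print(f"2560x1600 @ {hz} Hz needs ~{clock:.0f} MHz - {verdict} dual-link DVI-D "
              f"({DUAL_LINK_DVI_MAX_MHZ} MHz limit)")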