Everything posted by AstralWanderer

  1. Getting back to the OP, for those disinclined to wade through 220 pages of Feargus' comments, the one in question can be found here. Some nice analysis, and while it saddens me to see a publisher sinking so low (effectively panhandling), it doesn't surprise me either.
  2. I'd support "BG-style" VO also. The only improvement I would suggest is having more generic phrases recorded for each NPC, for greater variety in "non-voiced" dialogue. Another benefit of partial-VO is that it makes expansions and mods easier - the first NWN2 expansion (Mask of the Betrayer) brought in 2 NPCs from the original NWN2 campaign (Jerro and Bishop) and neither worked well IMHO (Jerro was completely silent while Bishop had a different voice).
  3. While I agree that pre-rendered artwork should be done at as high a resolution as possible, focusing on DPI exclusively would be a mistake - it would result in scenery elements being rendered at the same (real) size onscreen regardless of the device (e.g. a statue appearing 5cm tall whether the device is a smartphone or a 30-inch monitor). Instead, I would argue in favour of the system many games currently use, where you specify a resolution and the game engine downsamples its images to best suit it (if zoom is available, that downsampling will need to start from the highest zoomable resolution).

     Mstark's point about upscaling resulting in a blurry picture (in most cases) is valid - but the cause is not the upscaling itself, rather the filtering applied afterwards by the monitor or graphics card driver. Such filtering is often beneficial with non-integer upscaling (to smooth out an otherwise uneven pixel distribution) but should be optional for integer scaling. Unfortunately, most monitors and graphics drivers provide no option to disable or adjust it.

     A workaround is for the game engine to always render at the display's native resolution and handle upscaling itself (DOSBox is a good example - set its window to native resolution and the renderer to openglnb, and no filtering is applied, resulting in sharp, clear displays of older games). The engine can then offer a choice of scaling/filtering algorithm to suit users' tastes (some may prefer smoothing to plain 2x2 or 3x3 pixel rendering). Such scaling could affect performance, but the cost should be marginal, and this seems to be the best method of "future-proofing" a game for higher resolutions.

     I'm sorry to have to dredge this up from so many posts back, but suggesting a monitor-manufacturer "conspiracy" against higher-resolution devices is highly speculative. A more likely reason is technical - DVI-D reached its bandwidth maximum at 2560x1600x60fps, and a new video connection standard has to be agreed between monitor and graphics card manufacturers before higher resolutions can be offered (the current anointed successor to DVI-D, DisplayPort, offers only a modest boost - even the 4-lane version maxes out at 2560x1600x120fps). Cross-sector co-operation on new standards sadly tends to lag high-end consumer demand - previous examples include removable media (no standard agreed after 1.44MB floppies), expansion buses (ISA lasting 5+ years longer than it should have before VL-Bus and PCI displaced it) and, returning to displays, SuperVGA standards (where games spent almost a decade unable to progress beyond MCGA's 320x200 256-colour graphics due to proprietary implementations of 640x480 and higher resolutions). So this is more fairly described as a screwup than a conspiracy - on the other hand, it has meant high-end monitors having a useful lifespan longer than any other computer peripheral.
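The engine-side upscaling described in point 3 - pixel replication with no filtering, like DOSBox's openglnb output - can be sketched as follows. This is a minimal illustration, not any particular engine's code; the function name and flat-list pixel representation are my own for the example.

```python
def upscale_nn(pixels, width, factor):
    """Integer nearest-neighbour upscale of a flat, row-major pixel list.

    Each source pixel becomes a factor x factor block of identical pixels,
    so edges stay perfectly sharp - no smoothing filter is applied.
    """
    height = len(pixels) // width
    out = []
    for y in range(height):
        row = pixels[y * width:(y + 1) * width]
        # Repeat each pixel horizontally...
        scaled_row = [p for p in row for _ in range(factor)]
        # ...then repeat the whole row vertically.
        out.extend(scaled_row * factor)
    return out

# A 2x1 "image" scaled 2x becomes 4x2:
print(upscale_nn([1, 2], 2, 2))  # -> [1, 1, 2, 2, 1, 1, 2, 2]
```

An engine taking this approach could swap in a smoothing filter at the same point for users who prefer it to plain 2x2 or 3x3 blocks.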
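The DVI-D bandwidth ceiling mentioned above can be checked with back-of-envelope arithmetic. The figures used are the commonly quoted ones (dual-link DVI-D tops out at a 2x165 MHz pixel clock); the ~10% blanking overhead is an assumption roughly in line with reduced-blanking timings.

```python
def pixel_rate(width, height, fps, blanking_overhead=1.10):
    """Approximate pixel clock required, including ~10% blanking overhead
    (an assumed figure, roughly matching reduced-blanking timings)."""
    return width * height * fps * blanking_overhead

# Dual-link DVI-D: two links, each with a 165 MHz maximum pixel clock.
DUAL_LINK_DVI_MAX = 2 * 165e6  # pixels per second

need = pixel_rate(2560, 1600, 60)
print(f"2560x1600@60 needs ~{need / 1e6:.0f} Mpx/s "
      f"of {DUAL_LINK_DVI_MAX / 1e6:.0f} Mpx/s available")
```

2560x1600@60 lands around 270 Mpx/s against a 330 Mpx/s ceiling - close enough that doubling either the resolution or the refresh rate pushes past the link's limit, which is why a successor standard was needed.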