Sensuki Posted August 18, 2014
There is no option to disable vertical sync in the loader (where you select the resolution) or in the Graphics menu in the game. Please add an option to disable vertical sync, as it adds mouse lag and is overall annoying. Is there any way we can disable it in a config file or somewhere else for the time being? Thanks in advance.
Zeckul Posted August 19, 2014
Perhaps the option should indeed be there, but it should be on by default. Most players should have the game running at their display's refresh rate, as it's not graphically demanding, so they'll benefit from the absence of tearing. VSync adds no mouse lag if you're producing more frames per second than the refresh rate.
Sensuki Posted August 19, 2014
Yes it does. I've been a top-level competitive FPS player for 10 years (recently retired), and no one plays with vertical sync on, because even if you are getting 333 FPS and have VSync on at 120 Hz, there will still be mouse lag. I don't care what the default setting is, as long as you can turn it off.
Zeckul Posted August 19, 2014
Unless the game engine is doing some weird buffering, you should always be seeing the most recent frame. Your display can't be any closer to real time than its refresh period (well, half of it on average), so you can't get more up-to-date information than that. Another explanation for added latency with VSync would be that the game engine ties the simulation step to the graphics frame rate, but that is uncommon these days. In and of itself, VSync is not a mechanism that adds latency.
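To make the "refresh period, half of it on average" point concrete, here is a minimal Python sketch (illustrative numbers only, not from the thread): a fixed-rate display cannot show anything sooner than its next refresh, so an update waits, on average, about half a refresh period before it can appear.

```python
# Illustrative sketch (not from the thread): the display-side latency floor.
# A monitor at a fixed refresh rate cannot show a new image before its next
# refresh, so on average an update waits about half a refresh period.

def display_latency_floor(refresh_hz: float) -> tuple[float, float]:
    """Return (refresh period, average wait until the next refresh) in ms."""
    period_ms = 1000.0 / refresh_hz
    return period_ms, period_ms / 2.0

for hz in (60, 120, 144):
    period, avg_wait = display_latency_floor(hz)
    print(f"{hz:>3} Hz: refresh period {period:5.2f} ms, "
          f"average wait for next refresh {avg_wait:5.2f} ms")
```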
CatatonicMan Posted August 19, 2014
Quoting Zeckul: "In and of itself, VSync is not a mechanism that adds latency."
V-Sync always introduces at least a single frame of lag, since it requires at least double buffering to function. Some games render multiple frames in advance, which only compounds the problem. Further, some games have seemingly terrible implementations of V-Sync that cause inexcusable amounts of lag. Add to that the tendency for V-Sync to cause extreme variations in FPS, and you get some serious issues.
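A rough sketch of the "renders multiple frames in advance" effect, with assumed numbers rather than measurements of any particular game: each frame already queued ahead of the display holds input roughly one more refresh period away from the screen.

```python
# Illustrative sketch (assumed numbers): added input-to-display delay from a
# render-ahead queue with VSync on. Each queued frame is held for one full
# refresh period, so a deeper queue means proportionally more lag.

def queued_frame_latency_ms(refresh_hz: float, frames_queued: int) -> float:
    """Approximate latency added by frames waiting in the present queue."""
    return frames_queued * (1000.0 / refresh_hz)

for queued in (1, 2, 3):  # 1 = plain double buffering, 3 = deep render-ahead
    print(f"{queued} queued frame(s) at 60 Hz: "
          f"~{queued_frame_latency_ms(60, queued):.1f} ms of added lag")
```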
Lychnidos Posted August 19, 2014
You can always turn off V-Sync globally with the AMD Catalyst Control Center or the Nvidia Control Panel (I think the latter can also turn it off per application) in the meantime, but I guess you already knew that.
Zeckul Posted August 19, 2014
Quoting CatatonicMan: "V-Sync always introduces at least a single frame of lag, since it requires at least double buffering to function."
All games use double buffering (at least), regardless of VSync. The only difference VSync makes is whether the graphics card waits for the next refresh to flip the front and back buffers; without it, the flip may happen in the middle of a refresh, creating an incoherent picture. A properly implemented buffer queue doesn't result in added latency even with multiple buffers (the most recent one is always selected), but some games have terrible implementations indeed. I would assume Unity does it properly.
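Here is a small Python simulation of the "most recent one is always selected" policy Zeckul describes (assumed timings, not Unity's actual implementation): because only the newest completed frame is ever shown, the age of the displayed image at each refresh depends on render and refresh timing, not on how many buffers sit in the queue.

```python
# Illustrative sketch (assumed timings, not Unity's implementation): with a
# "show the most recently completed frame" policy, the image's age at each
# refresh does not grow with the number of back buffers, because older
# completed frames are simply skipped.

REFRESH_HZ = 60.0
RENDER_TIME = 1.0 / 200.0            # the renderer is faster than the display
REFRESH_PERIOD = 1.0 / REFRESH_HZ

def frame_age_at_vblank(vblank_time: float) -> float:
    """Age (seconds) of the newest completed frame at a vblank, measured from
    the moment that frame started rendering (when its input was sampled)."""
    frames_completed = int(vblank_time / RENDER_TIME)   # completions at k * RENDER_TIME
    newest_completion = frames_completed * RENDER_TIME
    newest_start = newest_completion - RENDER_TIME
    return vblank_time - newest_start

ages = [frame_age_at_vblank((m + 1) * REFRESH_PERIOD) for m in range(120)]
print(f"average image age at refresh: {1000 * sum(ages) / len(ages):.1f} ms "
      f"(the buffer count never enters the calculation)")
```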
pnhearer Posted August 19, 2014
I find that more often than not VSync is a must for Linux gaming, but not on my Windows box. Turn it off or give me the option to.
vril Posted August 19, 2014
I suspect the Pillars of Eternity configuration window won't be in the final version; that's just a Unity engine thing.
Sensuki Posted August 20, 2014
Quoting Lychnidos: "You can always turn off V-Sync globally with the AMD Catalyst Control Center or the Nvidia Control Panel."
VSync is set to "always off" in my CCC, but it's not working. Games do not always use the settings in your video card's control panel.
CatatonicMan Posted August 21, 2014
Quoting Zeckul: "A properly implemented buffer queue doesn't result in added latency even with multiple buffers (the most recent one is always selected)."
VSync will still result in at least a frame of lag precisely because it has to wait. But yes, the induced frame lag should be the same for N-buffering, when N > 1, assuming they don't do it stupidly.
Derek M Posted August 22, 2014
Hey all, not to fear: this setting will be implemented in the near future. Please look forward to it. As always, thanks for the feedback, we appreciate it!
Tartantyco Posted August 22, 2014
It's a developer! Quick, catch him before he gets away!
Roby Atadero Posted August 22, 2014
I think I saw that Brian added this sometime this week. On a side note, our mouse cursor is not an in-engine rendered mouse cursor. It uses the hardware Windows cursor and just changes which cursor visuals are displayed. This is done to ensure that the mouse feels very responsive, like it does when you are doing other normal stuff in Windows.
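A back-of-the-envelope sketch of why the hardware cursor helps (assumed, rough numbers, not Obsidian's code): a cursor drawn into the game frame inherits the whole render-plus-VSync pipeline delay, while a hardware cursor overlay is positioned by the OS independently of the game's frame queue.

```python
# Illustrative sketch (assumed, rough numbers): perceived cursor delay when the
# cursor is composited into the rendered frame versus overlaid in hardware.
# A software cursor inherits the frame pipeline; a hardware cursor does not.

REFRESH_PERIOD_MS = 1000.0 / 60.0    # 60 Hz display assumed

def software_cursor_delay_ms(frames_in_flight: int) -> float:
    """Cursor baked into the frame: waits for the frame pipeline plus scanout."""
    return frames_in_flight * REFRESH_PERIOD_MS + REFRESH_PERIOD_MS / 2.0

def hardware_cursor_delay_ms() -> float:
    """Cursor overlaid at scanout: only the wait for the next refresh remains."""
    return REFRESH_PERIOD_MS / 2.0

print(f"software cursor, 2 frames in flight: ~{software_cursor_delay_ms(2):.1f} ms")
print(f"hardware cursor overlay:             ~{hardware_cursor_delay_ms():.1f} ms")
```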
Sensuki Posted August 22, 2014
Quoting Roby Atadero: "It uses the hardware Windows cursor and just changes which cursor visuals are displayed."
Yeah, I know that (hardware cursors ftw). However, like I said, as a past competitive FPS gamer I can feel my mouse float even when it's only slight. I've even tweaked my BIOS for mouse responsiveness and reduced DPI latency. That said, Pillars of Eternity does feel *a lot less* laggy than most other games with VSync turned on, so the implementation of it is well done.
Zeckul Posted September 8, 2014
Quoting CatatonicMan: "VSync will still result in at least a frame of lag precisely because it has to wait."
Late reply, but the "lag" you're talking about is the intrinsic refresh period of the monitor. You can't do better than that; you cannot be more on time than the next refresh of the monitor. If you don't want to wait and you turn VSync off, your frame might not be displayed at all, or only partially. Part of the picture you're seeing will still be lagging one refresh period behind. It's not true that you're removing one frame of latency there; you're creating an incoherent picture that's partially late and partially up to date, which leads to the infamous tearing artifacts (the tearing line is the division between the frame that's up to date and the frame that's late). At best you could say, in a sense, that you're removing 0.5 frame of latency on average, but there's no guarantee of that. The only way to properly refresh without waiting on the monitor is to reverse the algorithm and have the monitor wait on the GPU, which is what G-Sync and Adaptive-Sync do. With those you actually do get one frame less latency.
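The "0.5 frame on average" claim can be checked with a tiny simulation under an idealized model (one new frame finishing at a uniformly random point in each scanout; not a measurement of any real game): only the part of the screen below the tear line gets the new frame early, and averaged over many refreshes the saving comes out to about half a refresh period.

```python
# Illustrative sketch (idealized model, not a measurement): with VSync off, a
# buffer flip can land anywhere within the scanout. Pixels below the tear line
# show the new frame one refresh earlier than they would with VSync; pixels
# above it still show the old frame. The screen-averaged saving per refresh is
# therefore (fraction below the tear) * (refresh period).

import random

random.seed(0)
REFRESH_PERIOD_MS = 1000.0 / 60.0

savings = []
for _ in range(100_000):
    flip_point = random.random()        # where in the scanout the flip lands (0..1)
    new_fraction = 1.0 - flip_point     # share of the screen that gets the new frame
    savings.append(new_fraction * REFRESH_PERIOD_MS)

print(f"average latency saved by tearing: ~{sum(savings) / len(savings):.1f} ms "
      f"(about half of the {REFRESH_PERIOD_MS:.1f} ms refresh period)")
```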
CatatonicMan Posted September 8, 2014
Quoting Zeckul: "At best you could say, in a sense, that you're removing 0.5 frame of latency on average, but there's no guarantee of that."
That partial refresh is what I was referring to, assuming the V-Sync implementation itself isn't adding any latency or dropping the refresh rate to keep up. G-Sync and its equivalents don't change the screen drawing delay, so they couldn't have lower latency at a given refresh rate than an equivalent display that can draw partial updates. Their advantage is that they can work at arbitrary refresh rates. As such, they'll perform better as long as the monitor isn't maxed out.
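A final sketch of the "arbitrary refresh rates" advantage, using assumed timings: with fixed-rate VSync a finished frame waits for the next vblank, so presentation snaps to multiples of the refresh period, while a variable-refresh display (G-Sync, Adaptive-Sync) scans the frame out as soon as it is ready, within the panel's supported range.

```python
# Illustrative sketch (assumed timings): fixed-rate VSync quantizes presentation
# to multiples of the refresh period, while adaptive sync presents a frame as
# soon as it finishes (assuming the frame time is within the panel's range).

import math

REFRESH_PERIOD_MS = 1000.0 / 60.0

def fixed_vsync_present_ms(render_ms: float) -> float:
    """The frame waits for the next vblank after rendering finishes."""
    return math.ceil(render_ms / REFRESH_PERIOD_MS) * REFRESH_PERIOD_MS

def adaptive_sync_present_ms(render_ms: float) -> float:
    """The panel refreshes when the frame is ready."""
    return render_ms

for render_ms in (10.0, 20.0, 25.0):
    print(f"render {render_ms:4.1f} ms -> fixed VSync presents at "
          f"{fixed_vsync_present_ms(render_ms):5.1f} ms, "
          f"adaptive sync at {adaptive_sync_present_ms(render_ms):5.1f} ms")
```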