Posted

There is no option to disable vertical sync in the loader (where you select the resolution) or in the Graphics menu in-game. Please add an option to disable vertical sync, as it adds mouse lag and is overall annoying.

 

Is there any way we can disable it in a config file or somewhere for the time being?

 

Thanks in advance

 

vsync.jpg

 

vsync2.jpg

 

 

  • Like 9
Posted (edited)

Perhaps the option should indeed be there, but it should be on by default. Most players should have the game running at their display's refresh rate as it's not graphically demanding, so they'll benefit from the absence of tearing. Vsync adds no mouse lag if you're producing more frames per second than the refresh rate.

Edited by Zeckul
Posted (edited)

Yes it does. I've been a top-level competitive FPS player for 10 years (retired recently), and no one plays with vertical sync on, because even if you are getting 333 FPS and have vsync on at 120 Hz, there will still be mouse lag.

 

I don't care what the default setting is, as long as you can turn it off.

Edited by Sensuki
  • Like 1
Posted (edited)

Unless the game engine is doing some weird buffering, you should always be seeing the most recent frame. Your display can't be any closer to real-time than its refresh period (well, half of it on average), so you can't get more up-to-date info than that.

 

Another explanation for added latency with vsync would be that the game engine ties the simulation step to the graphics frame rate, but that is uncommon these days. In and of itself, vsync is not a mechanism that adds latency.
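To put rough numbers on this (a toy model with idealized assumptions, not measurements from any game): if the renderer outpaces the display and the swap always shows the newest completed frame, the worst-case input-to-photon delay is bounded by roughly one render time plus one refresh period.

```python
# Toy model: worst-case input-to-display latency with vsync on, assuming the
# newest completed frame is always shown at each refresh (idealized numbers).

def worst_case_latency_ms(refresh_hz: float, fps: float) -> float:
    refresh_period = 1000.0 / refresh_hz  # ms between vblanks
    render_time = 1000.0 / fps            # ms to produce one frame
    # An input event arriving just after a frame started rendering waits up to
    # one render time to be captured, then at most one refresh for scan-out.
    return render_time + refresh_period

# The "333 FPS at 120 Hz" example from above: about 3.0 + 8.3 = 11.3 ms.
print(round(worst_case_latency_ms(120, 333), 1))  # → 11.3
```

In this model the bound shrinks as FPS rises, which is why vsync lag is hard to notice when the frame rate is far above the refresh rate.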

Edited by Zeckul
Posted (edited)

Unless the game engine is doing some weird buffering, you should always be seeing the most recent frame. Your display can't be any closer to real-time than its refresh period (well, half of it on average), so you can't get more up-to-date info than that.

 

Another explanation for added latency with vsync would be that the game engine ties the simulation step to the graphics frame rate, but that is uncommon these days. In and of itself, vsync is not a mechanism that adds latency.

 

V-Sync always introduces at least a single frame of lag, since it requires at least double buffering to function. Some games render multiple frames in advance, which only compounds the problem. Further, some games have seemingly terrible implementations of V-Sync that cause inexcusable amounts of lag.

 

Add to that the tendency for V-Sync to cause extreme variations in FPS, and you get some serious issues.

Edited by CatatonicMan
Posted (edited)

You can always turn off V-Sync globally with the AMD Catalyst Control Center or the Nvidia Control Panel (I think the latter can also turn it off per application) in the meantime, but I guess you already knew that.

Edited by Lychnidos
  • Like 1
Posted

 

Unless the game engine is doing some weird buffering, you should always be seeing the most recent frame. Your display can't be any closer to real-time than its refresh period (well, half of it on average), so you can't get more up-to-date info than that.

 

Another explanation for added latency with vsync would be that the game engine ties the simulation step to the graphics frame rate, but that is uncommon these days. In and of itself, vsync is not a mechanism that adds latency.

 

V-Sync always introduces at least a single frame of lag, since it requires at least double buffering to function. Some games render multiple frames in advance, which only compounds the problem. Further, some games have seemingly terrible implementations of V-Sync that cause inexcusable amounts of lag.

 

Add to that the tendency for V-Sync to cause extreme variations in FPS, and you get some serious issues.

All games use double buffering (at least), regardless of VSync. The only difference VSync makes is whether the graphics card waits for the next refresh to flip the front and back buffers; without it, the card may flip them in the middle of a refresh, creating an incoherent picture. A properly implemented buffer queue doesn't add latency even with multiple buffers (the most recent one is always selected), but some games do have terrible implementations. I would assume Unity does it properly.
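The "most recent one is always selected" policy can be sketched in a few lines (a hypothetical structure for illustration; real swap chains live in the driver and display hardware):

```python
# Sketch of a "most recent frame wins" policy for an N-deep buffer queue.
# Older completed frames are discarded at vblank, so queue depth by itself
# adds no latency under this policy.
from collections import deque

class SwapChain:
    def __init__(self, depth: int = 2):
        # Completed back buffers, oldest first; deque(maxlen=...) silently
        # drops the oldest entry when the renderer outruns the display.
        self.ready = deque(maxlen=depth)

    def present(self, frame):
        # Renderer finished a frame: queue it for display.
        self.ready.append(frame)

    def on_vblank(self):
        # Display refresh: flip to the newest completed frame and discard
        # anything staler. Returns None if no frame is ready yet.
        if not self.ready:
            return None
        newest = self.ready.pop()
        self.ready.clear()
        return newest

chain = SwapChain(depth=3)
for f in ("frame_1", "frame_2", "frame_3"):
    chain.present(f)
print(chain.on_vblank())  # → frame_3 (the two older frames are skipped)
```

A "terrible implementation" in this framing would be one that displays the *oldest* queued frame instead, which is where deep buffering turns into real added latency.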

Posted

I find more often than not vsync is a must for Linux gaming, but not on my Windows box. Turn it off or give me the option to.

Posted

I suspect that the Pillars of Eternity Configuration window won't be in the final version; that is just a Unity Engine thing.

Posted

You can always turn off V-Sync globally with the AMD Catalyst Control Center or the Nvidia Control Panel (I think the latter can also turn it off per application) in the meantime, but I guess you already knew that.

Vsync is set to always off in my CCC, but it's not working. Games do not always use the settings in your video card's control panel.

Posted (edited)

 

All games use double buffering (at least), regardless of VSync. The only difference VSync makes is whether the graphics card waits for the next refresh to flip the front and back buffers; without it, the card may flip them in the middle of a refresh, creating an incoherent picture. A properly implemented buffer queue doesn't add latency even with multiple buffers (the most recent one is always selected), but some games do have terrible implementations. I would assume Unity does it properly.

 

 

Vsync will still result in at least a frame of lag precisely because it has to wait.

 

But yes, the induced frame lag should be the same for N-buffering, when N > 1, assuming they don't do it stupidly.

Edited by CatatonicMan
Posted

I think I saw that Brian added this sometime this week.

 

But on a side note, our mouse cursor is not an in-engine rendered cursor. It uses the hardware Windows cursor and just changes which cursor visuals are displayed. This is done to ensure that the mouse feels very responsive, like it does when you are just doing other normal stuff in Windows.

  • Like 3

Twitter: @robyatadero

Posted (edited)

I think I saw that Brian added this sometime this week.

 

But on a side note, our mouse cursor is not an in-engine rendered cursor. It uses the hardware Windows cursor and just changes which cursor visuals are displayed. This is done to ensure that the mouse feels very responsive, like it does when you are just doing other normal stuff in Windows.

 

Yeah, I know that (hardware cursors ftw). However, like I said, as a past competitive FPS gamer I can feel my mouse float even when it's only slight ;)

 

I've even tweaked my BIOS for mouse responsiveness and reduced DPI latency :p

 

That said, Pillars of Eternity does feel *a lot less* laggy than most other games with Vsync turned on, so the implementation of it is well done.

Edited by Sensuki
  • 3 weeks later...
Posted (edited)

 

 

All games use double buffering (at least), regardless of VSync. The only difference VSync makes is whether the graphics card waits for the next refresh to flip the front and back buffers; without it, the card may flip them in the middle of a refresh, creating an incoherent picture. A properly implemented buffer queue doesn't add latency even with multiple buffers (the most recent one is always selected), but some games do have terrible implementations. I would assume Unity does it properly.

 

 

Vsync will still result in at least a frame of lag precisely because it has to wait.

 

But yes, the induced frame lag should be the same for N-buffering, when N > 1, assuming they don't do it stupidly.

Late reply, but: the "lag" you're talking about is the intrinsic refresh period of the monitor. You can't do better than that; you cannot be more on time than the next refresh of the monitor. If you don't want to wait and you turn VSync off, your frame might not be displayed at all, or only partially. Part of the actual picture you're seeing will still be lagging one refresh period behind. It's not true that you're removing one frame of latency there; you're creating an incoherent picture that's partially late and partially up to date, leading to the infamous tearing artifacts (the tearing line is the division between the frame that's up to date and the frame that's late). At best you could say, in a sense, that you're removing 0.5 frame of latency on average, but there's no guarantee of that.

 

The only way to properly refresh without waiting on the monitor is to reverse the relationship and have the monitor wait on the GPU, which is what G-Sync and Adaptive-Sync do. With these you actually do get a real frame less of latency.
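Putting the averages above into numbers (an idealized 60 Hz model, assuming frames finish at uniformly random points within the refresh period; not measurements):

```python
# Idealized average added latency at 60 Hz for the three cases discussed.

REFRESH_HZ = 60
period_ms = 1000.0 / REFRESH_HZ  # ~16.67 ms per refresh

avg_added_latency_ms = {
    # A finished frame waits, on average, half a refresh for the next vblank.
    "vsync_on": period_ms / 2,
    # No wait, but the picture is incoherent: only the part of the screen
    # below the tear line is up to date, so the ~0.5 frame "saved" is an
    # average with no guarantee for any given frame.
    "vsync_off_tearing": 0.0,
    # The monitor waits on the GPU instead, so the whole frame scans out
    # coherently as soon as it is ready (below the display's maximum rate).
    "gsync_adaptive_sync": 0.0,
}

print(round(avg_added_latency_ms["vsync_on"], 2))  # → 8.33
```

The point of the model: variable-refresh displays get the zero-wait behavior of tearing without sacrificing a coherent picture.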

Edited by Zeckul
Posted

Late reply, but: the "lag" you're talking about is the intrinsic refresh period of the monitor. You can't do better than that; you cannot be more on time than the next refresh of the monitor. If you don't want to wait and you turn VSync off, your frame might not be displayed at all, or only partially. Part of the actual picture you're seeing will still be lagging one refresh period behind. It's not true that you're removing one frame of latency there; you're creating an incoherent picture that's partially late and partially up to date, leading to the infamous tearing artifacts (the tearing line is the division between the frame that's up to date and the frame that's late). At best you could say, in a sense, that you're removing 0.5 frame of latency on average, but there's no guarantee of that.

 

The only way to properly refresh without waiting on the monitor is to reverse the relationship and have the monitor wait on the GPU, which is what G-Sync and Adaptive-Sync do. With these you actually do get a real frame less of latency.

 

 

That partial refresh is what I was referring to, assuming the V-Sync implementation itself isn't adding any latency or dropping the refresh rate to keep up.

 

G-Sync and its equivalents don't change the screen-drawing delay, so they couldn't have lower latency at a given refresh rate than an equivalent display that can draw partial updates. Their advantage is that they can work at arbitrary refresh rates. As such, they'll perform better as long as the monitor isn't maxed out.
