Wouldn’t the latter give you lower input lag and better gameplay even though the frame rate is lower?

  • brunomarquesbr · 11 months ago

    Ok, let’s talk frame times first:

    1s / 90Hz ≈ 11.1ms

    1s / 60Hz ≈ 16.7ms

    1s / 45Hz ≈ 22.2ms
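
    If you want to double-check the arithmetic yourself, here's a quick Python sketch of the same calculation (the function name is just for illustration):

    ```python
    # Frame time in milliseconds for a given refresh/frame rate.
    def frame_time_ms(rate_hz: float) -> float:
        return 1000.0 / rate_hz

    for rate in (90, 60, 45):
        print(f"{rate} Hz -> {frame_time_ms(rate):.1f} ms")
    # 90 Hz -> 11.1 ms
    # 60 Hz -> 16.7 ms
    # 45 Hz -> 22.2 ms
    ```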

    Vsync introduces a lag between the frame being ready and being displayed on the screen. This means input lag depends heavily on the screen refresh rate: the higher the refresh rate, the lower the vsync lag. On top of that vsync lag we need to add at least one frame time, which is the minimum required for us to see the new frame. So it goes like this:

    | Screen / game | Vsync max input lag + single frame time | Total input lag ^((excluding game engine/screen pixel response time)) |
    |---|---|---|
    | 60Hz / 60fps | 16.7ms + 16.7ms | ~33ms |
    | 90Hz / 45fps | 11.1ms + 22.2ms | ~33ms |
    | 45Hz / 45fps | 22.2ms + 22.2ms | ~44ms |
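
    The table is just "worst-case wait for the next refresh, plus one game frame". A minimal Python sketch of that simplified model (it ignores game engine latency and pixel response time, per the note above):

    ```python
    # Simplified vsync input-lag model: worst-case wait for the next screen
    # refresh, plus one full game frame time. Ignores engine and panel latency.
    def input_lag_ms(screen_hz: float, game_fps: float) -> float:
        vsync_wait = 1000.0 / screen_hz   # worst case: frame ready just after a refresh
        frame_time = 1000.0 / game_fps    # one full game frame until the new image
        return vsync_wait + frame_time

    for screen_hz, game_fps in ((60, 60), (90, 45), (45, 45)):
        print(f"{screen_hz}Hz / {game_fps}fps -> ~{input_lag_ms(screen_hz, game_fps):.0f}ms")
    # 60Hz / 60fps -> ~33ms
    # 90Hz / 45fps -> ~33ms
    # 45Hz / 45fps -> ~44ms
    ```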

    The input lag for 60/60 and 90/45 is identical. Some game engines also work better at higher fps (their simulation is tied to the render frequency), so 60/60 has a slight advantage over 90/45 in overall input lag.

    In the same ~66ms window, 60/60 shows 4 frames while 90/45 presents only 3, so 60/60 is smoother.
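
    The same count in a small sketch (window of four 60Hz refreshes, ~66.7ms):

    ```python
    # Frames actually presented in a ~66.7 ms window (four refreshes at 60 Hz).
    window_ms = 4 * (1000.0 / 60)
    for fps in (60, 45):
        frames = int(window_ms // (1000.0 / fps))
        print(f"{fps} fps -> {frames} frames in {window_ms:.1f} ms")
    # 60 fps -> 4 frames in 66.7 ms
    # 45 fps -> 3 frames in 66.7 ms
    ```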

    90/45 uses less battery and is generally more stable, as it leaves more headroom to absorb hiccups in frame timing.

    In conclusion: 60/60 is preferred in fast-paced/action games (like shooters or racing games), but 90/45 is better in slower-paced games, especially open worlds with big variance between scenes (like RDR2 or Spider-Man).