Modern screen refresh: beyond 60Hz

One of the fun things about my new gaming setup is that it can run at refresh rates other than 60Hz. I’ve now got it set up to run at 3440×1440@75 and it seems to be working well. It won’t matter much for desktop work, but it does have an effect on game rendering. Can a human perceive the difference? Lots of folks say so, and this blind test confirms it.

The hardware I have is an NVidia GTX 1080 graphics card hooked to an Asus RoG Swift PG348Q display. It’s connected via DisplayPort 1.2, the 2009 standard. In theory there’s enough bandwidth to run 3840×2160@75, but for some reason my monitor will only go up to 60Hz unless I enable “overclocking”, after which it will display up to 100Hz. The overclocking didn’t work with the first decent-quality cable I tried, but it did work with the cable included with the monitor.
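The bandwidth claim is easy enough to sanity-check with a little arithmetic. Here’s a rough Python sketch, my numbers rather than anything official: DP 1.2’s HBR2 link carries 17.28 Gbit/s of actual pixel data after 8b/10b encoding, and I’m approximating reduced-blanking overhead instead of using exact CVT-R2 timings.

```python
# Rough check: does DisplayPort 1.2 have the bandwidth for these modes?
# DP 1.2 (HBR2, 4 lanes) is 21.6 Gbit/s raw; 8b/10b encoding leaves
# 17.28 Gbit/s for pixel data. Blanking overhead below is a rough
# approximation of CVT-R2 reduced blanking, not exact timings.

DP12_EFFECTIVE_GBPS = 17.28
BITS_PER_PIXEL = 24          # 8 bits per channel, RGB
H_BLANK = 80                 # CVT-R2 fixed horizontal blanking (pixels)
V_BLANK_FRACTION = 0.03      # ~3% vertical blanking, rough estimate

def required_gbps(h, v, hz):
    """Approximate data rate a video mode needs, in Gbit/s."""
    pixel_clock = (h + H_BLANK) * (v * (1 + V_BLANK_FRACTION)) * hz
    return pixel_clock * BITS_PER_PIXEL / 1e9

for h, v, hz in [(3440, 1440, 60), (3440, 1440, 100), (3840, 2160, 75)]:
    need = required_gbps(h, v, hz)
    verdict = "fits" if need <= DP12_EFFECTIVE_GBPS else "too much"
    print(f"{h}x{v}@{hz}: {need:.1f} Gbit/s ({verdict})")
```

By this estimate even 3440×1440@100 only needs about 12.5 Gbit/s, so the 60Hz limit without “overclocking” really is the monitor being conservative, not the link running out of room.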

The neat thing about this setup is that it doesn’t have to be a constant 60Hz or 75Hz. The hardware all supports Adaptive Sync, a technology where the graphics card tells the LCD when a new frame is ready to be displayed. So games can run at any speed (20fps, 27fps, 60fps, 75fps) and the frames display without screen tearing. This is way better than the Vsync hack we’ve been stuck with since the ancient days of CRTs. No more triple buffering required, no being awkwardly stuck at frame rates that evenly divide 60, no tearing or stuttering. Just draw a frame when it’s ready.
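To make the divisor problem concrete, here’s a toy Python model, with idealized constant frame times rather than anything like a real game loop: with double-buffered vsync on a 60Hz panel, a frame that misses a vblank waits for the next one, so the display rate snaps down to 60, 30, 20, 15. With adaptive sync the panel just refreshes when the frame lands (assuming the rate is within the monitor’s supported range).

```python
# Toy model of why double-buffered vsync snaps frame rates to divisors
# of the refresh rate, while adaptive sync shows frames when they're ready.
# Frame times are idealized as constant; real games vary frame to frame.

import math

REFRESH_HZ = 60
VBLANK_INTERVAL = 1.0 / REFRESH_HZ

def vsync_fps(render_time):
    """With vsync, a finished frame waits for the next vblank, so the
    effective rate is 60 / ceil(render_time / interval): 60, 30, 20, 15..."""
    # Small epsilon guards against float rounding pushing ceil() too high.
    intervals = max(1, math.ceil(render_time / VBLANK_INTERVAL - 1e-9))
    return REFRESH_HZ / intervals

def adaptive_sync_fps(render_time):
    """With adaptive sync the monitor refreshes when the frame arrives,
    so the display rate simply equals the render rate."""
    return 1.0 / render_time

for fps_target in [75, 60, 55, 40, 27, 20]:
    t = 1.0 / fps_target
    print(f"render {fps_target:2d}fps -> vsync shows {vsync_fps(t):.0f}fps, "
          f"adaptive sync shows {adaptive_sync_fps(t):.0f}fps")
```

A game rendering at 55fps gets knocked all the way down to 30fps under vsync; with adaptive sync it just displays at 55fps.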


Unfortunately this variable sync rate comes in two flavors, G-Sync and FreeSync, and monitors seem to support only one or the other. NVidia does G-Sync so that’s what I’m using, and fortunately the higher end monitors seem to favor G-Sync as well. G-Sync requires that an expensive NVidia scaler be installed in the monitor (and may not technically be DisplayPort Adaptive Sync at all). FreeSync is the open, cheaper standard built on DisplayPort Adaptive Sync, but so far only AMD video cards support it, and AMD is way behind. Hopefully this will all get better in a couple of years.