I’ve been playing a lot of Overwatch on PS4. It’s fun! It also feels super responsive and reliable on the network; I almost never perceive lag. Blizzard is uniquely good at building games like this, so it’s no surprise, but what are they doing to make an Internet game feel like real time?
This video from Blizzard developers is good info (but outdated; keep reading). It goes into detail on server simulation, client prediction, and perceived lag. I think they’re mostly doing what other games have done in the past, but there are a lot of choices and tradeoffs to make. One choice they say is new: they assume most clients have reliable, fast networking, which means they don’t need to buffer nearly as much. Specifically, they say that in most games you may be buffered 4 ticks (or 80ms), but if they think your network is good they’ll buffer you just 1 tick (20ms), so it feels even more real time.
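The buffering tradeoff is easy to see with a little arithmetic. This is just a sketch of the idea; the function names and the good/bad network split are mine, and the only numbers from the video are the 4-tick (~80ms) and 1-tick (~20ms) figures:

```python
# Sketch of the command-buffering tradeoff described in the video.
# TICK_MS is implied by the video's "4 ticks = 80ms" figure.
TICK_MS = 20

def command_buffer_ticks(network_is_good: bool) -> int:
    """How many ticks of client input the server holds before applying them."""
    # A deep buffer absorbs jitter (late packets still arrive in time);
    # a shallow buffer assumes packets show up on schedule almost every tick.
    return 1 if network_is_good else 4

def added_latency_ms(network_is_good: bool) -> int:
    """Extra input latency introduced purely by buffering."""
    return command_buffer_ticks(network_is_good) * TICK_MS

print(added_latency_ms(False))  # 80 (conservative buffering)
print(added_latency_ms(True))   # 20 (assumes a reliable, fast link)
```

The bet here is that most players have links stable enough that the jitter protection of the deep buffer isn’t worth 60ms of extra latency.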
The server runs the simulation at 62.5Hz. When that video was posted, clients got network updates at 20.8Hz. In August they pushed an update so PC clients get updates at 62.5Hz. PS4 and XBone also got the faster tick rate on September 13. The rate is adaptive, so your client will drop back to a lower rate if you have low bandwidth.
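In spirit, the adaptive part looks something like this. The two rates are the ones mentioned above; the bandwidth threshold and the decision logic are my own guesses, since Blizzard hasn’t published how the fallback actually works:

```python
# Hypothetical sketch of an adaptive client update rate.
# Rates are the two publicly known ones; the threshold is made up.
FULL_RATE_HZ = 62.5      # current update rate on PC/PS4/XBone
FALLBACK_RATE_HZ = 20.8  # the original, lower update rate

def update_rate_hz(estimated_downlink_kbps: float,
                   needed_kbps_at_full_rate: float = 300.0) -> float:
    """Drop to the lower rate when the link can't sustain the full one."""
    # needed_kbps_at_full_rate is a placeholder figure, not a real number.
    if estimated_downlink_kbps >= needed_kbps_at_full_rate:
        return FULL_RATE_HZ
    return FALLBACK_RATE_HZ
```

A real implementation would presumably measure packet loss and jitter too, and step down gradually rather than flipping between two fixed rates.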
The game’s protocol runs over UDP. I’m not sure about current bandwidth usage. Back in the 20.8Hz days, this post claimed it’s about 60-100 Kbps in each direction. So maybe triple that now? Bandwidth consumption was the reason Blizzard originally went with the lower tick rate.
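“Maybe triple” comes from the ratio of the two update rates. This is back-of-the-envelope only, and it assumes the per-update payload stays roughly constant, which it probably doesn’t (with delta-compressed state, less changes between closer-spaced updates):

```python
# Rough scaling of the old bandwidth figures to the new update rate.
OLD_RATE_HZ, NEW_RATE_HZ = 20.8, 62.5
old_kbps_range = (60, 100)  # figures claimed for the 20.8Hz era

scale = NEW_RATE_HZ / OLD_RATE_HZ  # about 3.0
new_kbps_range = tuple(round(k * scale) for k in old_kbps_range)
print(new_kbps_range)  # roughly (180, 300) Kbps, an upper-bound guess
```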
I’m really curious why it’s 62.5Hz and not the obvious 60Hz. Maybe you want the network updates to be just a bit faster than the screen updates for smoothness?
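One other observation (my speculation, not anything Blizzard has said): 62.5Hz is exactly what you get if you want the tick length to be a whole number of milliseconds, which is convenient for a fixed-timestep simulation.

```python
# Tick period at each candidate rate, in milliseconds.
print(1000 / 62.5)  # 16.0 -- an exact integer tick length
print(1000 / 60)    # 16.666... -- doesn't divide evenly
```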