This was going to be a Twitter reply but I knew it would end up being way too long and I’d get annoyed that I was limited to text.
Thanks to the effort of a lot of dedicated members of our community, we finally have a large amount of objective, empirical data on how different controllers perform in terms of input lag. This has enabled many users to make informed decisions as to which controllers would be best for them given their tolerance for latency. One of the metrics often shared to this end is standard deviation. For anyone even vaguely familiar with standard deviation, this makes intuitive sense. Who cares if the lag on a controller is low on average if one out of three inputs comes way later than the other two? Consistency is key because we can adjust to constant delays. While this is true for sufficiently large amounts of variance (like what we might experience on some fixed-pixel displays), it starts to break down when discussing controller latency.
We have to keep in mind that even though games are designed to look and feel continuous, they are computed and displayed in discrete frames. Because of this, most games (I’m hedging a bit here, it’s probably very close to all) only poll inputs once per frame and then use those inputs to calculate game information and update the next frame. A natural consequence of this is that inputs arriving at different times will have exactly the same effect on the game as long as they both arrive before the next polling interval. Differences only start to crop up when one input misses one or more polling intervals and the other doesn’t. Since these events are discrete and finite, if we assume that the time between an input and the next polling interval is random and uniform, each controller has a set probability of being read on a given frame. If we know the distribution of possible delays for a controller, we can calculate the probability that that controller will be read on the same frame as a perfectly lag-free controller would be. This is the same frame probability (SFP). Old consoles could output roughly 60 frames per second (varies very slightly by console), so each polling interval is ~16.667 milliseconds (\(\frac{1000}{60}\)). This has some important consequences for any controller that never has a delay larger than that number. Let’s consider two controllers that have the following latency distributions:
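(If you’d like to recreate this figure yourself, the two density curves only take a few lines of base R; the parameters are the same scaled beta distributions used in the simulation code below.)

# Recreate the two latency density curves
x <- seq(0, 18, length.out = 500)
# Delays are scaled beta draws (delay = beta * 15 + 1), so undo the
# shift/scale before calling dbeta and divide by 15 to rescale the density
controller_1 <- dbeta((x - 1) / 15, 2, 5) / 15
controller_2 <- dbeta((x - 1) / 15, 16, 40) / 15
plot(x, controller_2, type = "l", col = "blue",
     xlab = "Latency (ms)", ylab = "Density")
lines(x, controller_1, col = "darkgreen")
# Dotted red line marking the length of one frame
abline(v = 1000 / 60, col = "red", lty = 3)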
The dotted red line is the length of one frame. These two controllers have the exact same mean latency (5.2857143 ms) but very different standard deviations (2.3957871 ms vs 0.8975441 ms). Imagine pressing a button on one of these controllers. The SFP for this press is the probability that the random latency of that press (sampled from its distribution above) will be smaller than the time until the next polling interval (assumed to be uniformly distributed between 0 and 16.667 ms). There is a way to calculate this value exactly, but it’s quite technical (lots of calculus) and I don’t think it will be very impactful. It will be more intuitive (and still very accurate) to just simulate 1,000,000 button presses and polling interval times and compare the two. The following code does just that.
# Set random seed so results are always the same
# (Feel free to change this if you don't trust me)
set.seed(406)
# Get poll length in ms
frame <- 1000 / 60
# Set simulation count
n <- 1000000
# Simulate n wait times until the next polling interval
# (these don't really need to be different between controllers, but it helps intuition)
sim_1_poll <- runif(n, 0, frame)
sim_2_poll <- runif(n, 0, frame)
# Simulate n random delays from the controller distributions
# Both are scaled beta distributions (not important to know)
sim_1_delay <- rbeta(n, 2, 5) * 15 + 1
sim_2_delay <- rbeta(n, 16, 40) * 15 + 1
# Proportion of simulation 1 where time until polling was larger than delay
mean(sim_1_poll > sim_1_delay)
## [1] 0.682551
# Proportion of simulation 2 where time until polling was larger than delay
mean(sim_2_poll > sim_2_delay)
## [1] 0.683563
Despite having very different distributions, the SFP (the only thing observable to the player) is effectively identical for these two controllers (the small difference above is just simulation noise). This is why I strongly discourage reliance on standard deviation to determine how variable a controller’s lag is. It offers only a very poor estimate of what the end user’s experience will be, and we have a much better metric for this. Unfortunately, for any controller with possible latency values larger than 16.667 ms, an SFP estimate can’t be accurately calculated from just the mean (and the mean alone misses other relevant info, like the probability of being read 2 or more frames late). I’ve been working on tools to easily generate more robust statistics that I hope to publish soon. In the meantime, please avoid focusing on standard deviation and min/max values. The mean tells us ~90% of what we want to know and is probably more than sufficient in all relevant cases (if the max is less than 16.667 ms, then the SFP estimate on the spreadsheet is accurate and 1 - SFP is the probability of landing on the second frame).
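As an aside for anyone curious about the exact calculation: in the special case where a controller’s delay can never exceed one frame, the calculus collapses to a one-line formula, which is exactly why the mean alone pins down the SFP there. If the time until the next poll \(U\) is uniform on \([0, f]\) with \(f = \frac{1000}{60}\), and the delay \(D\) always satisfies \(D \le f\), then

\[ \text{SFP} = P(U > D) = E\!\left[\frac{f - D}{f}\right] = 1 - \frac{E[D]}{f} \]

Plugging in the mean shared by both controllers above matches the simulated values (up to simulation error):

# Analytic SFP from the mean alone
# Mean delay is E[Beta(2, 5)] * 15 + 1 = 2/7 * 15 + 1, and this is valid
# because both controllers' max delay is 16 ms, less than one frame
1 - ((2 / 7) * 15 + 1) / frame
## [1] 0.6828571

And to see why the shortcut fails once delays can spill past one frame, here’s a sketch with a made-up third controller (the wider scaling is purely hypothetical) that tallies how many frames late each press lands:

# Hypothetical controller with delays between 1 and 31 ms (it can miss a poll entirely)
sim_3_poll <- runif(n, 0, frame)
sim_3_delay <- rbeta(n, 2, 5) * 30 + 1
# 0 = read on the same frame, 1 = one frame late, and so on
frames_late <- pmax(0, ceiling((sim_3_delay - sim_3_poll) / frame))
# Estimated probability of each outcome; no single summary number captures this
table(frames_late) / n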
This assumption may break down in extreme cases. If you are able to execute frame-perfect tricks somewhat reliably or otherwise have sub-frame precision, these conclusions may not apply to you. However, any controller with a suitable average latency for this level of play is basically guaranteed to have a suitable standard deviation.