I seem to remember reading somewhere that a game's frame rate is halved each time it misses a vsync. In other words, the number of frames per second decreases to 1/2, 1/4, and 1/8 of the normal rate for the first, second, and third missed vsync cycles (respectively) rather than to 1/2, 1/3, and 1/4 of the normal rate.

As a quick note, keep in mind that this occurs only for double-buffered, vsynced rendering. Non-vsynced rendering simply results in screen tearing, while triple-buffered rendering is capable of arbitrary framerates up to the output device's maximum.
–
ZorbaTHut, Feb 5 '12 at 1:36

3 Answers

It does not get halved for each missed vsync. Framerates go as the refresh rate of the screen divided by the number of vsync periods you need to get a new frame out, so if the refresh rate is 60 Hz (typical for TVs), then your framerate can be 60, 30, 20, 15, 12, 10, about 8.6, 7.5, and so on (60/n for whole n).
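That series can be sketched in a few lines of Python (the function name and signature here are mine, not from the answer):

```python
import math

def effective_fps(refresh_hz, render_ms):
    """Effective framerate of double-buffered, vsynced rendering when a
    frame takes render_ms to draw: the swap must wait for a whole number
    of vsync periods, so the rate is refresh_hz divided by that number."""
    period_ms = 1000.0 / refresh_hz                   # ~16.67 ms at 60 Hz
    periods = max(1, math.ceil(render_ms / period_ms))
    return refresh_hz / periods

print(effective_fps(60, 10))   # 60.0 -- fits within one period
print(effective_fps(60, 17))   # 30.0 -- misses one vsync
print(effective_fps(60, 35))   # 20.0 -- needs three whole periods
```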

Yes, this makes perfect sense now. I don't know where I got the idea that the effective frame rate would be halved each time: probably just because it is halved the first time.
–
Chris Frederick, Feb 5 '12 at 6:00

What vsync does is synchronize the buffer swap with your monitor's vertical refresh rate, so if you miss a refresh, the swap must wait until the next one. Say you're refreshing at 16 ms intervals (just to keep things simple for the purposes of this explanation) but a frame needs 17 ms to draw: it will miss the sync interval. Because the buffer swap is locked to the refresh rate, it can't take place at the 17 ms mark; it needs to wait until the next interval, at 32 ms. Likewise, if a frame needs 33 ms, it will miss the intervals at 16 ms and 32 ms and will need to wait until the 48 ms interval.
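The wait-for-the-next-interval behavior can be sketched like this (a simplification using the 16 ms interval from above; the function name is mine):

```python
import math

def swap_time_ms(render_ms, interval_ms=16):
    """Time at which the buffer swap can happen: the first refresh
    interval at or after the frame finishes rendering (assumes
    double-buffered, vsynced rendering at a fixed 16 ms interval)."""
    return math.ceil(render_ms / interval_ms) * interval_ms

print(swap_time_ms(17))  # 32 -- missed the interval at 16 ms
print(swap_time_ms(33))  # 48 -- missed the intervals at 16 ms and 32 ms
```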

Framerate is calculated by taking the duration of one or more past frames, averaging them, and dividing one second by the result to get the number of frames that would be drawn in a second assuming each would take as long as the average.

avgTimeMS = (totalTimeMS / numFrames)
fps = 1000 / avgTimeMS

This would yield the average number of frames per second over the last $numFrames frames. If $numFrames is 1, you get the effective frame rate of the last frame. If $numFrames is 60, you'll get the averaged frame rate over the last 60 frames (about 1 second at 60fps).
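As a sketch, the pseudocode above translates directly into Python (the function and parameter names here are mine):

```python
def fps(frame_times_ms):
    """Average framerate over the given frame durations in milliseconds,
    following the avgTimeMS / fps pseudocode above."""
    total_time_ms = sum(frame_times_ms)
    num_frames = len(frame_times_ms)
    avg_time_ms = total_time_ms / num_frames
    return 1000.0 / avg_time_ms

print(fps([16.666] * 60))  # ~60: one second's worth of steady frames
```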

Assume we happen to have a steady framerate of 16.666 ms per frame and that we're averaging the last 10 frames, the last of which took twice as long (2 × 16.666 ≈ 33.333 ms) because it missed a vsync. We would have:

totalTimeMS = (9 * 16.666) + (1 * 33.333)
numFrames = 10
fps ≈ 54.55

Now assume our numFrames is 1, and we skip a frame:

totalTimeMS = 1 * 33.333
numFrames = 1
fps = 30

Your figures, then, are correct only when $numFrames is 1, since 30 is half of 60 (which I'm assuming is your refresh rate).

So the perceived drop in frame rate due to one long frame is directly related to the number of frames you take in your average. When a frame misses a vsync, it adds extra time to $totalTimeMS without adding any extra frames to $numFrames.
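The two scenarios above can be checked numerically (a quick sketch using the same figures):

```python
# Ten frames averaged, the last of which missed a vsync:
frames = [16.666] * 9 + [33.333]
print(len(frames) * 1000.0 / sum(frames))  # ~54.5 fps over ten frames

# numFrames = 1, looking only at the frame that missed the vsync:
print(1000.0 / 33.333)                     # ~30 fps, half the refresh rate
```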