Video Production Stack Exchange is a question and answer site for engineers, producers, editors, and enthusiasts spanning the fields of video and media creation. It's 100% free, no registration required.

AFAIK, the bitrate is the number of bits per second. So let's say I have two videos, both using the same codec, bitrate settings and resolution. If the first one has a framerate of 25fps, but the second one runs at 50fps, does that mean that the second one will have a drastically lower quality, since the available bits for each second of the video have to be divided amongst twice as many frames?

Or is this effect diminished by motion compensation (e.g. by having more P-frames and fewer I-frames) or something like that?

2 Answers

If the result of increasing the frame rate is that more pixels are displayed per second, then yes, keeping the bit rate the same will almost certainly mean a loss in overall quality. Not all such losses are objectionable or even necessarily noticeable. For example, if the bit rate is 30 Mb/s and you reduce it to 15 Mb/s, probably not many people would notice.
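To put rough numbers on this, here is a toy calculation of the average bit budget per frame (the 8 Mb/s figure is just an assumed example, and real encoders don't split bits evenly across frames):

```python
# Toy arithmetic: at a fixed bitrate, doubling the frame rate
# halves the *average* bit budget available per frame.

def bits_per_frame(bitrate_bps: float, fps: float) -> float:
    """Average number of bits available for each frame."""
    return bitrate_bps / fps

BITRATE = 8_000_000  # 8 Mb/s -- an assumed example value

budget_25 = bits_per_frame(BITRATE, 25)  # 320,000 bits/frame
budget_50 = bits_per_frame(BITRATE, 50)  # 160,000 bits/frame

print(f"25 fps: {budget_25:,.0f} bits/frame")
print(f"50 fps: {budget_50:,.0f} bits/frame")
print(f"ratio:  {budget_25 / budget_50:.1f}x")
```

Of course, as the comments below note, this naive halving overstates the quality loss, because adjacent frames at 50 fps are more similar and therefore cheaper to encode.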

What do you mean by "if"? Do you not know for certain whether this is the case, or are there scenarios in which this rule does not apply? Of course you're right that a bitrate reduction will hardly be noticeable if you start from a high bitrate; in that case this is a somewhat theoretical question. However, it might make an actual difference when you have a low bitrate to begin with.
–
Gin-San Aug 24 '14 at 23:02


The only scenario where this wouldn't apply is if you decrease the resolution of your video in proportion to the increase in fps. In any other case, increasing the frame rate without increasing the bitrate will decrease the maximum information available for each frame, i.e. decrease quality. How much it decreases depends entirely on your source material and bitrate.
–
Professor Sparkles♦ Aug 24 '14 at 23:25


Codecs today are indeed smart, and you don't need to double the bitrate when you double the frame rate, but I recommend increasing it by at least about a third of the original value (provided the original value wasn't excessive already).
–
Professor Sparkles♦ Aug 24 '14 at 23:28


@Gin-San - My 'if' was to cover the case where you encode 50i vs 25p, for example. It might have been clearer to just say that whenever the pixel rate increases, you get lower quality from the same bit rate. As the Professor points out, the relationship isn't linear -- modern codecs deal well with the case where added frames are substantially similar.
–
Jim Mack Aug 24 '14 at 23:54

Alright, thanks for making that clear. Accepted!
–
Gin-San Aug 25 '14 at 0:53

This is a complicated question that doesn't have an exact answer. In general, yes, the quality will probably be lower but the higher frame rate is higher "quality" to begin with.

With video, you have to remember that temporal information is part of the quality. If you double the frame rate, the quality of each individual frame will go down, but you will see twice as many of them and the impact of things like noise can average out amongst the frames.

Additionally, modern compression works by comparing each frame to its neighboring frames. When you increase the frame rate, the amount of change between consecutive frames is reduced, so the amount of data needed to store that change is also reduced.
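This can be illustrated with a toy model of inter-frame residuals. It's a deliberate simplification (real codecs use motion-compensated prediction and entropy coding, not raw differences), but it shows how higher sampling rates shrink the per-frame change without increasing the total change per second:

```python
# Toy model: treat each "frame" as a single brightness value and
# count the frame-to-frame differences, a crude stand-in for the
# residual data an inter-frame codec would have to store.

def residual_cost(frames):
    """Sum of absolute frame-to-frame differences across a sequence."""
    return sum(abs(b - a) for a, b in zip(frames, frames[1:]))

# A one-second fade from black (0) to white (100), sampled at two rates.
fade_25 = [i * 4 for i in range(26)]  # 25 fps: brightness steps of 4
fade_50 = [i * 2 for i in range(51)]  # 50 fps: brightness steps of 2

total_25 = residual_cost(fade_25)     # 100 -- same total change...
total_50 = residual_cost(fade_50)     # 100 -- ...split across more frames
per_frame_25 = total_25 / 25          # 4.0 per frame
per_frame_50 = total_50 / 50          # 2.0 per frame
print(total_25, total_50, per_frame_25, per_frame_50)
```

The total residual over the second is identical; only the per-frame share halves, which is why doubling the frame rate doesn't come close to doubling the data a real codec needs.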

Purely theoretically, what matters most for quality is the amount of information being put in front of our eyes over time, so it may even be possible to make a higher frame rate version of a video that has higher overall quality at the same bitrate as a lower frame rate version.

With a hypothetical perfect encoding and compression system, the rate of meaningful information presented to our hypothetical perfect eyes per second is what determines quality. The problem is that such hypothetical systems don't exist. Our eyes are unpredictably lossy in terms of which information they pay attention to over time, so in some cases a higher frame rate provides a bigger boost in quality than in others. Compression and encoding aren't perfect either: some types of content require more effort to encode and compress, with more overhead; some produce noise that is best handled with a higher frame rate; and others are better served by a lower frame rate with more detail per frame.

Practically, the overhead of storing the parts of each frame that can't be compressed easily generally outweighs the advantages of the elevated frame rate, resulting in a loss of quality when you double the frame rate without increasing the bitrate. But that loss is nowhere near as significant as it would be if you simply cut the bitrate in half.

Additionally, the quality loss of the compressed version relative to the original is much higher for the higher frame rate, even though the relative quality between the lower and higher frame rate versions is much closer (since the lower frame rate video carried less temporal information to begin with).

You only need to increase the bitrate enough to make up for the overhead introduced by inefficiencies in encoding and compressing the extra frame data. As Professor Sparkles pointed out, somewhere around 1/3 extra bitrate is probably a decent starting point for figuring out what you need to maintain subjective quality, but it will vary a lot based on the video content you are encoding and the codecs and data rates being used.
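As a quick sketch of that rule of thumb (the 1/3 factor is this thread's heuristic, not a codec specification, so treat the result as a starting point for your own tests):

```python
# Rule-of-thumb from this thread: when doubling the frame rate,
# start by adding roughly a third to the bitrate instead of doubling it.
# The 1/3 factor is a heuristic, not a codec specification.

def suggested_bitrate(old_bitrate_bps: float, factor: float = 1 / 3) -> float:
    """Bitrate to try after a frame-rate increase, per the heuristic."""
    return old_bitrate_bps * (1 + factor)

# e.g. a 6 Mb/s 25 fps encode moving to 50 fps: try roughly 8 Mb/s.
print(f"{suggested_bitrate(6_000_000):,.0f} b/s")
```

From there, encode a short representative clip at a few bitrates around the suggestion and compare them by eye; the right value depends heavily on the content and codec.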