The standard deviation is a measure that is used to quantify the amount of variation or dispersion
of a set of data values.

A low standard deviation indicates that the data points tend to be close to the mean (also called the
expected value) of the set, while a high standard deviation indicates that the data points are spread
out over a wider range of values.

The coefficient of variation (cv), also known as the relative standard deviation (rsd),
is a standardized measure of the dispersion of a probability distribution or frequency distribution.

It is defined as the ratio of the standard deviation to the mean.

In simple terms, it expresses the variation as a percentage of the mean. So, if the average value of a metric is 1000
and its standard deviation is 100 (meaning values typically deviate by about 100, i.e. roughly between 900 and 1100), then its cv is 10%.
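This can be sketched with Python's standard `statistics` module (the function name and sample values are illustrative, not from the original text):

```python
import statistics

def coefficient_of_variation(values):
    """cv as a percentage: (standard deviation / mean) * 100."""
    mean = statistics.mean(values)
    return statistics.stdev(values) / mean * 100

# Hypothetical samples of a metric averaging around 1000.
samples = [900, 1000, 1100, 950, 1050]
print(f"cv = {coefficient_of_variation(samples):.1f}%")
```

Note this uses the sample standard deviation (`stdev`); use `statistics.pstdev` instead if the values are the whole population rather than a sample.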

This is an easy way to check the percentage of variation without dealing with absolute values.

For example, you may trigger an alarm if the cv of your web server's requests/sec is above 20 (%)
over the last minute. So if your web server was serving 1000 reqs/sec on average over the last minute,
the alarm will trigger if the rate had spikes roughly below 800/sec or above 1200/sec.
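The alarm described above could be sketched as follows (the function name, threshold parameter, and sample traffic values are all assumptions for illustration):

```python
import statistics

def cv_alarm(samples_last_minute, threshold_pct=20):
    """Return True when the coefficient of variation (in %)
    of the samples exceeds the given threshold (20% by default)."""
    mean = statistics.mean(samples_last_minute)
    cv = statistics.stdev(samples_last_minute) / mean * 100
    return cv > threshold_pct

# Steady traffic around 1000 reqs/sec: low cv, no alarm.
steady = [990, 1000, 1010, 1005, 995]
# Spiky traffic swinging between ~700 and ~1300 reqs/sec: high cv, alarm.
spiky = [700, 1300, 750, 1250, 1000]
print(cv_alarm(steady), cv_alarm(spiky))
```

Because the threshold is relative to the mean, the same 20% rule works whether the server handles 1000 reqs/sec or 100000 reqs/sec.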