The variance is the average of the squared deviations from the mean,
i.e., var = mean(abs(x - x.mean())**2).

The mean is normally calculated as x.sum()/N, where N = len(x).
If, however, ddof is specified, the divisor N - ddof is used
instead. In standard statistical practice, ddof=1 provides an
unbiased estimator of the variance of a hypothetical infinite
population. ddof=0 provides a maximum likelihood estimate of the
variance for normally distributed variables.
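The effect of ddof can be sketched with NumPy directly; the array values below are arbitrary illustration data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
N = len(x)
mean = x.sum() / N

# ddof=0: divisor N (population / maximum likelihood estimate)
pop_var = ((x - mean) ** 2).sum() / N

# ddof=1: divisor N - 1 (unbiased sample estimator)
sample_var = ((x - mean) ** 2).sum() / (N - 1)

print(pop_var, np.var(x))             # 1.25 both ways
print(sample_var, np.var(x, ddof=1))  # 5/3 ≈ 1.6667 both ways
```

Note that the sample variance is always at least as large as the population variance, since the divisor N - ddof shrinks as ddof grows.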

Note that for complex numbers, the absolute value is taken before
squaring, so that the result is always real and nonnegative.
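A small check of the complex case, using illustrative values chosen so the deviations are purely imaginary:

```python
import numpy as np

z = np.array([1 + 1j, 1 - 1j])
# mean is 1+0j; deviations are +1j and -1j.
# abs() is applied before squaring, so each contributes
# abs(±1j)**2 = 1.0, and the variance is real and nonnegative.
print(np.var(z))  # 1.0
```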