Smoothing Probability Distributions and Smooth Entropy

Christian Cachin and Ueli Maurer

We introduce smooth entropy as a measure for the number of almost uniform random bits that can be extracted from a source by probabilistic algorithms. The extraction process should be universal in the sense that it does not require the distribution of the source to be known. Rather, it should work for all sources with a certain structural property, such as a bound on the maximal probability of any value. The concept of smooth entropy unifies previous work on privacy amplification and entropy smoothing in pseudorandom generation. It enables us to systematically investigate the spoiling knowledge proof technique to obtain lower bounds on smooth entropy and to show new connections to Rényi entropy of order $\alpha > 1$.
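The structural property mentioned above, a bound on the maximal probability of any value, is exactly a lower bound on min-entropy, and Rényi entropy of order $\alpha$ interpolates between Shannon entropy ($\alpha \to 1$) and min-entropy ($\alpha \to \infty$). A minimal sketch of these quantities (the function names are ours, for illustration only, not from the paper):

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy of order alpha (alpha != 1), in bits:
    H_alpha(P) = (1 / (1 - alpha)) * log2(sum_x P(x)^alpha)."""
    assert alpha != 1, "order 1 is the Shannon-entropy limit"
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

def min_entropy(probs):
    """Min-entropy: -log2 of the maximal probability of any value."""
    return -math.log2(max(probs))

# For a uniform distribution all orders coincide; for a skewed one,
# H_alpha decreases in alpha toward the min-entropy.
uniform = [1 / 8] * 8
skewed = [0.5, 0.25, 0.125, 0.125]
print(renyi_entropy(uniform, 2), min_entropy(uniform))
print(renyi_entropy(skewed, 2), renyi_entropy(skewed, 3), min_entropy(skewed))
```

On the skewed example the printed values decrease as the order grows, illustrating why a bound on the maximal probability (min-entropy) is the weakest, and hence most widely applicable, of these entropy bounds.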