The best known and most important of these is the central limit theorem. It states that when a large number of independent random variables with the same distribution, and with a finite expected value and variance, are added together, their sum (or average) is approximately normally distributed.
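The statement above can be illustrated with a small simulation. The sketch below, which assumes nothing beyond the Python standard library, averages many uniform random numbers, standardizes each average, and checks that the results behave like a standard normal distribution, as the theorem predicts.

```python
import random
import statistics

random.seed(42)

n = 1000        # independent uniform(0, 1) variables per sum
trials = 5000   # how many standardized averages to collect

mu = 0.5                  # expected value of uniform(0, 1)
sigma = (1 / 12) ** 0.5   # standard deviation of uniform(0, 1)

# Standardize each sample mean: (mean - mu) / (sigma / sqrt(n)).
# The central limit theorem says these values are approximately N(0, 1).
z_values = []
for _ in range(trials):
    sample_mean = sum(random.random() for _ in range(n)) / n
    z_values.append((sample_mean - mu) / (sigma / n ** 0.5))

print(statistics.mean(z_values))   # should be close to 0
print(statistics.stdev(z_values))  # should be close to 1
```

The uniform distribution here is only an example; the theorem makes the same prediction for any distribution with finite expected value and variance.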

There are several generalisations of this theorem. Some of them no longer require all random variables to have the same distribution. In these generalisations, another precondition ensures that no single random variable has a much bigger influence on the outcome than the others. Examples are the Lindeberg and Lyapunov conditions.

The name of the theorem comes from a paper George Pólya wrote in 1920, About the Central Limit Theorem in Probability Theory and the Moment Problem.[1]