Belief in The Law of Averages is particularly prevalent in games of chance, lotteries, and most other forms of gambling*. In all of these cases it is entirely false, because the outcome of previous games has no bearing on the next. (Assuming a straight game.)

Intuition suggests that, because double six has not been thrown in the last 200 throws of the dice, it must have a high probability of coming up next. This is false. In reality, it has the same probability as on any other throw, namely 1 in 36. There is no causal link between previous throws and the next. History does not come into it.
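The independence claim is easy to check by brute force. Here is a rough Python sketch (the seed and sample size are my own arbitrary choices): it estimates the chance of a double six across all throws, and across only those throws that immediately follow a miss. It is the same principle as the 200-throw drought, just conditioned on a shorter history.

```python
import random

random.seed(1)

def throw():
    # One throw of two fair dice; True on a double six.
    a, b = random.randint(1, 6), random.randint(1, 6)
    return a == 6 and b == 6

throws = [throw() for _ in range(500_000)]

# Chance of a double six over all throws...
overall = sum(throws) / len(throws)
# ...and over only those throws immediately following a miss.
after_miss = [b for a, b in zip(throws, throws[1:]) if not a]
p_after_miss = sum(after_miss) / len(after_miss)

print(f"1/36 = {1/36:.4f}, overall = {overall:.4f}, after a miss = {p_after_miss:.4f}")
```

Both estimates come out within a fraction of a percent of 1/36: the dice do not care what happened before.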

The Law of Averages is a fine example of the immense gulf that may exist between intuition or common sense, and reality. Just because something seems right, it does not necessarily follow that it is right.

* An exception is horse racing, in which prior races obviously play a role in determining outcome, and which is considered by the cognoscenti as a science rather than a game. But punters have numerous ways of turning even horse racing into a game of chance, in order that they may apply The Law of Averages -- e.g. by always betting on number 2.

Well, pardon me, Azure Monk, but I think you are taking The Law of Averages in a different sense to that which I intended. To take your example as illustration:

If a person flips a coin ten times and it comes up heads ten times, then that person's intuition might lead him to say, "OK, by the law of averages, tails should come up next". In other words, it feels like it should be tails' turn. But as we both know, the chance of tails coming up next is precisely 1 in 2.

So my point is that, taken as a basis for predicting the next outcome, The Law of Averages is completely bogus.

The Law of Averages, sometimes referred to as the Maturity of Chances, is an invalid mathematical theory: the idea that the more often something happens, the less likely it is to happen again, and the longer something doesn't happen, the more likely it is to happen soon. This is pure hogwash. Contrary to its name, it is not a mathematical law, and it is in direct opposition to probability theory, which is sound mathematics.

Under the Law of Averages, if I were to flip a quarter 75 times, and it came up Heads the first 74 times, the coin should come up Tails the next time. To hear someone who believes in the Law of Averages talk, it would be inconceivable for it to come up Heads again! However, the probability remains the same as it was for the first 74 times: 50%.

You see, a coin doesn't have any memory. Neither do dice, or a roulette wheel, or any other gambling implement or situation. Each flip of the coin, or throw of the dice, or spin of the wheel, is independent of every previous flip, throw, or spin. But people continue to get locked into the confused way of thinking that since it came up Heads so many times, then, dammit, it's due for a Tails!

As an experiment, flip a coin 1000 times, and keep track of the number of Heads results. The total should end up somewhere near 500, but not necessarily close: I'd be greatly surprised to see it come up 500 exactly, and it could be heavily skewed in either direction. It may take ten thousand flips to get to "even." Or a million. Or it may well never achieve an exact balance.
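The 1000-flip experiment is much quicker in code than by hand. A minimal Python sketch (the seed is an arbitrary choice):

```python
import random

random.seed(0)

# Flip a fair coin 1000 times and count the Heads results.
heads = sum(random.random() < 0.5 for _ in range(1000))
print(f"heads in 1000 flips: {heads}")  # somewhere near 500, rarely exactly 500
```

Run it a few times with different seeds and you will see the count wander on either side of 500 without ever being obliged to land there.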

Part of the problem that leads to this train of thought is a confusion between future odds and odds viewed from the middle of a sequence. The odds of rolling a 6 three times in a row on a standard 6-sided die are one in 216 (6 x 6 x 6). But if a six has already come up twice, the odds of it coming up a third time are still one in six: the previous two rolls are irrelevant to determining the third.
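Both figures can be confirmed by simulation. A rough Python sketch (sample size and seed are arbitrary) that counts overlapping three-roll windows:

```python
import random

random.seed(2)
rolls = [random.randint(1, 6) for _ in range(600_000)]
windows = list(zip(rolls, rolls[1:], rolls[2:]))

# Before any roll: the chance of three sixes in a row is (1/6)^3 = 1/216.
joint = sum(w == (6, 6, 6) for w in windows) / len(windows)

# After two sixes have already appeared, the third roll is still 1 in 6.
thirds = [c for a, b, c in windows if a == 6 and b == 6]
conditional = sum(c == 6 for c in thirds) / len(thirds)

print(f"three sixes in a row: {joint:.5f}   (1/216 = {1/216:.5f})")
print(f"third six given two:  {conditional:.3f}   (1/6 = {1/6:.3f})")
```

The one-in-216 figure applies before you roll anything; once two sixes are already on the table, only the one-in-six figure matters.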

What is the probability of tossing a fair coin, and it landing on "heads" one, two, five, ten, 50 times in a row? One in two, one in four, one in 32, one in 1024, and one in 1.13 x 10^15, approximately.

What is the probability of rolling a fair die, and it landing on "1" one, two, five, ten times in a row? One in six, one in thirty-six, one in 7,776, and one in 60,466,176.
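The arithmetic above is just repeated multiplication, as a few lines of Python will confirm:

```python
# Chance of an unbroken run: 1 in 2**n for a fair coin, 1 in 6**n for a fair die.
for n in (1, 2, 5, 10, 50):
    print(f"{n:2d} heads in a row: 1 in {2**n:,}")
for n in (1, 2, 5, 10):
    print(f"{n:2d} ones in a row:  1 in {6**n:,}")
```

Fifty heads in a row works out to 1 in 1,125,899,906,842,624, which is where the 1.13 x 10^15 figure comes from.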

This is why the Law of Large Numbers (with which the Law of Averages is often conflated) works. It works because the probability of such events happening many times in a row is vanishingly small. Not zero, mind you: some day, a long run of rolls of a certain die will turn up nothing but sixes. But the world compensates by allowing all the other possibilities their long runs as well, and funnily enough, they all come out to be about equal.

How many heads am I going to get if I toss a coin ten thousand times? I estimate 5,008. Why? Because it is exceptionally close to the average, and, unfortunately, no coin is ever quite fair; most are slightly weighted towards the tails side, so a few more heads will turn up. But only a few.

Try out the law of large numbers yourself. You can either take a day off work and spend a few mind-numbing hours rolling dice or tossing coins... or you can do it a quick (though by no means preferred) way. Set up a spreadsheet, say in MS Excel, and do the following:

In cell B1, type in =AVERAGE(A:A).

In cell A1, type in =RAND(). This generates a random number between 0 and 1, to about fifteen digits or so. Now copy cell A1 down the column, cell by cell, watching the average in B1 as the column grows.

With only cell A1 filled, B1 will show a value anywhere between 0 and 1. By the time you get to cell A50, it should be somewhere between 0.45 and 0.55, and by cell A5000, it should read 0.500-something or 0.499-something. I may be out a bit, not having tried this experiment often enough to get typical numbers.

The Law of Averages is at work. Although random numbers on a computer are in fact only pseudorandom, they are close enough to random that this experiment will give you very similar results to tossing a coin: namely, probabilities a shade over or under 0.5, or one in two.
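For anyone without Excel to hand, the same experiment can be sketched in a few lines of Python; random.random() plays the role of RAND() here, and the checkpoints mirror the cells mentioned above (seed chosen arbitrarily):

```python
import random

random.seed(3)

# Running average of pseudorandom numbers, printed at the cell counts above.
total = 0.0
for i in range(1, 5001):
    total += random.random()
    if i in (1, 50, 5000):
        print(f"average of the first {i} numbers: {total / i:.4f}")

average = total / 5000  # close to 0.5 by the time the column is long
```

The first printed value could be anything between 0 and 1; by the 5000th it has settled very near 0.5.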

By the way, if someone throws seven heads in a row, don't be suspicious. Yet. They have one chance in 128 of pulling it off. Only check the coin after about twelve in a row. Why? They now have one chance in 4096. That, for me, is enough to think that someone's lead-plated one half of the coin. The Law of Averages says that they should have thrown between about four and eight tails by now.
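The figures above are straightforward to verify, including the "four to eight tails" claim, which turns out to cover about 85% of twelve-flip sequences:

```python
from math import comb

# A fair coin's chance of an unbroken run of heads:
print(f"7 in a row:  1 in {2 ** 7}")    # 1 in 128
print(f"12 in a row: 1 in {2 ** 12}")   # 1 in 4096

# Chance that 12 fair flips contain between four and eight tails:
p = sum(comb(12, k) for k in range(4, 9)) / 2 ** 12
print(f"P(4 to 8 tails in 12 flips) = {p:.3f}")  # about 0.854
```

So a fair coin lands in that four-to-eight band roughly six times out of seven, which is why twelve straight heads is a reasonable point to start inspecting the coin.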

The Law of Averages and the Law of Large Numbers are two different mathematical concepts. At first glance they appear to say the same thing, but the difference, while subtle, is of gigantic significance.

Suppose you flip a coin 10 times and it lands on heads every time.

The Law of Averages states that you are suddenly more likely to start flipping tails. The universe will magically detect this earlier "anomaly", this glut of extra heads, and go, "Hey! We have too many heads here, and they must be balanced out!" Then, the universe will alter probability itself in order to make your coin more likely to land tails, so as to correct the imbalance.

On this view, after you toss the coin another 9990 times, you will end up with roughly 5000 heads and 5000 tails; the extra 10 heads at the start have no effect.

Clearly, this is impossible. Each flip is a distinct trial, with a 50% chance of heads and 50% chance of tails. The odds do not change.

The Law of Large Numbers states - in unambiguous mathematical terms - that, if you continue to toss your coin, the universe will NOT try to balance out the 10-heads "anomaly". Rather, the new results will make that anomaly insignificant.

From the remaining 9990 coin tosses, you can expect on average 4995 heads and 4995 tails. The final tally, then, averages 5005 heads and 4995 tails. Not 5000 of each.

The extra 10 heads at the start do still have an effect. The Law of Large Numbers simply states that as time goes on, this effect becomes less and less significant. Clearly, a 5005:4995 ratio is approximately 5000:5000. But that tiny skew will never go away completely. It will simply become undetectable.
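The 5005:4995 arithmetic can be checked by simulation. A rough Python sketch (the run count and seed are arbitrary choices): each run starts from the 10-head anomaly and then adds 9,990 fair flips.

```python
import random

random.seed(4)

def final_heads():
    # Start from the 10-head "anomaly", then add 9,990 fair flips.
    heads = 10
    for _ in range(9_990):
        heads += random.random() < 0.5
    return heads

runs = [final_heads() for _ in range(1_000)]
mean_heads = sum(runs) / len(runs)
print(f"mean heads over 1,000 runs: {mean_heads:.1f}")           # near 5005, not 5000
print(f"mean proportion of heads:   {mean_heads / 10_000:.4f}")  # still about 0.5
```

The average count stays shifted by the original 10 heads, while the proportion is indistinguishable from one half: exactly the distinction drawn above.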