Sunday, July 20, 2014

A few days ago, my friend Javier posted something about the Monty Hall Problem on Facebook. This is a well-known riddle in which the use of probabilities saves the day.

In this conundrum, we are to choose one of three doors to win a prize. Typically, the prizes are taken to be a car and two goats. After we make our choice, the game host opens one of the other two doors (revealing a goat) and asks whether we want to switch or stay with our initial door.

Naively, we might think that after the host opens one of the three doors, there is a 50% chance of winning the car with either of the two remaining doors, so it makes no difference whether we switch or not.

A more detailed analysis (e.g. in the video) shows that switching doors is the better option. This is usually demonstrated with probabilities, but in fact switching is also the common-sense thing to do.
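If you do not trust the probabilistic argument, you can check it empirically. Here is a minimal Python sketch (function names are my own) that simulates many rounds of the game under both strategies:

```python
import random

def play(switch, trials=100_000):
    """Simulate the Monty Hall game and return the fraction of wins."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # door hiding the car
        pick = random.randrange(3)   # our initial choice
        # The host opens a door that is neither our pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining closed door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(play(switch=True))   # close to 2/3
print(play(switch=False))  # close to 1/3
```

Staying wins only when the initial pick was the car (probability 1/3), so switching wins the remaining 2/3 of the time, and the simulation reflects exactly that.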

It all relies on the concept of information. When we choose the first door, we have no information about the distribution of the prizes. After the host opens one of the doors, we do have information about that distribution. Taking no action would mean neglecting, or in other words wasting, that information.

One way to think of information is that it reduces the uncertainty of a system. At first, we are completely uncertain about the distribution of the prizes. When the host opens a door, that uncertainty is reduced, because we now know something about it. Hence, taking no action fails to take advantage of this fact.

This is an example of Shannon's entropy. In some sense, entropy is a measure of the uncertainty or disorder of a system. In this case, the system is the distribution of the prizes behind the doors. When one of the doors is revealed, the entropy decreases. By analogy with thermodynamics, where entropy measures the amount of energy that becomes unavailable to do work, decreasing the entropy means that more energy can potentially be used to do work.
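We can even put a number on how much the uncertainty drops. Before anything is revealed, the car is equally likely behind each of the three doors; after the host opens a door, the car is behind our door with probability 1/3 and behind the other closed door with probability 2/3. A small Python sketch of Shannon's entropy formula makes the decrease explicit:

```python
from math import log2

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Before the reveal: three equally likely doors.
before = entropy([1/3, 1/3, 1/3])   # log2(3), about 1.585 bits
# After the reveal: our door (1/3) versus the other closed door (2/3).
after = entropy([1/3, 2/3])         # about 0.918 bits

print(before, after)
```

The host's action removes roughly 0.67 bits of uncertainty, and switching is precisely the action that exploits the now-uneven distribution.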

Similarly, in information theory, we can think of a decrease in Shannon's entropy as the potential to do something in the system: to take that information and put it to use toward a goal. Taking no action in our Monty Hall Problem means that we are wasting resources and passing up the opportunity to use this potential.

Switching doors is the common-sense thing to do: naively, knowledge is power, and that extra knowledge is the ability to do something.

Similarly, data or statistical information without any reaction is a complete waste of resources. Having data or statistics and doing nothing with them beyond displaying them is like having no information at all. In fact, it is much worse, as you let go of the opportunity to do something with that information.