I am studying Randomized Algorithms chapter in the book "Introduction to Algorithms" by Cormen et al.

In this chapter the book introduces the concept of an indicator random variable and states that the expected value of an indicator random variable of an event $A$ is:

$$E[I_A] = \Pr\{A\}$$

I am having difficulty understanding why this is called an indicator random variable — specifically, why "indicator" and why "random" — and how this concept is useful in analyzing algorithm running times. It has been some time since I studied probability in school; however, I am aware of the concepts behind probability, so you can base your answer on this premise.

All this is saying is that the expected value of an indicator random variable of an event is equal to the probability of that event. We already have the concept of probability, so why should we learn about this new concept, which happens to have the same value as the probability?

2 Answers

As the name implies, an indicator random variable indicates something: the value of $I_A$ is $1$ precisely when the event $A$ occurs, and is $0$ when $A$ does not occur (that is, when $A^c$ occurs). Think of $I_A$ as a Boolean variable that indicates the occurrence of the event $A$. This Boolean variable has value $1$ with probability $P(A)$, and so its average value is $P(A)$. In terms of long-term frequencies, $I_A$ will have value $1$ on roughly $N\cdot P(A)$ of $N$ trials of the experiment, and the long-term average value of $I_A$ over these $N$ trials will be approximately $P(A)$.
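The long-run frequency interpretation above can be checked with a short simulation. A minimal sketch (the die event and all names here are my own illustrative choices, not from the answer):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Hypothetical event A: a fair six-sided die shows a 6, so P(A) = 1/6.
N = 10**6

# Sum of I_A over N trials: I_A is 1 when A occurs, 0 otherwise.
indicator_sum = sum(1 for _ in range(N) if random.randint(1, 6) == 6)

# The long-run average of I_A should be close to P(A).
average = indicator_sum / N
print(f"observed average of I_A: {average:.4f}")
print(f"P(A)                   : {1/6:.4f}")
```

With a million trials the observed average lands within a fraction of a percent of $1/6$, matching the claim that the average of $I_A$ approaches $P(A)$.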

Random because you cannot be sure whether the next time you check $I_A$, the variable will have value $1$ or value $0$, but you can be reasonably sure that over the next $10^6$ observations of $I_A$, the observed value of $I_A$ will be $1$ roughly $10^6P(A)$ times.
– Dilip Sarwate, Aug 7 '12 at 15:46

Therefore, the expectation is essentially the expected value of a Bernoulli random variable: the value $1$ times the probability that $A$ occurs, plus the value $0$ times the probability that it does not, which is exactly $P(A)$.
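As for why this is useful in algorithm analysis: the payoff comes from summing indicator variables and applying linearity of expectation, which holds even when the indicators are dependent. A small sketch under my own assumptions ($n$ coin flips, each heads with probability $p$; none of these names come from the original text):

```python
import random

random.seed(1)
n, p = 100, 0.5  # n independent coin flips, each heads with probability p

# Let X = X_1 + ... + X_n, where X_i indicates "flip i is heads".
# By linearity of expectation, E[X] = sum of E[X_i] = n * p,
# without ever computing the full distribution of X.
expected = n * p

# Sanity-check against a simulated average of X over many trials.
trials = 10**4
average = sum(
    sum(1 for _ in range(n) if random.random() < p)  # one observed value of X
    for _ in range(trials)
) / trials

print(f"E[X] via indicators: {expected}")
print(f"simulated average  : {average:.2f}")
```

This decomposition into per-event indicators is the pattern CLRS uses to analyze expected running times: count a quantity as a sum of indicators, then take expectations term by term.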