It looks very simple, yet I haven't found a way until now. Thanks for the link; I will take a look.
–
GarouDanOct 22 '11 at 23:05

2

Just be aware that the law of total probability $P(A) = \sum_i P(A\mid B_i)P(B_i)$ assumes the $B_i$ are pairwise disjoint and that $A \subset \cup_i B_i \subset \Omega$, where $\Omega$ is the entire sample space. In practice, people usually just assume $\cup_i B_i = \Omega$.
–
Dilip SarwateOct 23 '11 at 0:57

@SrivatsanNarayanan, if one of you answers this question, I will accept it. Someone else could too. I just don't want to post my own answer.
–
GarouDanOct 28 '11 at 12:50

@DilipSarwate if one of you answers this question, I will accept it. Someone else could too. I just don't want to post my own answer.
–
GarouDanOct 28 '11 at 12:50

1 Answer
1

The formula you quote seems to be just the law of total probability. Assume that the set of events $\{ H_\eta \}_{1 \leq \eta \leq \mathbb H}$ forms a partition of the sample space $\Omega$; i.e., the $H_\eta$'s are pairwise disjoint, and $\bigcup \limits_{\eta = 1}^{\mathbb H} H_\eta = \Omega$.

Now for any event $E_e$, the set of events $\{ E_e \cap H_\eta \}$ forms a partition of $E_e$. Therefore, by additivity, we have
$$
P(E_e) = \sum_{\eta = 1}^{\mathbb H} P(E_e \cap H_\eta). \tag{1}
$$
Now, by the definition of conditional probability, we have $P(E_e \cap H_\eta) = P(H_\eta) \cdot P(E_e \mid H_\eta)$. Plugging this into $(1)$, we get the claim.*

The following sentence, taken from the Wikipedia article, explains what this theorem means intuitively (notation changed to match ours):

The summation can be interpreted as a weighted average, and consequently the marginal probability, $P(E_e)$, is sometimes called "average probability"; "overall probability" is sometimes used in less formal writings.
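This weighted-average reading is easy to check numerically. A minimal sketch, with made-up probabilities chosen purely for illustration:

```python
# Illustrative check of the law of total probability (numbers are made up).
# The H_eta form a partition, so the P(H_eta) sum to 1.
p_H = [0.5, 0.3, 0.2]          # P(H_eta) for eta = 1, 2, 3
p_E_given_H = [0.1, 0.4, 0.9]  # P(E_e | H_eta), arbitrary conditionals

# P(E_e) = sum over eta of P(E_e | H_eta) * P(H_eta):
# a weighted average of the conditionals, weighted by P(H_eta).
p_E = sum(pe * ph for pe, ph in zip(p_E_given_H, p_H))
print(p_E)  # 0.35
```

Note that the result, $0.35$, lies between the smallest and largest conditional probabilities ($0.1$ and $0.9$), as any weighted average must.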

*The formula holds even when $\{ H_\eta \}_{\eta \geq 1}$ is a countably infinite partition of $\Omega$; the proof needs only a slight modification (countable additivity in place of finite additivity).
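The countably infinite case can also be sanity-checked numerically. A sketch with an assumed partition $P(H_n) = 2^{-n}$ (which sums to $1$) and made-up conditionals $P(E \mid H_n) = 1/n$; the series $\sum_{n \geq 1} 2^{-n}/n$ is known to equal $\ln 2$:

```python
import math

# Countably infinite partition: P(H_n) = (1/2)^n for n >= 1, summing to 1.
# Assumed conditionals (for illustration): P(E | H_n) = 1/n.
# Then P(E) = sum_{n>=1} (1/2)^n / n = -ln(1 - 1/2) = ln 2.
p_E = sum((0.5 ** n) / n for n in range(1, 101))  # truncated at 100 terms

print(p_E)          # ~ 0.693147..., i.e. ln 2
print(math.log(2))
```

The tail of the truncated series is bounded by $2^{-100}$, so 100 terms already agree with $\ln 2$ to machine precision.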