The joint probability distribution of two random variables is a function describing the probability of pairs of values occurring. For instance, consider a random variable \(X\) that represents the number of heads in a single coin flip, and a random variable \(Y\) that represents the number of heads in a different single coin flip. Then (assuming both coins are fair and the flips are independent) the joint distribution of \(X\) and \(Y\) is

\[\text{Pr}[X = x, Y = y] = \frac{1}{4} \quad \text{for each } (x, y) \in \{0, 1\} \times \{0, 1\},\]

since each flip independently shows heads \((1)\) or tails \((0)\) with probability \(\frac{1}{2}\).
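As a minimal sketch (assuming two independent fair coins, as in the example above), the joint table can be built by multiplying the marginals:

```python
from itertools import product

# Marginal distributions of two independent fair coin flips
# (number of heads in one flip: 0 or 1, each with probability 1/2).
p_x = {0: 0.5, 1: 0.5}
p_y = {0: 0.5, 1: 0.5}

# Joint distribution: Pr[X = x, Y = y] = Pr(X = x) * Pr(Y = y),
# which holds here because the flips are independent.
joint = {(x, y): p_x[x] * p_y[y] for x, y in product(p_x, p_y)}

print(joint)  # each of the four pairs has probability 0.25
```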
Formal definition

The joint distribution of discrete random variables \(X\) and \(Y\) is

\[\text{Pr}[X = x, Y = y] = \text{Pr}(X = x)P(Y = y \mid X = x),\]

where \(P(Y = y \mid X = x)\) is the probability that \(Y = y\) given that \(X = x\). When \(X\) and \(Y\) are independent, this value is equal to \(P(Y = y)\) (as the fact that \(X = x\) is irrelevant), and

\[\text{Pr}[X = x, Y = y] = \text{Pr}(X=x)\text{Pr}(Y=y)\]

This also allows the joint distribution to be used as an independence test: if the above relation holds for all \(x,y\), then \(X\) and \(Y\) are independent. Otherwise, they are not.
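This test can be sketched in a few lines (the function name is illustrative, not from any library):

```python
def is_independent(joint, tol=1e-12):
    """Check whether a joint distribution factors into its marginals."""
    # Marginals: Pr[X = x] = sum over y of Pr[X = x, Y = y], and symmetrically for Y.
    p_x, p_y = {}, {}
    for (x, y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    # Independent iff Pr[X = x, Y = y] = Pr[X = x] * Pr[Y = y] for every pair.
    return all(abs(p - p_x[x] * p_y[y]) <= tol for (x, y), p in joint.items())

# Two independent fair coins: every pair has probability 1/4.
print(is_independent({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))  # True

# Perfectly correlated coins: only matching pairs occur.
print(is_independent({(0, 0): 0.5, (1, 1): 0.5}))  # False
```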

Notably, the joint distribution can be factored in either order, as \(\text{Pr}(X = x)P(Y = y \mid X = x)\) or as \(\text{Pr}(Y = y)P(X = x \mid Y = y)\); equating the two immediately gives Bayes' theorem, which is central to conditional probability.

Suppose that the probability that a random person has a certain disease is \(0.005.\) A scientist develops a device which tests a person positive for the disease with \(95\)% chance when the person really has the disease. However, the same device tests a person positive for the disease with \(1\)% chance when the person in fact does not have the disease. What is the probability that a person really has the disease when tested positive by the device?

- \(0.75\)
- \(\frac{95}{294}\)
- \(0.95\)
- \(\frac{1}{3}\)
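The problem above is a direct application of Bayes' theorem; as a sketch, using exact rational arithmetic:

```python
from fractions import Fraction

# Prior probability that a random person has the disease.
p_disease = Fraction(5, 1000)            # 0.005

# Test characteristics.
p_pos_given_disease = Fraction(95, 100)  # positive with 95% chance if diseased
p_pos_given_healthy = Fraction(1, 100)   # positive with 1% chance if healthy

# Total probability of testing positive.
p_pos = (p_disease * p_pos_given_disease
         + (1 - p_disease) * p_pos_given_healthy)

# Bayes' theorem: Pr[disease | positive test].
posterior = p_disease * p_pos_given_disease / p_pos
print(posterior)         # 95/294
print(float(posterior))  # about 0.323
```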

Total probability

If \(X\) takes on the value \(x\), then \(Y\) must take on some value, and so

\[\text{Pr}[X = x] = \sum_{y} \text{Pr}[X = x, Y = y].\]
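This marginalization can be sketched by summing the joint distribution over \(y\) (reusing the two-fair-coin example from above):

```python
# Joint distribution of two independent fair coin flips.
joint = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

# Total probability: Pr[X = x] = sum over y of Pr[X = x, Y = y].
p_x = {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p

print(p_x)  # {0: 0.5, 1: 0.5} -- matches the marginal of a single flip
```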