For a long time I thought that $P(X+Y=k)=P(X=k-Y)$, but recently I read something like $$P(X - Y = k) = E_{Y} \Big( P(X-Y = k) \Big) = E_{Y} \Big( P(X = k+Y) \Big) =
\sum_{y=0}^{\infty} P(Y=y)P(X = k+y),$$ which makes me wonder whether it is common to treat probabilities as random variables when the event involves a random variable itself, as in $P(X=k-Y)$.
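As a sanity check on that last identity, here is a minimal sketch using two independent fair six-sided dice (an illustrative assumption, not part of the question): it compares $P(X-Y=k)$ computed by direct enumeration over the joint sample space with the conditioned sum $\sum_y P(Y=y)\,P(X=k+y)$.

```python
from fractions import Fraction
from itertools import product

# Two independent fair six-sided dice (illustrative assumption).
support = range(1, 7)
p = {v: Fraction(1, 6) for v in support}

k = 2  # check P(X - Y = 2)

# Direct enumeration over the joint sample space.
direct = sum(p[x] * p[y] for x, y in product(support, support) if x - y == k)

# Conditioning on Y: sum over y of P(Y = y) * P(X = k + y).
conditioned = sum(p[y] * p.get(k + y, Fraction(0)) for y in support)

print(direct, conditioned)  # → 1/9 1/9
```

Both computations agree, as the law of total probability guarantees for independent $X$ and $Y$.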

I have no background in Probability Theory from a measure-theoretic point of view, but from my naïve understanding an explanation could be that $$P(X=k-Y)=P(\{\omega:\omega\in X^{-1}(k-Y)\})\neq P(\{\omega:\omega\in {(X+Y)}^{-1}(k)\})=P(X+Y=k)$$ where $^{-1}$ denotes the preimage of the argument with respect to the function.

In other words, in one case "$P(\cdot)$" is used as a measure of the preimage of $k$ under $X+Y$; in the other it's used as a function of $Y$, therefore becoming a r.v. itself.

Then the confusion arises because the equal sign is improperly used in place of the set-theoretic way of writing down the probability that a r.v. takes some value.

The notation $\{\omega : \omega\in X^{-1}(k-Y)\}$ is overly complicated, since the same set can be written simply as $X^{-1}(k-Y)$. One could reasonably write $\{\omega : X(\omega)=k-Y\}$ in order to define the set $X^{-1}(k-Y)$.
– Michael Hardy, Dec 2 '12 at 2:30

2 Answers

Possibly, what you read was the following (for independent $X$ and $Y$):
$$\Pr(X+Y=k)=\mathbb{E}_Y[\Pr_X(X + Y = k)].$$
Here, $\Pr_X(X + Y = k)$ is a random variable (which is a function of $Y$), and $\mathbb{E}_Y[\Pr_X(X + Y = k)]$ is its expectation.
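To make the "probability as a random variable" point concrete, here is a small sketch, again assuming two independent fair dice for illustration: for each fixed value $y$ of $Y$, $\Pr_X(X+Y=k)$ is just the number $\Pr(X=k-y)$, and as $y$ varies these numbers form a function $g(Y)$ whose expectation over $Y$ recovers the unconditional probability.

```python
from fractions import Fraction

# Independent fair six-sided dice (illustrative assumption).
pX = {v: Fraction(1, 6) for v in range(1, 7)}
pY = {v: Fraction(1, 6) for v in range(1, 7)}
k = 7

# For each fixed value y of Y, Pr_X(X + Y = k) is the number Pr(X = k - y);
# collecting these values over y gives the random variable g(Y).
g = {y: pX.get(k - y, Fraction(0)) for y in pY}

# Taking the expectation of g(Y) over Y recovers Pr(X + Y = k).
expectation = sum(pY[y] * g[y] for y in pY)

# Direct enumeration over the joint sample space, for comparison.
direct = sum(pX[x] * pY[y] for x in pX for y in pY if x + y == k)

print(expectation, direct)  # → 1/6 1/6
```

The dictionary `g` is exactly the random variable $\Pr_X(X+Y=k)$ from the answer, written out value by value, and `expectation` is its mean $\mathbb{E}_Y[\Pr_X(X+Y=k)]$.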