My friend and I have been stumped by this problem for a while, and I thought asking for tips couldn't hurt (we did ask the teacher, but we were given other problems afterwards).

Here is the question:

Let $\{X_n\}_{n \geq 1}$ be a sequence of random variables defined on the same probability space $(\Omega, \mathcal{F}, \mathbb{P})$, all with the same law and with finite expected value ($E(|X_1|) < \infty$). Let

$Y_n = n^{-1} \max_{1 \leq i \leq n} |X_i|$.

Show that

$\lim_{n\rightarrow \infty} E(Y_n) = 0$

and

$Y_n \rightarrow 0$ almost surely.

We have ideas for several parts of the proof. For the first claim, for example, it would suffice to show that the expected value of the supremum of all the $|X_i|$ is finite; since the max is one of the $|X_i|$ for each $\omega \in \Omega$, this seems reasonable, but we are not sure how to show it.

We also tried splitting the integral for the expected value into a partition of $\Omega$ considering the sets on which $X_i$ is the max, but didn't get too far with that.

For the second part, I think we could show it if we knew that $X_i(\omega)$ diverges only on a set of measure $0$, but that is not obvious (I think).

It's been a long time since I had anything to do with random variables and the like, but I can't see how this isn't just plain wrong? E.g. take $\Omega = \{\omega\}$ (IOW a singleton) and $X_n(\omega) = n^2$
– kahen, Mar 17 '11 at 23:08

2 Answers

Assume without loss of generality that $X_1$ is almost surely nonnegative.

Almost sure convergence

This is a consequence of the first Borel-Cantelli lemma. To see this, fix any positive $x$. Then $P(X_n\ge nx)=P(X_1\ge nx)$ for every $n$ and
$$\sum_{n\ge1}P(X_1\ge nx)\le x^{-1}E(X_1),
$$
hence the series with general term $P(X_n\ge nx)$ converges. By the first Borel-Cantelli lemma, the limsup of the events $[X_n\ge nx]$ has probability $0$. This means that, almost surely, $X_n<nx$ for every $n$ large enough, which can be translated as $X_n\le Z+nx$ for every $n$, for some almost surely finite random variable $Z$. Hence $nY_n\le Z+nx$ for every $n$, and $\limsup Y_n\le x$ almost surely. Since this holds for every positive $x$, $Y_n\to0$ almost surely.
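A quick numerical sanity check of the almost sure convergence (an assumed example, not part of the proof): follow one trajectory of $Y_n$ for i.i.d. integrable samples with a heavy tail. Pareto samples with index $2$ have finite mean ($=2$) but infinite variance, and along a trajectory $Y_n$ still shrinks toward $0$.

```python
import random

# Simulate one trajectory of Y_n = max(|X_1|, ..., |X_n|) / n for
# i.i.d. Pareto(alpha=2) samples: finite mean 2, infinite variance.
# The running max grows roughly like sqrt(n), so Y_n ~ 1/sqrt(n) -> 0.
random.seed(0)

def pareto_sample(alpha=2.0):
    # Inverse-CDF sampling: X = (1 - U)^(-1/alpha), U uniform on [0, 1).
    return (1.0 - random.random()) ** (-1.0 / alpha)

running_max = 0.0
y = {}
for n in range(1, 100_001):
    running_max = max(running_max, pareto_sample())
    if n in (100, 10_000, 100_000):
        y[n] = running_max / n

print(y)  # the recorded values of Y_n shrink toward 0
```

This is only an illustration for one distribution and one random seed; the proof above covers the general identically distributed, integrable case.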

Convergence of the expectations

Two ingredients are useful here: the fact that, for every nonnegative $Z$, $E(Z)$ is the integral of the function $x\mapsto P(Z\ge x)$ over $x\ge0$, and the fact that if $nY_n\ge x$, then $X_k\ge x$ for at least one integer $k$ between $1$ and $n$, hence $P(nY_n\ge x)\le nP(X_1\ge x)$.

Thanks to the first ingredient, $E(Y_n)$ is the integral of $g_n$ with $g_n(x)=P(nY_n\ge x)/n$. Thanks to the second ingredient, $g_n(x)\le P(X_1\ge x)=g_1(x)$. Now, $g_n(x)\le1/n$ hence $g_n(x)\to0$, and since $E(X_1)$ is finite, $g_1$ is integrable. By dominated convergence, the integral of $g_n$ converges to $0$, that is, $E(Y_n)\to0$.
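As a concrete companion to the dominated convergence argument (under the assumption, made only for this example, that the $X_i$ are i.i.d. $\mathrm{Exp}(1)$, for which the maximum of the first $n$ variables has mean equal to the harmonic number $H_n$), $E(Y_n) = H_n/n$ can be computed exactly and visibly decays like $(\log n)/n$:

```python
# For i.i.d. Exp(1) variables, a standard fact gives
# E[max(X_1, ..., X_n)] = H_n = 1 + 1/2 + ... + 1/n,
# so E(Y_n) = H_n / n, which tends to 0 like (log n)/n.
def expected_Y(n):
    harmonic = sum(1.0 / k for k in range(1, n + 1))
    return harmonic / n

vals = [expected_Y(n) for n in (10, 100, 1_000, 10_000)]
print(vals)  # strictly decreasing toward 0
```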

Remark You do not say where you found the exercise, but your source is to be complimented, because many people add the hypothesis that the sequence $(X_n)$ is independent although it is not necessary.

Added later on The upper bound $x^{-1}E(X_1)$ on the series used above can be proved as follows. First assume that $x=1$ and note that, for any nonnegative $Z$ (random or deterministic),
$$
\sum_{n\ge1}\mathbf{1}_{Z\ge n}=\lfloor Z\rfloor\le Z,
$$
where $\lfloor \ \rfloor$ denotes the integer part. Integrating both sides of the inequality with respect to $P$ yields, for any nonnegative random variable $Z$,
$$
\sum_{n\ge1}P(Z\ge n)=E(\lfloor Z\rfloor)\le E(Z).
$$
For the case at hand, apply this inequality to the random variable $Z=x^{-1}X_1$, using
$$
\sum_{n\ge1}\mathbf{1}_{X_1\ge nx}=\lfloor x^{-1}X_1\rfloor\le x^{-1}X_1.
$$
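The identity $\sum_{n\ge1}\mathbf{1}_{Z\ge n}=\lfloor Z\rfloor$ is easy to confirm numerically; here is a minimal sketch (the truncation bound `n_max` is an arbitrary choice, valid because the sum has only finitely many nonzero terms once it exceeds every test value):

```python
import math

# Check sum_{n >= 1} 1_{Z >= n} = floor(Z) <= Z for a few nonnegative Z.
# Truncating the sum at n_max is exact when n_max > Z.
def indicator_sum(z, n_max=1000):
    return sum(1 for n in range(1, n_max + 1) if z >= n)

for z in (0.3, 1.0, 2.5, 7.999, 42.0):
    assert indicator_sum(z) == math.floor(z) <= z
print("identity holds on all test values")
```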

Note first that $E\left[\sup_{i \ge 1} |X_i|\right]$ (one should really write $\sup$ instead of $\max$ here) need not be finite. Indeed, if the $X_i$ are, say, i.i.d. normal, then one can show that $\sup_{i \ge 1} |X_i| = +\infty$ almost surely.

To get $L^1$ convergence, the trick is to split into the events where the $X_i$ are small and where they are large. When they are small they do not contribute much to $Y_n$, and they can be large only with small probability. So fix $M$ and let $U_i = |X_i| 1_{\{|X_i| \le M\}}$ and $V_i = |X_i| 1_{\{|X_i| > M\}}$. Then $Y_n \le \frac{1}{n} (\max_{i \le n} U_i + \max_{i \le n} V_i)$. The first max is bounded by $M$, and the second by $V_1 + \dots + V_n$. Taking expectations and using that the $V_i$ are identically distributed, $E Y_n \le \frac{M}{n} + E V_1$, so $\limsup_{n \to \infty} E Y_n \le E V_1$. By choosing $M$ large enough, $E V_1 = E\left[|X_1| 1_{\{|X_1| > M\}}\right]$ can be made as small as desired, by dominated convergence and the integrability of $X_1$.
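A numeric sketch of this truncation bound, under the assumption (made only for this example) that $X_1 \sim \mathrm{Exp}(1)$, for which the tail mean has the closed form $E[X_1; X_1 > M] = \int_M^\infty x e^{-x}\,dx = (M+1)e^{-M}$:

```python
import math

# Illustrate the bound E(Y_n) <= M/n + E(V_1) for X_1 ~ Exp(1),
# where E(V_1) = E[X_1 ; X_1 > M] = (M + 1) * exp(-M) in closed form.
def tail_mean(M):
    return (M + 1.0) * math.exp(-M)

def bound(n, M):
    return M / n + tail_mean(M)

# E(V_1) can be made as small as desired by taking M large...
print(tail_mean(20))      # about 4.3e-08
# ...and then, for fixed M, the bound goes to 0 as n grows.
print(bound(10**6, 20))   # roughly 2e-05
```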

For almost sure convergence, the argument that went here previously was wrong.