As I have mentioned before in these notes, when working with processes in continuous time, it is important to select a good modification. Typically, this means that we work with processes which are left or right continuous. However, in general, it can be difficult to show that the paths of a process satisfy such pathwise regularity. In this post I show that for optional and predictable processes, the section theorems introduced in the previous post can be used to considerably simplify the situation. Although they are interesting results in their own right, the main application in these notes will be to optional and predictable projection. Once the projections are defined, the results from this post will imply that they preserve certain continuity properties of the process paths.

Suppose, for example, that we have a continuous-time process X which we want to show to be right-continuous. It is certainly necessary that, for any sequence of times $t_n$ decreasing to a limit $t$, $X_{t_n}$ almost-surely tends to $X_t$. However, even if we can prove this for every possible decreasing sequence $t_n$, it does not follow that X is right-continuous. As a counterexample, if $\tau$ is any continuously distributed random time, then the process $X_t=1_{\{t=\tau\}}$ is not right-continuous. However, so long as the distribution of $\tau$ has no atoms, X is almost-surely continuous at each fixed time t. It is remarkable, then, that if we generalise to look at sequences of stopping times, then convergence in probability along decreasing sequences of stopping times is enough to guarantee everywhere right-continuity of the process. At least, it is enough so long as we restrict consideration to optional processes.
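This counterexample is easy to see numerically. The sketch below is a hypothetical illustration (the uniform distribution for the random time and the counterexample process taken as $X_t=1_{\{t=\tau\}}$ are my reading of the example): no fixed time on a deterministic grid is ever hit by $\tau$, yet every sample path jumps at $\tau$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: tau is a uniform random time on (0, 1) and
# X_t = 1 if t == tau, else 0.
def X(t, tau):
    return 1.0 if t == tau else 0.0

taus = rng.uniform(0.0, 1.0, size=1000)  # sample paths, one tau each
grid = np.linspace(0.0, 1.0, 101)        # fixed deterministic times

# At any fixed time t, P(X_t != 0) = 0, since the continuous
# distribution of tau puts no mass on the single point t.
hits = sum(X(t, tau) for tau in taus for t in grid)

# Yet no sample path is right-continuous: X_tau = 1 while X_s = 0 for
# all s > tau, so the path jumps back down immediately after tau.
jumps = sum(X(tau, tau) for tau in taus)

print(hits, jumps)  # 0.0 1000.0
```

So the process is almost surely continuous at every fixed time, while every path fails right-continuity somewhere.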

As usual, we work with respect to a complete filtered probability space. Two processes are considered to be the same if they are equal up to evanescence, and any pathwise property is said to hold if it holds up to evanescence. That is, a process is right-continuous if and only if it is everywhere right-continuous on a set of probability 1. All processes will be taken to be real-valued, and a process is said to have left (or right) limits if its left (or right) limits exist everywhere, up to evanescence, and are finite.

Theorem 1 Let X be an optional process. Then,

X is right-continuous if and only if $X_{\tau_n}\to X_\tau$ in probability, for each uniformly bounded sequence of stopping times $\tau_n$ decreasing to a limit $\tau$.

X has right limits if and only if $X_{\tau_n}$ converges in probability, for each uniformly bounded decreasing sequence of stopping times $\tau_n$.

X has left limits if and only if $X_{\tau_n}$ converges in probability, for each uniformly bounded increasing sequence of stopping times $\tau_n$.
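In symbols (the notation $\tau_n$, $\tau$ is mine), the first of these equivalences for an optional process X reads:

```latex
X \text{ is right-continuous}
\quad\Longleftrightarrow\quad
X_{\tau_n} \xrightarrow{\ \mathbb{P}\ } X_\tau
\text{ for all uniformly bounded stopping times } \tau_n \downarrow \tau .
```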

The `only if’ parts of these statements are immediate, since convergence everywhere trivially implies convergence in probability. The importance of this theorem is in the `if’ directions. That is, it gives sufficient conditions to guarantee that the sample paths satisfy the respective regularity properties.

Note that conditions for left-continuity are absent from the statements of Theorem 1. In fact, left-continuity does not follow from the corresponding property along sequences of stopping times. Consider, for example, a Poisson process, X. This is right-continuous but not left-continuous. However, its jumps occur at totally inaccessible times. This implies that, for any sequence of stopping times $\tau_n$ increasing to a finite limit $\tau$, it is true that $X_{\tau_n}$ converges almost surely to $X_\tau$. In light of such examples, it is even more remarkable that right-continuity and the existence of left and right limits can be determined by just looking at convergence in probability along monotonic sequences of stopping times. Theorem 1 will be proven below, using the optional section theorem.
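The Poisson example can be simulated directly. In the sketch below (unit rate and the fixed time t = 1.0 are my arbitrary choices), the path jumps at its first arrival time, so it is not left-continuous there; but a fixed deterministic time is almost surely not a jump time, so no discontinuity is ever seen at t.

```python
import numpy as np

rng = np.random.default_rng(1)

# Unit-rate Poisson process: jump times are partial sums of independent
# exponential interarrival times.
arrivals = np.cumsum(rng.exponential(1.0, size=20))

def N(t):
    """Number of arrivals in [0, t]."""
    return int(np.sum(arrivals <= t))

# Not left-continuous: just before the first jump time T1 the path is
# still 0, while N(T1) = 1.
T1 = arrivals[0]
print(N(T1 * 0.999999), N(T1))  # 0 1

# But a fixed time t is almost surely not a jump time: none of 10000
# simulated paths has an arrival exactly at t = 1.0.
t = 1.0
jumps_at_t = sum(
    np.any(np.cumsum(rng.exponential(1.0, size=20)) == t) for _ in range(10_000)
)
print(jumps_at_t)  # 0
```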

Theorem 2 Let X be a predictable process. Then,

X is right-continuous if and only if $X_{\tau_n}\to X_\tau$ in probability, for each uniformly bounded sequence of predictable stopping times $\tau_n$ decreasing to a limit $\tau$.

X is left-continuous if and only if $X_{\tau_n}\to X_\tau$ in probability, for each uniformly bounded sequence of predictable stopping times $\tau_n$ increasing to a limit $\tau$.

X has right limits if and only if $X_{\tau_n}$ converges in probability, for each uniformly bounded decreasing sequence of predictable stopping times $\tau_n$.

X has left limits if and only if $X_{\tau_n}$ converges in probability, for each uniformly bounded increasing sequence of predictable stopping times $\tau_n$.

Theorems 1 and 2 can alternatively be stated in terms of convergence of expectations. So long as the processes satisfy sufficient integrability properties, convergence of expectations is a significantly weaker condition than convergence in probability. For example, if X is a uniformly bounded process then, by bounded convergence, all of the limits along sequences of stopping times stated in the results above still hold after taking expectations. More generally, using convergence of expectations for uniformly integrable sequences, requiring X to be of class (DL) is sufficient for this to be true. In fact, at the cost of dropping the `only if’ directions of the statements, we only require the even milder property that X is integrable at the necessary random times.

Theorem 3 Let X be an optional process such that $X_\tau$ is integrable for each uniformly bounded stopping time $\tau$. Then,

if $X_\tau$ is integrable and $\mathbb{E}[X_{\tau_n}]\to\mathbb{E}[X_\tau]$ for each uniformly bounded sequence of stopping times $\tau_n$ decreasing to a limit $\tau$, then X is right-continuous.

if $\mathbb{E}[X_{\tau_n}]$ converges for each uniformly bounded decreasing sequence of stopping times $\tau_n$, then X has right limits.

if $\mathbb{E}[X_{\tau_n}]$ converges for each uniformly bounded increasing sequence of stopping times $\tau_n$, then X has left limits.
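Writing the first statement symbolically (again with my own notation for the stopping times), the hypothesis and conclusion are:

```latex
X_\tau \in L^1
\quad\text{and}\quad
\mathbb{E}[X_{\tau_n}] \to \mathbb{E}[X_\tau]
\text{ for all uniformly bounded stopping times } \tau_n \downarrow \tau
\quad\Longrightarrow\quad
X \text{ is right-continuous.}
```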

This is almost exactly the same as Theorem 1 other than the use of the weaker condition of convergence in expectation in place of convergence in probability. In a similar way, Theorem 2 can be restated using the weaker conditions of convergence in expectation.

Theorem 4 Let X be a predictable process such that $X_\tau$ is integrable for each uniformly bounded predictable stopping time $\tau$. Then,

if $X_\tau$ is integrable and $\mathbb{E}[X_{\tau_n}]\to\mathbb{E}[X_\tau]$ for each uniformly bounded sequence of predictable stopping times $\tau_n$ decreasing to a limit $\tau$, then X is right-continuous.

if $\mathbb{E}[X_{\tau_n}]\to\mathbb{E}[X_\tau]$ for each uniformly bounded sequence of predictable stopping times $\tau_n$ increasing to a limit $\tau$, then X is left-continuous.

if $\mathbb{E}[X_{\tau_n}]$ converges for each uniformly bounded decreasing sequence of predictable stopping times $\tau_n$, then X has right limits.

if $\mathbb{E}[X_{\tau_n}]$ converges for each uniformly bounded increasing sequence of predictable stopping times $\tau_n$, then X has left limits.

The proofs of the above theorems will be given soon. Note that the first statement of each of these theorems involves the limits of decreasing sequences of stopping times. We did not restrict to the cases where the limit $\tau$ is itself a stopping time, which would have given slightly stronger statements. In the usual situation, where the underlying filtration is right-continuous, this is automatic in any case. For Theorems 1 and 2 it is possible to restrict to the case where $\tau$ is a stopping time, but this is just a small strengthening of the statements so, for simplicity, I do not do this. Similarly, the second statements of Theorems 2 and 4 involve the limits of increasing sequences of predictable stopping times. However, increasing limits of predictable stopping times are predictable, regardless of whether or not the filtration satisfies the usual conditions. So, the restriction that the limit is predictable could be stated if preferred, but it makes no real difference.

The underlying ideas behind the above theorems are provided in Chapter IV of the book Capacités et processus stochastiques (Springer-Verlag, 1972) by Claude Dellacherie. Actually, in this post, I am considering rather more than is stated in Dellacherie, although the ideas and method of proof will be similar. For one thing, Dellacherie restricts attention to uniformly bounded processes, which removes any possible issues regarding integrability and ensures that all limits are finite. He also restricts attention to convergence in expectations, and does not state the versions involving convergence in probability given in Theorems 1 and 2 above, which somewhat reduces the number of statements to be proven. Furthermore, he restricts attention to the most important cases of left-continuity for predictable processes, and right-continuity and cadlag paths for optional processes. For reference, the result for left-continuity of predictable processes is given by Theorem T24 of Dellacherie and the result for right-continuity and cadlag paths of optional processes is in Theorem T28.

Between them, Theorems 1 to 4 consist of fourteen separate statements, all of which I will prove below. The large number of statements to be proven does mean that this post is rather long, but the underlying ideas are really the same for all of them. As noted above, the `only if’ directions of the statements of Theorems 1 and 2 are immediate anyway, so I do not explicitly consider these below. The proofs will be organised by separately considering each regularity property in turn — right-continuity, right limits, left-continuity and, finally, left limits.

Continuity at Stopping Times

The reduction of pathwise continuity to continuity along sequences of stopping times will be done in two stages. The first step is to reduce it to continuity at each stopping time, which is done by the following lemma. Note that we do not actually require the process to be optional or predictable here. Progressive measurability is sufficient. As we are not assuming right-continuity of the filtration, we will instead look at stopping times with respect to the right limits, $\mathcal{F}_{t+}$, of the filtration.

Lemma 5 Let X be a progressively measurable process.

If, at each bounded stopping time, X is almost surely right-continuous, then X is right-continuous everywhere (up to evanescence).

If, at each bounded stopping time, X almost surely has right limits, then X has right limits everywhere (up to evanescence).

If, at each bounded predictable stopping time, X almost surely has left limits, then X has left limits everywhere (up to evanescence).

Proof: First, note that by replacing a stopping time $\tau$ by $\tau\wedge T$ for arbitrarily large real T, we can drop the constraint that the stopping times are bounded in the statements of the lemma. Instead, we require each respective condition to hold almost surely on the event $\{\tau<\infty\}$.

I previously gave a proof that right-continuous adapted processes are optional, in the sense that they are measurable with respect to the sigma-algebra generated by the sets $[\tau,\infty)$ for stopping times $\tau$. There, it was shown that such processes can be uniformly approximated by processes which are explicitly right-continuous and measurable with respect to the required sigma-algebra. In fact, that proof only really required the weaker hypothesis that X is progressive and is almost surely right-continuous at each stopping time, which then implies the first statement of the lemma above. I now give a self-contained proof of the first statement of the lemma, although it is based on the method just mentioned.

For X to be right-continuous, we need to show that the process

is evanescent. It is enough to show that (up to evanescence) for any given .

Without loss of generality, assume that the underlying filtration is right-continuous. Let be the set of stopping times such that , up to evanescence. As is closed under taking the supremum of countable sequences, it must contain its essential supremum . To conclude that , it only needs to be shown that (almost surely). Consider

which, by the debut theorem is a stopping time. On the interval , X is confined to the range , which has width . Hence, on this interval, showing that . Whenever , the hypothesis of the first statement above says that X is right-continuous at and, hence, . However, by definition of the essential supremum, (almost surely), showing that (a.s.) as required.

The second statement can be proved in a very similar way, replacing the process Z above by

Again, we assume without loss of generality that the underlying filtration is right-continuous, and need to show that , up to evanescence, for any . As above, let be the set of stopping times for which holds up to evanescence. The essential supremum is itself in . By the hypothesis of the second statement of the lemma, the limit exists whenever . Define

which, by the debut theorem, is a stopping time. On the interval , X is confined to the range , which has width . Hence, on this interval, so . Whenever , the definition of implies that . However, by the definition of the essential supremum, showing that (a.s.) as required.

I did not include left-continuity in the lemma above. I now show that almost-sure left-continuity at each predictable stopping time is sufficient although, as shown by the example of a Poisson process, it does require the hypothesis that X is predictable.

Lemma 6 Let X be a predictable process. If, at each bounded predictable stopping time, X is almost surely left-continuous, then X is left-continuous everywhere.

Proof: Letting and be the predictable processes defined by (1), left-continuity at a finite predictable stopping time is equivalent to the equality . Then, predictable section implies that up to evanescence, so X is left-continuous. ⬜

Right-Continuity

I now move on to proving sufficiency of the conditions in Theorems 1 to 4 for the respective pathwise properties to hold. Starting with right-continuity, define the right upper and lower limits of X at each time t,

\displaystyle \overline{X}_t = \limsup_{s\downarrow\downarrow t} X_s, \qquad \underline{X}_t = \liminf_{s\downarrow\downarrow t} X_s. \qquad (2)

We know that these are progressively measurable processes whenever X is progressive, so long as the underlying filtration is right-continuous. The following lemma will allow us to reduce the proof of right-continuity to an application of Lemma 5. Since the proofs for optional and predictable processes are almost identical, I deal with both cases simultaneously. This does entail a liberal use of the term `respectively’ throughout these proofs, but it avoids writing out near-identical duplicate statements and proofs for the optional and predictable cases.

Lemma 7 Let X be an optional (respectively, predictable) process, , and be an stopping time such that whenever .

Then, there exists a sequence, , of stopping times (resp., predictable stopping times) decreasing to such that whenever .

Proof: For each n, consider the set

Using the fact that the process is left-continuous and adapted, hence predictable, and that X is optional (resp., predictable), it follows that S is optional (resp., predictable). So, by the section theorem, there exists a (resp., predictable) stopping time with and

When , the condition that implies that at some times in the interval . Hence, and,

(3)

The Borel-Cantelli lemma implies that, almost surely, for large n whenever . By construction, and whenever , so almost surely.

Finally, setting

gives a sequence of (resp., predictable) stopping times almost surely decreasing to and with whenever . The `almost sure’ restriction can be removed, if preferred, by simply setting on the zero probability event that does not tend to . ⬜
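The Borel–Cantelli step used in this proof is the standard one: when the probabilities of the `bad’ events are summable, almost surely only finitely many of them occur. A toy numerical check (independent events with $P(A_n)=2^{-n}$, my own choice of example) behaves as expected:

```python
import numpy as np

rng = np.random.default_rng(3)

# Independent events A_n with P(A_n) = 2^{-n}; the probabilities are
# summable, so Borel-Cantelli gives: a.s. only finitely many A_n occur.
n_events, n_samples = 30, 100_000
p = 0.5 ** np.arange(1, n_events + 1)
occurred = rng.random((n_samples, n_events)) < p  # row i: which A_n hit sample i

counts = occurred.sum(axis=1)
print(counts.max())   # small: no sample sees more than a handful of events
print(counts.mean())  # close to sum(p), which is just under 1
```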

Applying Lemmas 5 and 7, it is now relatively straightforward to give a proof of the first statements of Theorems 1 and 2. That is, continuity in probability computed along decreasing sequences of stopping times is sufficient to guarantee pathwise right-continuity.

Lemma 8 Let X be an optional (resp., predictable) process such that, for each uniformly bounded sequence of stopping times (resp., predictable stopping times) $\tau_n$ decreasing to a limit $\tau$, $X_{\tau_n}$ tends to $X_\tau$ in probability. Then, X is right-continuous.

Proof: Letting be an stopping time, by Lemma 5 it is enough to show that X is almost surely right-continuous at whenever . To do this, it is enough to show that (i.e., that it is right lower semicontinuous) as continuity follows by applying the same result to both and . I use proof by contradiction so, suppose that with positive probability. Then, by countable additivity, there must be an such that

(4)

holds with positive probability. By setting on the event that (4) fails, we suppose that (4) holds whenever is finite.

Lemma 7 gives a sequence of (resp., predictable) stopping times decreasing to such that whenever . Now, fix a time such that with positive probability. Then,

whenever . So, does not converge to in probability, contradicting the hypothesis of the lemma. ⬜

Similarly, the proofs of the first statements of Theorems 3 and 4 are now straightforward.

Lemma 9 Let X be an optional (resp., predictable) process such that $X_\tau$ is integrable for each bounded stopping time $\tau$. Suppose that, for each uniformly bounded sequence of stopping times (resp., predictable stopping times) $\tau_n$ decreasing to a limit $\tau$, $\mathbb{E}[X_{\tau_n}]$ tends to $\mathbb{E}[X_\tau]$. Then, X is right-continuous.

Proof: Suppose that is as in the proof of Lemma 8. So, (4) holds whenever and when . Letting be a fixed time such that with positive probability,

when . So,

Taking the limit as n goes to infinity, and using dominated convergence,

contradicting the hypothesis of the lemma. ⬜

Right Limits

I now move on to the proofs of existence of right limits, which follow along very similar lines to the proof above for right-continuity. The main difference here is that the construction of a sequence of stopping times at which X lies above a level is replaced by a sequence on which X oscillates between two levels $a<b$. This will ensure that X does not converge to any limit along such a sequence of times, allowing a proof by contradiction similar to that used for Lemma 8 to be applied. So, the following result will play the same role in the argument here as Lemma 7 did for the proof of right-continuity above. The proof also follows in a very similar way to that given for Lemma 7.
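The oscillation criterion driving this section is the elementary fact that a real sequence converges in the extended reals if and only if, for every pair of levels $a<b$, it crosses from below a to above b only finitely often. A small helper (my own illustration) makes this concrete:

```python
def upcrossings(xs, a, b):
    """Count moves of the sequence xs from <= a up to >= b."""
    count, below = 0, False
    for x in xs:
        if x <= a:
            below = True
        elif x >= b and below:
            count += 1
            below = False
    return count

osc = [(-1) ** n for n in range(20)]       # 1, -1, 1, -1, ...: no limit
conv = [1.0 / (n + 1) for n in range(20)]  # 1, 1/2, 1/3, ...: tends to 0

print(upcrossings(osc, -0.5, 0.5))   # 9: oscillates across the levels
print(upcrossings(conv, -0.5, 0.5))  # 0: eventually stays between them
```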

Lemma 10 Let X be an optional (resp., predictable) process, , and be a stopping time such that

(5)

whenever .

Then, there exists a sequence, , of stopping times (resp., predictable stopping times) decreasing to such that, whenever , for n even and for n odd.

Proof: Choose a sequence of (resp., predictable) stopping times with when , as follows. First, set . Then for each , supposing that has already been chosen, consider the set,

Using the fact that is left-continuous and adapted, hence predictable, and that X is optional (resp., predictable), we see that S is optional (resp., predictable). Therefore, by the section theorem, there exists a (resp., predictable) stopping time with and

By construction, whenever , we have , and or for, respectively, even and odd n. Also, when , the condition that implies that each of the inequalities and are satisfied at some times in the interval . Hence, and, as in (3) above,

So, the Borel-Cantelli lemma implies that, almost surely, for large n, whenever . After the last n for which is infinite, the sequence is then decreasing to . ⬜

We now prove the second statement of Theorem 1 and third statement of Theorem 2. That is, under the stated conditions, the paths of X have right limits everywhere. This is a consequence of Lemmas 5 and 10 above, and the proof follows in a similar way as for Lemma 8.

Lemma 11 Let X be an optional (resp., predictable) process such that $X_{\tau_n}$ converges in probability for each uniformly bounded decreasing sequence of stopping times (resp., predictable stopping times) $\tau_n$. Then, X has right limits.

Proof: Letting be an stopping time, by Lemma 5 it is enough to show that X almost surely has right limits at whenever . To do this, it is enough to show that are both almost surely finite. I use proof by contradiction so, suppose that this is not the case. Then, either or with positive probability.

Consider the case where and with positive probability. Then, by countable additivity, there exists such that and inequality (5) holds with positive probability. We can suppose that (5) holds whenever by setting whenever it fails. By Lemma 10, there exists a sequence of (resp., predictable) stopping times decreasing to such that (n even) and (n odd) whenever . Choose a fixed time such that with positive probability. Then,

whenever . So, does not converge in probability, contradicting the hypothesis of the lemma.

Now consider the case where and with positive probability. The sequence of predictable stopping times decreases to and with positive probability. Hence, does not converge in probability, contradicting the hypothesis of the lemma.

The case where follows as above with replaced by . ⬜

The proofs of the second statement of Theorem 3 and the third statement of Theorem 4 follow in a similar way.

Lemma 12 Let X be an optional (resp., predictable) process such that $X_\tau$ is integrable for each uniformly bounded stopping time (resp., predictable stopping time) $\tau$. Suppose that, for each uniformly bounded decreasing sequence of stopping times (resp., predictable stopping times) $\tau_n$, $\mathbb{E}[X_{\tau_n}]$ converges to a finite limit. Then, X has right limits.

Proof: Let be as in the proof of Lemma 11 for the case where inequality (5) holds when . Choose a fixed time such that with positive probability. Then,

Letting n go to infinity and using dominated convergence,

giving the required contradiction with the hypothesis of the lemma.

Again, as in the proof of Lemma 11, consider the case with when . By Lemma 7, there exists a sequence of (resp., predictable) stopping times decreasing to satisfying whenever . Choose a fixed time such that with positive probability. Then,

Letting n go to infinity and using dominated convergence,

The sequence of random variables is nonnegative and tends to infinity. So, by Fatou’s lemma, tends to infinity, contradicting the hypothesis of the lemma. ⬜

Left-Continuity

I now move on to left-continuity. The approach is very similar to that for right-continuity above. However, approximating stopping times from the left tends to be more difficult than from the right and, for this reason, the stopping time $\tau$ will be required to be predictable. The following will play the same role for left-continuity as Lemma 7 did for right-continuity. The predictable processes are defined as the left upper and lower limits of X given by (1) above.

Lemma 13 Let X be an optional (resp., predictable) process, , and be a predictable stopping time such that whenever .

Then, there exists a sequence, , of stopping times (resp., predictable stopping times) increasing to such that whenever .

Proof: Let $\tau_m$ be a sequence of stopping times announcing $\tau$. For each n, consider the set

As is the limit of the left-continuous adapted processes as m goes to infinity, it is predictable. Then, since X is optional (resp., predictable), S is also optional (resp., predictable). By the section theorem, there exists a (resp., predictable) stopping time with and

By construction, and whenever . As , for some times in the interval whenever . Hence, and, as in (3) above,

By the Borel-Cantelli lemma, when then, almost surely, for large n. So, tends to from the left. The sequence of stopping times required by the lemma is now given by

As , this sequence increases to . As the sequence (and any subsequence) tends to from the left, it must attain its minimum. So, for each n, we have for some m and, hence, whenever . Finally, for each fixed n, write as the limit of the sequence as m goes to infinity. As this is a sequence of (resp., predictable) stopping times and is eventually constant when is finite, it follows that is a stopping time (resp., a predictable stopping time). ⬜

Lemma 13 is now applied to prove the second statement of Theorem 2. The proof is analogous to the one for right-continuity given for Lemma 8 above. However, as we will apply Lemma 6, we only prove the result in the case of predictable X.

Lemma 14 Let X be a predictable process such that, for each uniformly bounded sequence of predictable stopping times $\tau_n$ increasing to a limit $\tau$, $X_{\tau_n}$ tends to $X_\tau$ in probability. Then, X is left-continuous.

Proof: Letting be a finite predictable stopping time, by Lemma 6 it is enough to show that X is almost surely left-continuous at or, equivalently, . To do this, it is enough to show that , and then the inequality will follow in the same way (or, by applying the same argument to ). I use proof by contradiction so, suppose that with positive probability. By countable additivity, there must be an such that

(6)

holds with positive probability. The event on which (6) holds is -measurable, so if we set on the event that it fails then will remain a predictable stopping time. So, we suppose that (6) holds whenever is finite.

Lemma 13 gives a sequence of predictable stopping times increasing to such that whenever . Then, choosing any fixed time T with ,

whenever . This contradicts the hypothesis that in probability. ⬜

The proof of the second statement of Theorem 4 follows similarly, and is analogous to the one given for Lemma 9 above.

Lemma 15 Let X be a predictable process such that $X_\tau$ is integrable for each uniformly bounded predictable stopping time $\tau$. Suppose that, for each uniformly bounded sequence of predictable stopping times $\tau_n$ increasing to a limit $\tau$, $\mathbb{E}[X_{\tau_n}]$ tends to $\mathbb{E}[X_\tau]$. Then, X is left-continuous.

Proof: As in the proof of Lemma 14, we just need to consider a predictable stopping time satisfying (6) whenever , and show that . By Lemma 13, there exists a sequence of predictable stopping times increasing to satisfying whenever .

Supposing that with positive probability, consider a fixed time T such that with positive probability. Then,

Applying dominated convergence,

giving the required contradiction. ⬜

Left Limits

I finally prove that the paths of X have left limits under the appropriate hypotheses. This extends the argument for left-continuity above, much as the right-continuity argument was extended to prove the existence of right limits. Start with the following, which is analogous to Lemma 10 above.

Lemma 16 Let X be an optional (resp., predictable) process, , and be a predictable stopping time satisfying

(7)

whenever .

Then, there exists a sequence, , of stopping times (resp., predictable stopping times) increasing to such that whenever , for n even and for n odd.

Proof: By Lemma 13, there exist sequences, and , of (resp., predictable) stopping times increasing to such that, whenever and whenever .

Inductively define the sequence as follows. Set and, for each , set

for n even and,

for n odd. By construction, is increasing to . Also, whenever and n is even, then for some m, so . Similarly, when and n is odd.

It only remains to be shown that are (resp., predictable) stopping times, which we do by induction. Assuming that is a stopping time, the event is in for each m. This is seen by noting that is the value of the left-continuous and adapted process at time . Then, let denote the time equal to on the event and otherwise. This is a stopping time (resp., is a predictable stopping time). Considering, without loss of generality, the case where n is even, then is the limit of the eventually constant sequence

Lemma 17 Let X be an optional (resp., predictable) process such that $X_{\tau_n}$ converges in probability for each uniformly bounded increasing sequence of stopping times (resp., predictable stopping times) $\tau_n$. Then, X has left limits.

Proof: Letting be a finite predictable stopping time, by Lemma 5 it is enough to show that X almost surely has left limits at . To do this, it is enough to show that are both almost surely finite. I use proof by contradiction so, suppose that this is not the case. Then, either or with positive probability.

Consider the case where with positive probability. By countable additivity, there exists such that and inequality (7) holds with positive probability. We can suppose that (7) holds whenever by setting whenever it fails. Doing this preserves the property that is a predictable stopping time.

By Lemma 16, there exists a sequence of (predictable) stopping times increasing to such that and whenever these times are finite. Then, for any fixed time T with ,

which is positive with positive probability. So, does not converge in probability, contradicting the hypothesis of the lemma.

Now consider the case where with positive probability. Setting when this equality fails, we suppose that whenever . Lemma 13 gives a sequence, , of predictable stopping times increasing to such that whenever (we do not actually require this property of here). Then, with positive probability, contradicting the condition that converges in probability.

The case where follows as above with replaced by . ⬜

Finally, the proofs of the third statement of Theorem 3 and the fourth statement of Theorem 4 follow in a similar way.

Lemma 18 Let X be an optional (resp., predictable) process such that $X_\tau$ is integrable for each uniformly bounded stopping time (resp., predictable stopping time) $\tau$. Suppose that, for each uniformly bounded increasing sequence of stopping times (resp., predictable stopping times) $\tau_n$, $\mathbb{E}[X_{\tau_n}]$ converges to a finite limit. Then, X has left limits.

Proof: Let be as in the proof of Lemma 17 for the case where inequality (7) holds when . Then, choosing a fixed time T with ,

Again, as in the proof of Lemma 17, consider the case with whenever and let be (resp., predictable) stopping times with whenever . Let T be a fixed time with . The sequence is bounded below by the integrable random variable and tends to on the event so, by Fatou’s lemma,

contradicting the condition of the lemma. ⬜

Notes

Note that Theorems 1 and 2 are almost immediate consequences of Theorems 3 and 4. In the case where X is a uniformly bounded process, this is straightforward. For example, suppose that X is optional and that $X_{\tau_n}\to X_\tau$ in probability for any bounded sequence of stopping times $\tau_n$ decreasing to a limit $\tau$. Using bounded convergence, $\mathbb{E}[X_{\tau_n}]$ tends to $\mathbb{E}[X_\tau]$, and the first statement of Theorem 3 implies that X is right-continuous. For bounded processes, this proves the first statement of Theorem 1 as a corollary of Theorem 3. This argument can be extended to unbounded processes by applying a continuous, bounded and strictly increasing function $f$ to X. For example, $f(x)=\arctan x$. If we set $Y=f(X)$ and X satisfies the property that $X_{\tau_n}\to X_\tau$ in probability for all bounded sequences of stopping times $\tau_n$ decreasing to a limit $\tau$, then Y satisfies the same property. So, $\mathbb{E}[Y_{\tau_n}]\to\mathbb{E}[Y_\tau]$ and, assuming that X is optional, the first statement of Theorem 3 says that Y is right-continuous. Then, $X=f^{-1}(Y)$ is also right-continuous. So, the first statement of Theorem 1 does indeed follow immediately from the corresponding statement of Theorem 3. Precisely the same argument applies for each of the statements of Theorems 1 and 2 which involve left- and right-continuity. So, it is not strictly necessary to provide proofs of these.

For the statements involving left and right limits, however, it is not quite so straightforward. Even if we show that Y has right limits, for example, it does not follow that X has right limits. It could be the case that the right limits of Y are outside of the range of f. Then, the right limits of X can go to plus or minus infinity, and it can only be concluded that X has right limits in the extended reals. I did not use such ideas above, and instead gave proofs of all statements, as it worked out easier to prove the statements of Theorems 1 and 2 first and then extend the arguments to Theorems 3 and 4 and, in any case, the idea mentioned does not simplify things much.
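The truncation trick described here can be checked numerically. Below, arctan (an illustrative choice of a bounded, continuous, strictly increasing function, matching the discussion above) is applied to heavy-tailed variables with no finite mean; the transformed variables are bounded, so convergence in probability upgrades to convergence of expectations by bounded convergence.

```python
import numpy as np

rng = np.random.default_rng(2)

# X has a Cauchy distribution, so E[X] does not exist; X_n -> X in
# probability (here: X plus a small independent perturbation).
X = rng.standard_cauchy(100_000)
X_n = X + rng.normal(0.0, 1e-3, X.size)

# Y = arctan(X) is bounded by pi/2, so bounded convergence applies.
Y, Y_n = np.arctan(X), np.arctan(X_n)

# arctan is 1-Lipschitz, so |E[Y_n] - E[Y]| <= E|X_n - X| ~ 1e-3.
gap = abs(Y_n.mean() - Y.mean())
print(gap < 1e-2)  # True
```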