Consider a Brownian motion on $[0,1]$. A (finite) discrete approximation of this Brownian motion consists of $N$ i.i.d. centered Gaussian random variables $\Delta W_i$, each of variance $\frac{1}{N}$:
$$ W\left(\frac{k}{N}\right) = \sum_{i=1}^k \Delta W_i. $$
The vector $V_N = (\Delta W_1, \ldots, \Delta W_N) \in \mathbb{R}^N$ has norm approximately equal to $1$, since the random variable $\|V_N\|^2$ has mean $1$ and variance $\frac{C}{N}$, with $\frac{C}{N} \to 0$ as $N\to\infty$ (basic concentration-of-measure results make this statement precise). This is why, roughly speaking, one can say that in order to sample a Brownian path, it suffices to sample a point uniformly on the unit sphere of $\mathbb{R}^N$.
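This heuristic is easy to check numerically. The sketch below (plain NumPy; the seed and the value of $N$ are arbitrary choices) samples a uniform point on the unit sphere of $\mathbb{R}^N$ by normalizing a standard Gaussian vector, and compares it with the increment vector $V_N$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000

# Direct discretization: N iid Gaussian increments of variance 1/N.
dW = rng.normal(scale=1 / np.sqrt(N), size=N)
print(np.linalg.norm(dW))  # close to 1, by concentration of measure

# Sampling a uniform point on the unit sphere of R^N instead:
# normalizing a standard Gaussian vector gives the uniform law on the
# sphere, and the components already have the right 1/sqrt(N) magnitude.
g = rng.normal(size=N)
xi = g / np.linalg.norm(g)   # uniform on the unit sphere of R^N
W = np.cumsum(xi)            # approximate Brownian path at times k/N
print(W[-1])                 # approximately N(0, 1)
```

The point is that $V_N/\|V_N\|$ is exactly uniform on the sphere (rotation invariance of the Gaussian), and $\|V_N\|\approx 1$, so the two sampling schemes nearly agree.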

Question: letting $N$ go to $\infty$, how can one formalize (if this is possible and/or correct) the idea that a Brownian path on $[0,1]$ is like a point chosen uniformly on the unit sphere of an infinite dimensional Banach space?

5 Answers

As you suggest in the question, there is no such thing as a uniform measure on the unit sphere of infinite dimensional Banach spaces, such as $L^2\equiv L^2([0,1],\lambda)$ (with $\lambda$ the Lebesgue measure).

Instead, you can look at cylinder measures, which are defined on locally convex vector spaces. Given a locally convex vector space V, a cylinder measure on V is really a collection of measures $\lbrace\mu_W\rbrace$ on the quotients V/W, for closed subspaces W with finite codimension. These must be consistent with the natural maps $V/W\to V/W^\prime$ when $W\subseteq W^\prime$.
Equivalently, a cylinder measure can be expressed as a collection of measures $\lbrace\mu_F\rbrace$ for continuous linear maps $F\colon V\to\mathbb{R}^n$ (n=0,1,2,...), where $\mu_F$ is a measure on the codomain $\mathbb{R}^n$ of F. Then, the consistency condition says that if $F\colon V\to\mathbb{R}^n$ and $g\colon\mathbb{R}^n\to\mathbb{R}^m$ are continuous linear maps, then $\mu_{g\circ F}=\mu_F\circ g^{-1}$. For a finite dimensional space V, cylinder measures are just the same thing as measures on $(V,\mathcal{B}(V))$, since they are equivalent to specifying a measure on $V/\lbrace0\rbrace=V$.
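For Gaussian cylinder measures, this consistency condition reduces to the familiar pushforward rule for covariance matrices: if $\mu_F$ is $N(0,\Sigma_F)$, then $\mu_{g\circ F}$ must be $N(0,g\Sigma_F g^t)$. A quick numerical sanity check (the particular $\Sigma_F$ and $g$ below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)

# For Gaussian measures, mu_F = N(0, Sigma_F); the consistency condition
# mu_{g o F} = mu_F o g^{-1} forces Sigma_{g o F} = g Sigma_F g^T.
Sigma_F = np.array([[2.0, 0.5], [0.5, 1.0]])  # covariance on R^2
g = np.array([[1.0, -1.0]])                   # linear map R^2 -> R^1

# Empirical check: push samples of mu_F through g.
Z = rng.multivariate_normal(np.zeros(2), Sigma_F, size=200_000)
pushed = Z @ g.T
print(pushed.var())                # empirical variance of mu_F o g^{-1}
print((g @ Sigma_F @ g.T)[0, 0])  # g Sigma_F g^T = 2 - 2*0.5 + 1 = 2
```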

Writing $V^\star$ for the dual of V, cylinder measures can also be considered as measures on the infinite product space $\Omega\equiv\prod_{x\in V^\star}\mathbb{R}$, consisting of functions $\omega\colon V^\star\to\mathbb{R}$. Letting $X\colon V^\star\times\Omega\to\mathbb{R}$ be the "coordinate process" $X(x)(\omega)\equiv\omega(x)$, write $\mathcal{F}$ for the sigma-algebra generated by $\lbrace X(x)\colon x\in V^\star\rbrace$ (the sigma-algebra generated by finite dimensional distributions). Any measure on $(\Omega,\mathcal{F})$ such that the linearity condition $X(ax+by)=aX(x)+bX(y)$ is satisfied almost everywhere (for each $x,y\in V^\star$, $a,b\in\mathbb{R}$) defines a cylinder measure by restricting to finite subsets of $V^\star$. Conversely, by the Kolmogorov extension theorem, a cylinder measure uniquely extends to such a measure on $(\Omega,\mathcal{F})$. I'll also refer to such measures on $(\Omega,\mathcal{F})$ as cylinder measures.

The situation where V is a Hilbert space is simpler. By orthogonal projection, V/W is naturally isomorphic to the orthogonal complement of W, for any closed subspace $W\subseteq V$, so a cylinder measure is just a collection of measures on the finite dimensional subspaces of V, consistent with orthogonal projection. Also, the inner product allows us to identify V with its dual. For any $x\in V$, I'll write $\langle X, x\rangle\equiv X(x)$.
On any Hilbert space, there is a unique canonical Gaussian (cylinder) measure, with respect to which $\langle X,x\rangle$ is normal with mean 0 and variance $\Vert x\Vert^2$. For an infinite dimensional space, if $e_1,e_2,\ldots$ is an orthonormal basis, then $\Vert X\Vert^2=\sum_n\langle X,e_n\rangle^2$ is almost surely infinite. Instead,
$$
n^{-1}\left(\langle X,e_1\rangle^2+\cdots+\langle X,e_n\rangle^2\right)\to 1
$$
with probability one, which holds for any orthonormal sequence $e_1,e_2,\ldots$ (not necessarily a basis).
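This law-of-large-numbers behaviour is easy to see numerically: under the canonical Gaussian measure, the coordinates $\langle X,e_n\rangle$ along any orthonormal sequence are i.i.d. standard normal, so the running average of their squares tends to $1$. A minimal sketch (the sample size is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(1)

# <X, e_n> are iid N(0,1) under the canonical Gaussian cylinder measure,
# so n^{-1}(<X,e_1>^2 + ... + <X,e_n>^2) -> 1 by the strong law.
coords = rng.normal(size=100_000)
running = np.cumsum(coords**2) / np.arange(1, coords.size + 1)
print(running[-1])  # close to 1
```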

For example, if W is a standard Brownian motion on the unit interval, then the derivative $\dot W=dW/dt$ does not exist in the usual way. However, it can be considered as having a cylindrical distribution on the Hilbert space $L^2([0,1],\lambda)$,
$$
\langle\dot W,x\rangle=\int_0^1 x(t)\dot W(t)\,dt = \int_0^1 x(t)\,dW(t)
$$
where the right-hand side is understood as a Wiener or Itô integral, and is normal with mean 0 and variance $\Vert x\Vert^2$. So, $\dot W$ has the canonical Gaussian distribution.
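As a numerical illustration (using a simple Riemann-sum discretization of the Wiener integral; the test function $x$ below is an arbitrary choice), one can check that $\int_0^1 x\,dW$ is centered with variance close to $\Vert x\Vert^2$:

```python
import numpy as np

rng = np.random.default_rng(2)
N, samples = 1000, 20_000
t = (np.arange(N) + 0.5) / N
x = np.sin(2 * np.pi * t)  # test function in L^2([0,1]), ||x||^2 = 1/2

# int_0^1 x dW, approximated by sum_i x(t_i) * dW_i over a grid of size N,
# with dW_i iid N(0, 1/N); repeated over many independent paths.
dW = rng.normal(scale=1 / np.sqrt(N), size=(samples, N))
vals = dW @ x
print(vals.mean(), vals.var())  # approximately 0 and ||x||^2 = 0.5
```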

In some ways, the canonical Gaussian measure on infinite dimensional spaces plays a similar role to the uniform measure on unit spheres in finite dimensional spaces.

Consider the following, fairly basic statements about measures on a finite dimensional Hilbert space V which are invariant under orthogonal transformations:
There is a unique uniform probability measure $\mu_S$ on the unit sphere $S=\lbrace x\in V\colon\Vert x\Vert=1 \rbrace$; that is, $\mu_S$ is the unique probability measure on $S$ invariant under orthogonal transformations. If X is any random variable taking values in V whose distribution is invariant under orthogonal transformations, then $\hat X\equiv X/\Vert X\Vert$ has distribution $\mu_S$ (conditioned on $X\not=0$). Letting $\mu_R$ be the distribution of $R=\Vert X\Vert$ on $[0,\infty)$, the distribution of X is of the form
$$
\mathbb{P}(X\in A)=\int_0^\infty\mu_S\left(r^{-1}A\right)\,d\mu_R(r).
$$
So, any distribution which is invariant under orthogonal transformations splits up into an integral over the uniform distributions on spheres.
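A small simulation illustrating this splitting (the radial law below, an exponential radius, is an arbitrary choice): any rotation-invariant $X$ on $\mathbb{R}^2$ normalizes to the uniform distribution on the circle.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Build a rotation-invariant distribution on R^2 as (radius) x (uniform
# direction): Gaussian vectors normalized to the circle give the
# direction, an exponential law gives the (arbitrary) radius.
g = rng.normal(size=(n, 2))
direction = g / np.linalg.norm(g, axis=1, keepdims=True)
radius = rng.exponential(size=(n, 1))
X = radius * direction

# X / ||X|| recovers the uniform distribution mu_S on the circle:
angles = np.arctan2(X[:, 1], X[:, 0])
hist, _ = np.histogram(angles, bins=8, range=(-np.pi, np.pi))
print(hist / n)  # roughly 1/8 in every angular bin
```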

The cylinder probability measures invariant under orthogonal transformations on an infinite dimensional Hilbert space V split up in a similar way, if we replace "uniform distribution on the unit sphere" by "canonical Gaussian distribution". Suppose that the coordinate process X on $(\Omega,\mathcal{F})$, defined as above, has such a distribution. So, $\langle AX, x\rangle\equiv\langle X,A^tx\rangle$ has the same distribution as $\langle X,x\rangle$, for each orthogonal transformation A. Then, there is a nonnegative random variable R such that
$$
n^{-1}\left(\langle X,e_1\rangle^2+\cdots+\langle X,e_n\rangle^2\right)\to R^2
$$
almost surely, as n tends to infinity. This holds for any orthonormal sequence $e_1,e_2,\ldots$ in V. Conditioned on $\lbrace R=0\rbrace$, $\langle X,x\rangle=0$ almost surely, for each x. Also, conditioned on the set $\lbrace R\not=0\rbrace$, X/R has the canonical Gaussian measure.

I'm not sure about a standard reference for this (I'll have a look), but it's not too difficult to prove. Given an orthonormal sequence $e_1,e_2,\ldots$, write $X_n\equiv\langle X,e_n\rangle$. The joint distribution of the random variables $X_1,X_2,\ldots$ is invariant under permuting their order. Letting $\mathcal{F}_\infty$ be the tail sigma-algebra, martingale convergence can be used to show that $R_n^2\equiv n^{-1}(X_1^2+X_2^2+\cdots+X_n^2)$ converges almost surely to $R^2=E[X_1^2\mid\mathcal{F}_\infty]$, which could, potentially, be infinite. Also, by applying an orthogonal transformation, one sees that R does not depend on the choice of orthonormal sequence. Conditioning on $\lbrace R=0\rbrace$, $E[X_1^2\mid R=0]=0$, so X=0. Conditioning on $\lbrace R_n\not=0\rbrace$, $(X_1,\ldots,X_n)/R_n$ has the uniform distribution on the sphere of radius $\sqrt{n}$ in $\mathbb{R}^n$. From what we know about the uniform distribution on spheres in high dimensions, $X_1/R=\lim_{n\to\infty}X_1/R_n$ has the standard normal distribution. So, $R<\infty$ almost surely and X/R has the canonical Gaussian distribution.

Let $\xi_n$ be a random $n$-dimensional vector with uniform distribution on the sphere {$x\in \mathbb{R}^n: |x|=1$}, $|\cdot|$ being the Euclidean norm. It is well known that the projection of $\sqrt{n}\,\xi_n$ onto any finite number $m$ of coordinates is asymptotically Gaussian with standard independent components. I suspect that one can modify the proof of this fact to conclude that the continuous process $X_n$ on $[0,1]$, defined by $X_n(k/n)=\sum_{j=1}^k\xi_{n,j}$ for times of the form $k/n$ and linearly interpolated in between, converges in distribution in $C[0,1]$ to the Wiener process (one needs some maximal inequalities, of course, to prove tightness). If it is true, it should be well known, too.
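The first part of this claim, at least, is cheap to verify by simulation: a single coordinate of $\sqrt{n}\,\xi_n$ is approximately standard normal for large $n$ (here $\xi_n$ is generated by normalizing a Gaussian vector; the dimension and sample size are arbitrary choices).

```python
import numpy as np

rng = np.random.default_rng(4)
n, samples = 500, 50_000

# xi_n uniform on the unit sphere of R^n, obtained by normalizing
# standard Gaussian vectors; look at sqrt(n) * (first coordinate).
g = rng.normal(size=(samples, n))
xi = g / np.linalg.norm(g, axis=1, keepdims=True)
proj = np.sqrt(n) * xi[:, 0]
print(proj.mean(), proj.var())  # approximately 0 and 1
```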

As for a direct statement, I am not that sure. The "natural" limit of the unit balls you described is the Strassen ball in the Cameron–Martin space, which consists of all absolutely continuous functions $x:[0,1]\to \mathbb{R}$ such that $x(0)=0$ and $\int_0^1 \dot x^2(s)\,ds\le 1$. Although this ball is tightly related to the Wiener measure, the ball itself has Wiener measure $0$. Many books on Gaussian measures or Malliavin calculus discuss these issues.

Thanks – I also had in mind something related to the Cameron–Martin space, but I must admit that I am still not entirely satisfied. Of course, there might not exist any proper way to formalize this heuristic. PS: and of course, the result that you mentioned is true.
–
Alekk, Mar 23 '10 at 22:46

@Alekk: Do you happen to know a reference where this Functional CLT is stated/proved?
–
Yuri Bakhtin, Mar 29 '10 at 16:57

Hi Alekk, it's not clear to me why the discretized Brownian motion is well approximated by the uniform measure on the sphere. Is there any rotational symmetry associated with the distribution of the increment vectors? But I really like the connection between a noncompact object and a compact object, in a way that does not invoke normalization.