For a reversible Markov chain $X_{t}$ on $\mathbb{R}^{n}$ with transition kernel $K$ and stationary distribution $\pi$, it is well known that the `spectral gap' of $K$ (roughly, one minus the norm of $K$ restricted to functions orthogonal to $\pi$) can be estimated roughly by the following Cheeger constant:

$\Phi = \inf_{0<\pi(S) <0.5} \Phi(S)$

where $\Phi(S) = \frac{\int_{S} K(x,S^{c})\,dx}{\pi(S)}$; see, e.g., "Applications of geometric bounds to the convergence rate of Markov chains on $\mathbb{R}^{n}$" by W.K. Yuen for this version of the Cheeger constant, and many other places in the Markov chain literature for closely related versions.
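For a finite-state analogue of this quantity, here is a minimal brute-force sketch (my own illustration, not from any of the references): it evaluates the discrete Cheeger constant $\min_{0<\pi(S)<1/2} \frac{\sum_{i \in S, j \in S^c} \pi_i P_{ij}}{\pi(S)}$ for a lazy walk on a 5-node path, a toy version of a `local' walk on the interval.

```python
# Hypothetical finite-state illustration of the Cheeger constant above:
# brute force over all subsets S with 0 < pi(S) < 1/2.
import itertools
import numpy as np

def cheeger_constant(P, pi):
    """Brute-force Cheeger constant of a finite chain (P, pi)."""
    n = len(pi)
    best = np.inf
    for r in range(1, n):
        for S in itertools.combinations(range(n), r):
            S = list(S)
            pS = pi[S].sum()
            if pS == 0 or pS >= 0.5:       # matches the strict condition pi(S) < 1/2
                continue
            Sc = [j for j in range(n) if j not in S]
            flow = sum(pi[i] * P[i, j] for i in S for j in Sc)  # ergodic flow Q(S, S^c)
            best = min(best, flow / pS)
    return best

# Lazy random walk on a 5-node path (a toy "local" walk on the interval).
P = np.array([[0.50, 0.50, 0.00, 0.00, 0.00],
              [0.25, 0.50, 0.25, 0.00, 0.00],
              [0.00, 0.25, 0.50, 0.25, 0.00],
              [0.00, 0.00, 0.25, 0.50, 0.25],
              [0.00, 0.00, 0.00, 0.50, 0.50]])
pi = np.array([1/8, 1/4, 1/4, 1/4, 1/8])   # stationary (proportional to degree)
print(cheeger_constant(P, pi))             # the bottleneck cut is {0, 1}
```

Exhaustive enumeration is of course only feasible for tiny chains; the point is just to make the definition concrete.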

There are many excellent papers in which people show that $\Phi$ is `very small', and thus that the spectral gap is also small; this can be done by finding a single bad set $S$. Can anyone give me literature pointers to nontrivial analyses which show that $\Phi$ is `fairly large'? So far, I have only seen examples, such as the `local' walk on the unit interval studied by Yuen, for which everything can be calculated very explicitly. I don't care too much about the details of, e.g., continuous vs. discrete time/state spaces.

In case this is helpful, the examples I'm most interested in look a little like the $n$-fold product of a local walk on the unit interval, except that the updated coordinate moves a little from a point that depends on the other coordinates, rather than from its old value (similar to the relationship of the Curie-Weiss model to simple random walk on the hypercube). More precisely, at time $t$ I update the walk on $[0,1]^{n}$ by choosing a coordinate $i \in [n]$ uniformly at random and setting $X_{t+1}[i] = \frac{1}{n-1} \sum_{j \neq i} X_{t}[j] + \epsilon_{t}$ if that quantity is in $[0,1]$, where $\epsilon_{t} \sim U[-\epsilon,\epsilon]$; if it isn't in $[0,1]$, I set $X_{t+1}[i] = X_{t}[i]$. In either case, for $j \neq i$, set $X_{t+1}[j] = X_{t}[j]$. In terms of parameters, I care a lot more about the dependence on $\epsilon \rightarrow 0$ than about the dependence on $n \rightarrow \infty$, but both are pretty interesting.

The book by Meyn and Tweedie, "Markov Chains and Stochastic Stability", discusses many criteria for the convergence $\|\mu K^n - \pi\| \rightarrow 0$. Since this difference can be regarded as a measure of the size of $K^n$ on functions orthogonal to $\pi$, I guess it could help you as well.
– Ilya, Mar 29 '13 at 9:19

1 Answer

If you focus on Markov chains that come from random walks on undirected regular graphs, then having $\Phi$ (called conductance) large is a fairly well-studied concept. Such graphs are called "expanders". Check out the excellent survey by Hoory, Linial, and Wigderson: www.cs.huji.ac.il/~nati/PAPERS/expander_survey.pdf.
Check out chapters 8 and 9 for some explicit constructions (the Margulis construction, and using the Zig-Zag product to build expanders).

You also might want to see Lovasz's survey on random walks: www.cs.elte.hu/~lovasz/erdos.pdf
Check out Section 5, and the subsections on conductance.

Here is a construction related to, but not the same as, your example: consider the $n$-dimensional lattice in which each dimension has $k$ nodes. This yields a graph, and your construction is somewhat similar to a random walk on this graph: you are at some node of the graph, and then move to a uniformly random neighbor in the next step (at the boundary, there is some probability of just staying put). This is in general not an expander.
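The failure of expansion is already visible in one dimension. A quick numerical sketch (my own, not from the surveys above): the spectral gap of the lazy walk on a $k$-node path shrinks as $k$ grows (like $\Theta(1/k^2)$), so the conductance cannot stay bounded below.

```python
# Sketch: spectral gap of the lazy random walk on a k-node path shrinks with k,
# illustrating that the 1-dimensional lattice is not an expander family.
import numpy as np

def path_walk(k):
    """Transition matrix of the lazy random walk on a path with k nodes."""
    P = np.zeros((k, k))
    for i in range(k):
        nbrs = [j for j in (i - 1, i + 1) if 0 <= j < k]
        P[i, i] = 0.5                      # laziness: stay put with prob 1/2
        for j in nbrs:
            P[i, j] = 0.5 / len(nbrs)
    return P

for k in (8, 16, 32):
    # gap = 1 minus the second-largest eigenvalue of P
    gap = 1 - sorted(np.linalg.eigvals(path_walk(k)).real)[-2]
    print(k, gap)
```

By Cheeger's inequality the gap upper-bounds (up to constants) the conductance, so a shrinking gap certifies a shrinking $\Phi$.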

For the extreme case of $k=2$ (the boolean hypercube), you can find a complete spectral analysis at theory.stanford.edu/~trevisan/cs359g/lecture06.pdf, which proves that the expansion is $\Theta(1/n)$.