Consider the following random walk on the lattice $\{0,\dots,n\}^2$. It starts at $(0,0)$ and then moves either up or right, with probabilities $p$ and $1-p$ respectively. Once it reaches the right border (respectively the top border), it goes straight up (respectively straight right) to $(n,n)$. What properties do we know about this random walk? In particular, can we say anything about the times at which it crosses the main diagonal?
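For what it's worth, the walk is easy to simulate; here is a minimal Python sketch (the function name is mine) that records the times at which the walk sits on the main diagonal:

```python
import random

def walk(n, p, seed=0):
    """Simulate one walk on {0,...,n}^2: up with prob. p, right with
    prob. 1-p, then forced along the border to (n, n).  Returns the
    step numbers at which the walk sits on the main diagonal x == y."""
    rng = random.Random(seed)
    x = y = 0
    t = 0
    diagonal_times = []
    while (x, y) != (n, n):
        if x == n:                 # right border: forced up
            y += 1
        elif y == n:               # top border: forced right
            x += 1
        elif rng.random() < p:     # free region: up step
            y += 1
        else:                      # free region: right step
            x += 1
        t += 1
        if x == y:
            diagonal_times.append(t)
    return diagonal_times
```

With $p=1$ the walk goes straight up and then right, so it touches the diagonal only at $(n,n)$, at time $2n$; in general every diagonal touch happens at an even time.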

Actually I have a few more questions :). 1) Is there a name for this kind of "increasing" random walk? I can find a lot on random walks that can also go backwards, but not really on the one I described. 2) If you take the same random walk on the lattice $\{0, 1/n, \dots, 1\}^2$ and let $n$ tend to infinity, what kind of process do you obtain?
–
Seb67 Sep 13 '10 at 10:48

Surely this is a random walk that can go 'backwards' with probability $0$. :)
–
alext87 Sep 13 '10 at 11:36

5 Answers

To give some details: consider a simple random walk $Y$ on $\mathbb{Z}$ that is constrained to go left when yours goes up, and to go right when yours goes right. When your walk is on the boundary, draw the direction of $Y$ randomly and independently. Since $Y$ is a simple random walk, all the results you could dream of are available. The only issue is that the two walks decorrelate as soon as yours hits the boundary, but this is no problem for your question about the diagonal. Indeed, if $Z$ is the walk that equals $Y$ for steps $\leq n$, and equals the projection of $Y$ onto $[-n+k,\,n-k]$ at step $n+k$, then up to a $\sqrt{2}$ factor $Z$ is the projection of your random walk onto the second diagonal, and it hits $0$ at time $t$ (meaning that your walk hits the diagonal at that time) if and only if $Y$ does.
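To make the projection concrete, here is a small sketch (not from the answer) tracking the difference $d = x - y$ along the walk: it moves by $\pm 1$ at every step, stays in $[-n, n]$, and vanishes exactly when the walk is on the main diagonal.

```python
import random

def diagonal_coordinate(n, p, seed=0):
    """Track d = x - y along one walk on {0,...,n}^2.  d moves by +/-1
    each step (right: +1, up: -1), stays within [-n, n], and d == 0
    exactly when the 2-D walk sits on the main diagonal."""
    rng = random.Random(seed)
    x = y = 0
    ds = [0]
    while (x, y) != (n, n):
        if x == n:                 # right border: forced up
            y += 1
        elif y == n:               # top border: forced right
            x += 1
        elif rng.random() < p:     # free region: up step
            y += 1
        else:                      # free region: right step
            x += 1
        ds.append(x - y)
    return ds
```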

@Benoit, projecting it onto a one-dimensional walk requires projecting it onto the $1\times(2n+1)$ lattice, and the wrap-around conditions mean that the farthest you can get from $0$ is $n$, so it's really more like a random walk on the cycle graph $C_{2n+1}$ than a 1-dimensional random walk on $\mathbb{Z}^1$.
–
sleepless in beantown Sep 13 '10 at 11:40

@sleepless in beantown: I just edited the answer to take the boundary condition into account; note that there is no wrap-around here: once on the boundary, the walk is stuck on it. If there were a wrap-around in 2D, taking it into account in the projection would be more difficult than you suggest.
–
Benoît Kloeckner Sep 13 '10 at 11:49

@Benoit, you are correct. I did not notice his condition specifying an absorbing state at the right and upper walls.
–
sleepless in beantown Sep 13 '10 at 12:20

The question as proposed is more like a random walk on a $C_{n+1}$, the cycle graph of size $n+1$, rather than a 1-dimensional random walk on $\mathbb{Z}^1$. This is because of the wrap-around conditions imposed by the way the problem is defined.

This is a biased random walk on a graph, with the graph being $C_{n+1}$. For simplicity, label all of the vertices of this cycle graph clockwise from $0,1,...,n-1,n$. Start the random walk at the node labeled $0$ and proceed in the negative direction (CCW, counterclockwise) with the probability $1-p$ and proceed in the positive direction (CW, clockwise) with the probability $p$.

If $p=1$ or $p=0$, then you have a deterministic process that first hits the boundary at $t=n$ (and, with the wrap-around, at regular intervals thereafter).

It can also be seen that if $p = 0.5$, you can expect to hit the wrap-around boundary at distance $n$ around time $t=n^2$ on average, or to come back to the center "boundary" with the standard expectation of an unbiased random walk returning to $0$ before it hits distance $-n$ or $+n$.

If $p\ne 0.5$, then there is a drift of $p-(1-p)=2p-1$: clockwise if $p>0.5$, counterclockwise if $p<0.5$. Now try to find the expected hitting time for the clockwise boundary, for the counterclockwise boundary, or for returning to $0$.
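Assuming absorption at distance $\pm n$, these hitting times can be computed exactly by first-step analysis: $E(x) = 1 + pE(x+1) + (1-p)E(x-1)$ with $E(\pm n)=0$. A pure-Python sketch (the function name and the linear-solve approach are mine):

```python
def hitting_time(n, p):
    """Expected number of steps for a +/-1 walk started at 0 (up w.p. p)
    to first reach -n or +n, by solving E(x) = 1 + p E(x+1) + (1-p) E(x-1)
    with E(-n) = E(n) = 0 via Gaussian elimination."""
    m = 2 * n - 1                       # unknowns E(x), x = -n+1, ..., n-1
    A = [[0.0] * m for _ in range(m)]
    b = [1.0] * m
    for r in range(m):
        A[r][r] = 1.0
        if r + 1 < m:
            A[r][r + 1] = -p            # step up to x+1
        if r - 1 >= 0:
            A[r][r - 1] = -(1 - p)      # step down to x-1
    # plain Gaussian elimination with partial pivoting
    for c in range(m):
        piv = max(range(c, m), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, m):
            f = A[r][c] / A[c][c]
            for k in range(c, m):
                A[r][k] -= f * A[c][k]
            b[r] -= f * b[c]
    E = [0.0] * m
    for r in range(m - 1, -1, -1):
        E[r] = (b[r] - sum(A[r][k] * E[k] for k in range(r + 1, m))) / A[r][r]
    return E[n - 1]                     # E(0) sits at index n-1
```

For $p=0.5$ this recovers the classical expected duration $n^2$ mentioned above; a drift ($p\ne 0.5$) shortens the time to the boundary.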

Define the transitions of this system by the stochastic matrix $T$ with $n+1$ rows and $n+1$ columns, indexed by the states $0,\dots,n$, where $T_{i,j}$ is the probability of stepping from state $i$ to state $j$:

$0$ if $i=j$ or if $1 < |i-j| < n$,

$p$ if $j \equiv i+1 \pmod{n+1}$ (a clockwise step),

$1-p$ if $j \equiv i-1 \pmod{n+1}$ (a counterclockwise step).

$T$ is tridiagonal apart from the two wrap-around corner entries; in other words, it is a circulant matrix.

You can find the steady-state distribution over long periods of time, and it is uniform: the walk is equally likely to be in any of the $n+1$ states, each with probability $1/(n+1)$.
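As a sanity check, here is a short sketch (helper names mine) building $T$ with the convention $T_{i,j} = \Pr(i \to j)$ and verifying that the uniform distribution is stationary:

```python
def transition_matrix(n, p):
    """Transition matrix of the biased walk on the cycle C_{n+1}:
    from state i, step to i+1 (mod n+1) w.p. p, to i-1 w.p. 1-p."""
    m = n + 1
    T = [[0.0] * m for _ in range(m)]
    for i in range(m):
        T[i][(i + 1) % m] = p
        T[i][(i - 1) % m] = 1 - p
    return T

def step(dist, T):
    """Advance a distribution (row vector) one step: dist @ T."""
    m = len(T)
    return [sum(dist[i] * T[i][j] for i in range(m)) for j in range(m)]

T = transition_matrix(5, 0.7)      # the cycle C_6
uniform = [1.0 / 6] * 6
after = step(uniform, T)           # stays uniform: stationarity
```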

Firstly, I take the main diagonal to mean the diagonal from $(0,n)$ to $(n,0)$. We know that it takes exactly $n$ steps until the particle is at a position $(i,n-i)$ for some $0\leq i\leq n$. This is because after $k\leq 2n$ steps the particle must be at a position $(j,k-j)$ for some $0\leq j\leq k$, since each step increases the first coordinate by one or the second coordinate by one, but not both.

Secondly, take the main diagonal to be from $(0,0)$ to $(n,n)$. Let $N_k\sim\text{Bin}(k,p)$ be the number of up steps after $k$ steps. For the particle to be at a point $(i,i)$ we must have $N_k=i$, and a total of $k=2i$ steps must have been taken. Thus the probability that after $k$ steps the particle is on the diagonal is $0$ if $k$ is odd and
\begin{equation}
\binom{k}{k/2}p^{\frac{k}{2}}(1-p)^{\frac{k}{2}}
\end{equation}
if $k$ is even.
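One can confirm this formula numerically against an exact dynamic-programming computation of the walk's distribution (a sketch, not from the answer). For even $k < 2n$ the two agree, since a walk sitting at $(k/2,k/2)$ cannot yet have been forced by the border:

```python
from math import comb

def p_on_diagonal(n, p, k):
    """Exact P(the walk is at some point (i,i) after k steps), by
    propagating the distribution over the lattice, border rules included."""
    dist = {(0, 0): 1.0}
    for _ in range(k):
        nxt = {}
        def add(pos, q):
            nxt[pos] = nxt.get(pos, 0.0) + q
        for (x, y), q in dist.items():
            if (x, y) == (n, n):
                add((x, y), q)                 # absorbed at the corner
            elif x == n:
                add((x, y + 1), q)             # right border: forced up
            elif y == n:
                add((x + 1, y), q)             # top border: forced right
            else:
                add((x, y + 1), q * p)         # up step
                add((x + 1, y), q * (1 - p))   # right step
        dist = nxt
    return sum(q for (x, y), q in dist.items() if x == y)
```

For even $k < 2n$, `p_on_diagonal(n, p, k)` equals `comb(k, k//2) * (p*(1-p))**(k//2)`; for odd $k$ it is $0$.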

I am very sorry that I did not define the "main diagonal", to me it is the one going from $(0,0)$ to $(n,n)$.
–
Seb67 Sep 13 '10 at 11:01

I suspect Seb67 may assume that the "main diagonal" goes from $(0,0)$ to $(n,n)$. That's how I would interpret it. I'm not sure what "cross" means either: maybe three consecutive points $P_1$, $P_2$, $P_3$ with $P_2$ on the diagonal and $P_1$ and $P_3$ on opposite sides, or maybe just with $P_2$ on the diagonal.
–
Robin Chapman Sep 13 '10 at 11:02

Yes, I am sorry for being sloppy in my definitions. I felt that the precise definition of "crossing" did not really matter, since in any case we won't have an exact formula but rather some bounds (which would probably be true for both definitions of crossing that you give, at least when $n$ is very large).
–
Seb67 Sep 13 '10 at 11:09

Yep, I was confused about crossing, so I took it as meeting the diagonal. Seb67, did you mean Robin's interpretation?
–
alext87 Sep 13 '10 at 11:10

We can give exact expressions for the probability of crossing at $(i,i)$.
–
alext87 Sep 13 '10 at 11:14

The probability that the walk meets the diagonal at $(k,k)$ where $0 < k < n$ is
$${2k \choose k}p^k(1-p)^k$$
so the expected number of such meetings is
$$\sum_{k=1}^{n-1}{2k \choose k}p^k(1-p)^k$$
which I don't think has a convenient closed form.

If one insists a "crossing" must pass through the diagonal, this expectation becomes
$$\sum_{k=1}^{n-1}{2k-1 \choose k}(p^{k+1}(1-p)^k+p^k(1-p)^{k+1})
=\sum_{k=1}^{n-1}{2k-1 \choose k}p^k(1-p)^k.$$
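Although there is no convenient closed form, both sums are trivial to evaluate numerically (function names mine). Note also that $\binom{2k}{k} = 2\binom{2k-1}{k}$, so the crossing expectation is exactly half the meeting expectation:

```python
from math import comb

def expected_meetings(n, p):
    """E[# of meetings with the diagonal at (k,k), 0 < k < n]."""
    return sum(comb(2 * k, k) * (p * (1 - p)) ** k for k in range(1, n))

def expected_crossings(n, p):
    """Same sum with comb(2k-1, k): only genuine crossings count."""
    return sum(comb(2 * k - 1, k) * (p * (1 - p)) ** k for k in range(1, n))
```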

Yes, of course the expected number of crossings is easy to compute. But can you say anything about the expected time you have to wait for the first crossing (where I define a crossing as simply touching the diagonal)?
–
Seb67 Sep 13 '10 at 11:53


That's more-or-less the same problem as considering the first expected return of an asymmetric one-dimensional random walk to the origin.
–
Robin Chapman Sep 13 '10 at 11:59

Yes, absolutely! I completely missed this analogy, but now I understand much better what's going on.
–
Seb67 Sep 13 '10 at 12:03

@Robin Chapman, you can effectively "touch the diagonal" by wrapping around, or by having the biased random walk reach a distance of $n$ from the origin. So consider it as a biased random walk whose end condition is either returning to the origin or reaching distance $n$.
–
sleepless in beantown Sep 13 '10 at 12:15

Now I understand a little more what you meant. A crossing at $(i,i)$ occurs either via $(i,i-1)\rightarrow (i,i)\rightarrow (i,i+1)$ (the two up steps have probability $p^2$) or via $(i-1,i)\rightarrow (i,i)\rightarrow (i+1,i)$ (the two right steps have probability $(1-p)^2$). We get to $(i,i-1)$ with probability $\binom{2i-1}{i}(1-p)^{i}p^{i-1}$ and to $(i-1,i)$ with probability $\binom{2i-1}{i}(1-p)^{i-1}p^i$. Thus the probability that the particle crosses at $(i,i)$ is
\begin{equation}
\binom{2i-1}{i}(1-p)^{i}p^{i+1}+\binom{2i-1}{i}(1-p)^{i+1}p^{i} = \binom{2i-1}{i}(1-p)^{i}p^{i}.
\end{equation}
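A quick numerical sanity check of this identity (a sketch, not part of the answer; here an up step has probability $p$, as elsewhere in the thread):

```python
from math import comb

def p_cross_at(i, p):
    """P(cross the diagonal at (i,i)): reach (i,i-1) then take two up
    steps, or reach (i-1,i) then take two right steps (up has prob. p)."""
    reach_below = comb(2 * i - 1, i) * (1 - p) ** i * p ** (i - 1)   # at (i, i-1)
    reach_left = comb(2 * i - 1, i) * (1 - p) ** (i - 1) * p ** i    # at (i-1, i)
    return reach_below * p ** 2 + reach_left * (1 - p) ** 2
```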