So in a Markov chain, suppose the algorithm is at the $\displaystyle j $th best extreme point; after the next pivot, the resulting extreme point is equally likely to be any of the $\displaystyle j-1 $ best ones. We want to find the expected number of transitions needed to go from state $\displaystyle i $ to state $\displaystyle 1 $, i.e. $\displaystyle E[T_i] $.

Then why does $\displaystyle E[T_i] = 1 + \frac{1}{i-1} \sum_{j=1}^{i-1} E[T_j] $ hold? This seems to account only for the initial transition, right? They probably conditioned on some variable, but I am not seeing it.

The conditioning is on the state reached after the first transition. If $\displaystyle i > 1 $, then after one transition the chain is in state $\displaystyle j $, for $\displaystyle j = 1, \dots, i-1 $, each with probability $\displaystyle \frac{1}{i-1} $. That first transition contributes the $\displaystyle 1 $, and from state $\displaystyle j $ the expected number of remaining transitions is $\displaystyle E[T_j] $. Averaging over the possible first steps gives $\displaystyle E[T_i] = 1 + \frac{1}{i-1} \sum_{j=1}^{i-1} E[T_j] $, with $\displaystyle E[T_1] = 0 $.
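As a sanity check (not from the original text), the recursion can be unrolled to the closed form $\displaystyle E[T_i] = \sum_{j=1}^{i-1} \frac{1}{j} $, and both can be compared against a Monte Carlo simulation of the chain. The function names below are my own, purely illustrative:

```python
import random

def simulate_T(i, trials=100_000, seed=0):
    """Monte Carlo estimate of E[T_i]: from state j > 1, the chain
    jumps uniformly to one of the states 1, ..., j-1; count pivots
    until state 1 is reached."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        state, steps = i, 0
        while state > 1:
            state = rng.randint(1, state - 1)  # uniform over the j-1 better states
            steps += 1
        total += steps
    return total / trials

def recursive_T(i):
    """E[T_i] computed directly from the recursion
    E[T_i] = 1 + (1/(i-1)) * sum_{j=1}^{i-1} E[T_j], E[T_1] = 0."""
    E = [0.0] * (i + 1)  # E[k] holds E[T_k]
    for k in range(2, i + 1):
        E[k] = 1 + sum(E[1:k]) / (k - 1)
    return E[i]

def harmonic_T(i):
    """Closed form: E[T_i] = sum_{j=1}^{i-1} 1/j (the (i-1)th harmonic number)."""
    return sum(1.0 / j for j in range(1, i))
```

For example, `recursive_T(3)` and `harmonic_T(3)` both give $\displaystyle 1 + \tfrac{1}{2} = 1.5 $, and the simulated value should be close to that.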