Hello,
Not sure where to put this question; let me know if it belongs somewhere else. I'm learning about Markov chain matrices, and I'm confused about how to take a transition matrix and find the fixed probability vector. Here is the problem; let me know if you can help me out.

This is a transition matrix

0.375 0.625 0
0.375 0.375 0.25
0.375 0.5 0.125

Find the fixed probability vector. I'm not really sure how to start this; I'm really confused. Any ideas?

The result should be a 1x3 matrix.

Apr 9th 2009, 10:28 PM

Soroban

Hello, stephy7878!

Quote:

Given the transition matrix:

P =
[ 0.375  0.625  0     ]
[ 0.375  0.375  0.25  ]
[ 0.375  0.5    0.125 ]

Find the fixed probability vector.

We want a row vector: v = (x, y, z).

We have: (x, y, z) P = (x, y, z).

Multiply:

0.375x + 0.375y + 0.375z = x
0.625x + 0.375y + 0.5z = y
0.25y + 0.125z = z

The equations simplify to (move everything to one side and multiply by 8):

[1] -5x + 3y + 3z = 0
[2] 5x - 5y + 4z = 0
[3] 2y - 7z = 0

Add [1] and [2]: -2y + 7z = 0, which is equivalent to [3].

So we need another equation.
Here it is: x + y + z = 1. This is true for all fixed probability vectors.
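For what it's worth, solving two of the simplified equations together with x + y + z = 1 gives x = 3/8, y = 35/72, z = 5/36. A quick NumPy sketch (assuming NumPy is available) to check the fixed vector numerically:

```python
import numpy as np

# Transition matrix from the thread.
P = np.array([
    [0.375, 0.625, 0.0],
    [0.375, 0.375, 0.25],
    [0.375, 0.5,   0.125],
])

# The fixed vector v satisfies v P = v, i.e. (P^T - I) v^T = 0,
# plus the normalization x + y + z = 1.  Stack both conditions
# and solve by least squares (the system is consistent, so the
# least-squares fit is the exact solution).
A = np.vstack([P.T - np.eye(3), np.ones((1, 3))])
b = np.array([0.0, 0.0, 0.0, 1.0])
v, *_ = np.linalg.lstsq(A, b, rcond=None)

print(v)          # ≈ [0.375, 0.4861, 0.1389], i.e. (3/8, 35/72, 5/36)
print(v @ P - v)  # ≈ zeros: v really is fixed under P
```

This confirms the 1x3 answer (27/72, 35/72, 10/72).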