Tag Info

A stochastic process satisfying the Markov property: the distribution of future states, given the current state, does not depend on the past states. Use this tag for general state space processes (in both discrete and continuous time); use (markov-chains) for countable state space processes.

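As a minimal illustration of the Markov property on a general (uncountable) state space, the sketch below simulates a Gaussian random walk on the real line: each transition reads only the current state, never the earlier path. The function name and parameters are illustrative, not part of any library.

```python
import random

def simulate_markov_path(x0, n_steps, seed=None):
    """Simulate a Gaussian random walk, a Markov process on the real line.

    The next state is drawn using only the current state `x`, so the
    conditional law of the future given the present ignores the past.
    """
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n_steps):
        x = x + rng.gauss(0.0, 1.0)  # transition depends only on the current x
        path.append(x)
    return path

path = simulate_markov_path(0.0, 10, seed=42)
```

The same structure extends to continuous time (e.g. Brownian motion) by replacing the unit-time Gaussian step with increments over arbitrary time intervals.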