Consider a finite-state, irreducible continuous-time Markov chain with rate matrix $Q$ and stationary distribution $\pi$. Suppose the chain starts with the initial distribution $p$ at time $0$; then at time $t$ the distribution is given by $$p_t = p\, e^{Qt}$$
The relative entropy of $p_t$ with respect to $\pi$ (also known as the Kullback-Leibler divergence $D(p_t || \pi)$) is given by
$$D(p_t || \pi) = \sum_i p_t(i) \log \left(\frac{p_t(i)}{\pi(i)}\right)$$
where $i$ ranges over all the states.
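As a concrete sketch, the computation above can be carried out numerically: evolve $p_t = p\, e^{Qt}$ with a matrix exponential and evaluate $D(p_t || \pi)$ at a few times. The rate matrix below is an illustrative example, not from the text; the stationary distribution is recovered as the left null vector of $Q$.

```python
import numpy as np
from scipy.linalg import expm

# Illustrative rate matrix Q for a 3-state chain (each row sums to zero)
Q = np.array([[-1.0,  0.5,  0.5],
              [ 0.3, -0.6,  0.3],
              [ 0.2,  0.2, -0.4]])

# Stationary distribution pi: left eigenvector of Q for eigenvalue 0,
# i.e. pi Q = 0, normalized to sum to 1.
w, v = np.linalg.eig(Q.T)
pi = np.real(v[:, np.argmin(np.abs(w))])
pi = pi / pi.sum()

def kl(p, q):
    """Relative entropy D(p || q) = sum_i p_i log(p_i / q_i)."""
    mask = p > 0  # terms with p_i = 0 contribute 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p0 = np.array([1.0, 0.0, 0.0])   # start concentrated in state 0
for t in [0.0, 1.0, 5.0, 25.0]:
    pt = p0 @ expm(Q * t)        # p_t = p e^{Qt}, row vector times matrix exponential
    print(f"t = {t:5.1f}   D(p_t || pi) = {kl(pt, pi):.6f}")
```

Running this shows $D(p_t || \pi)$ decreasing in $t$ and approaching $0$ as $p_t \to \pi$, which is the monotonicity property one expects from the data-processing inequality applied to the semigroup $e^{Qt}$.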