Figure 1. Finite state machine for a Markov chain $X_0 \to X_1 \to X_2 \to \cdots \to X_n$ where the random variables $X_i$ take values in $I = \{S_1, S_2, S_3\}$. The numbers $T(i,j)$ on the arrows are the transition probabilities, $T(i,j) = P(X_{t+1} = S_j \mid X_t = S_i)$.

Definition 1.2. We say that $(X_n)_{n \ge 0}$ is a Markov chain with initial distribution $\lambda$ and transition matrix $T$ if (i) $X_0$ has distribution $\lambda$; (ii) for ...
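As a concrete sketch of the definition, the following Python snippet simulates a chain on three states: draw $X_0$ from $\lambda$, then repeatedly draw $X_{t+1}$ from row $X_t$ of $T$. The particular entries of `T` and `lam` below are illustrative placeholders, not the actual probabilities from Figure 1.

```python
import random

# Hypothetical transition matrix over I = {S1, S2, S3};
# T[i][j] = P(X_{t+1} = S_j | X_t = S_i). Each row sums to 1.
T = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
]
lam = [1.0, 0.0, 0.0]  # initial distribution lambda: start in S1

def sample(dist, rng):
    """Draw an index from a discrete distribution."""
    u = rng.random()
    acc = 0.0
    for i, p in enumerate(dist):
        acc += p
        if u < acc:
            return i
    return len(dist) - 1  # guard against floating-point rounding

def simulate(T, lam, n, seed=0):
    """Return a sample path X_0, ..., X_n of the chain."""
    rng = random.Random(seed)
    x = sample(lam, rng)          # X_0 ~ lambda
    path = [x]
    for _ in range(n):
        x = sample(T[x], rng)     # X_{t+1} ~ row X_t of T
        path.append(x)
    return path

path = simulate(T, lam, 10)
print(["S%d" % (i + 1) for i in path])
```

Running `simulate(T, lam, n)` yields a path of length $n+1$; by the Markov property, each step depends only on the current state, which is why a single row of `T` suffices to draw the next state.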