
Thursday, December 11, 2014

Markov Chain (Discrete Time)


1. Definition

Let \{X_n, n = 0, 1, \cdots\} be a stochastic process taking on a finite or countable number of values (the state space).

The process \{X_n\} is a discrete-time Markov chain (DTMC) if it has the Markov property: given the present, the future is independent of the past. Formally, for all states i_0, \cdots, i_{n-1}, i, j,
\Pr\{X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, \cdots, X_1 = i_1, X_0 = i_0\} = \Pr\{X_{n+1} = j | X_n = i\}

We define P_{ij} = \Pr\{X_{n+1} = j | X_n = i\}. Since the chain is assumed to have stationary (time-homogeneous) transition probabilities, P_{ij} does not depend on n.

The transition probabilities satisfy \sum_j P_{ij} = 1 for every state i, since from state i the chain must move to some state.
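To make this concrete, here is a minimal sketch in Python/NumPy. The 3-state matrix below is made up purely for illustration; any matrix with nonnegative entries and unit row sums works.

```python
import numpy as np

# Made-up 3-state transition matrix, for illustration only:
# P[i, j] = Pr{X_{n+1} = j | X_n = i}.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# Each row must be a probability distribution over next states.
assert np.allclose(P.sum(axis=1), 1.0)

# One step of the chain: from state i, sample the next state
# according to row i of P.
rng = np.random.default_rng(0)
def step(i):
    return rng.choice(len(P), p=P[i])

print(step(0))  # next state after starting in state 0
```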

2. n-Step Transition Probabilities

Define the n-step transition probability P^n_{ij} = \Pr\{X_{n+m} = j | X_m = i\}. These satisfy the Chapman-Kolmogorov equations:

P^{n+m}_{ij} = \sum_k P^n_{ik} P^m_{kj}

Proof: Condition on the state at time n and apply the Markov property:

P^{n+m}_{ij} = \Pr\{X_{n+m} = j | X_0 = i\} = \sum_k \Pr\{X_{n+m} = j | X_n = k, X_0 = i\} \Pr\{X_n = k | X_0 = i\} = \sum_k P^m_{kj} P^n_{ik}
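In matrix form, the Chapman-Kolmogorov equations say that the (n+m)-step matrix is the product of the n-step and m-step matrices, so the n-step transition matrix is just the n-th matrix power of P. A quick numerical check, reusing the made-up matrix from Section 1:

```python
import numpy as np

P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# The n-step transition matrix is the n-th matrix power of P.
P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)

# Chapman-Kolmogorov in matrix form: P^(2+3) = P^2 @ P^3.
assert np.allclose(np.linalg.matrix_power(P, 5), P2 @ P3)
```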


3. Example: Coin Flips
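One standard coin-flip chain (assumed here as a sketch; the parameters below are chosen arbitrarily) takes X_n to be the length of the current run of heads for a coin with heads probability p: from state k the chain moves to k+1 with probability p (heads) and back to 0 with probability 1-p (tails). A minimal simulation:

```python
import random

def heads_run_chain(p=0.5, n_steps=20, seed=1):
    """Simulate X_n = length of the current run of heads.

    Transition rule: from state k, heads (prob p) -> k + 1,
    tails (prob 1 - p) -> 0. The next state depends only on
    the current state, so the Markov property holds.
    """
    rng = random.Random(seed)
    state, path = 0, [0]
    for _ in range(n_steps):
        state = state + 1 if rng.random() < p else 0
        path.append(state)
    return path

print(heads_run_chain())
```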




4. Limiting Probabilities

Theorem: For an irreducible, ergodic Markov chain, \pi_j = \lim_{n \to \infty} P^n_{ij} exists and is independent of the starting state i. Furthermore, the \pi_j are the unique nonnegative solution of \pi_j = \sum_i \pi_i P_{ij} together with \sum_j \pi_j = 1.

Two interpretations of \pi_i:

  1. The probability of being in state i a long time into the future (large n)
  2. The long-run fraction of time the chain spends in state i
Note:
  • If the Markov chain is irreducible and ergodic, then interpretations 1 and 2 are equivalent.
  • Otherwise, \pi_i is still the solution to \pi = \pi P, but only interpretation 2 is valid.
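As a minimal numerical sketch of the theorem and both interpretations (reusing the made-up matrix from Section 1): solve the balance equations \pi = \pi P with \sum_j \pi_j = 1 as a linear system, then check that the rows of P^n approach \pi and that the empirical fraction of time in each state does too.

```python
import numpy as np

P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])
m = P.shape[0]

# Solve pi P = pi together with sum(pi) = 1 by least squares:
# stack (P^T - I) pi = 0 with a normalization row of ones.
A = np.vstack([P.T - np.eye(m), np.ones(m)])
b = np.zeros(m + 1); b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]

# Interpretation 1: every row of P^n converges to pi.
assert np.allclose(np.linalg.matrix_power(P, 50), np.tile(pi, (m, 1)))

# Interpretation 2: long-run fraction of time spent in each state.
rng = np.random.default_rng(0)
state, counts = 0, np.zeros(m)
for _ in range(100_000):
    state = rng.choice(m, p=P[state])
    counts[state] += 1
print(pi, counts / 100_000)  # the two should roughly agree
```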
