Problem 2 in Section 11.2

Show that Example 11.7 is an absorbing Markov chain.

An absorbing Markov chain is a Markov chain in which some states, once entered, are impossible to leave. However, having such a state is only one of two prerequisites. For the chain to be absorbing, every other (transient) state must also be able to reach some absorbing state, not necessarily in one step; it then follows that the process is eventually absorbed with probability 1.

Definition 11.1 A state \(s_i\) of a Markov chain is called absorbing if it is impossible to leave it (i.e., \(p_{ii} = 1\)). A Markov chain is absorbing if it has at least one absorbing state, and if from every state it is possible to go to an absorbing state (not necessarily in one step).
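
To make the two conditions of Definition 11.1 concrete, here is a minimal Python sketch that tests whether a transition matrix defines an absorbing chain. The function name `is_absorbing_chain` and the list-of-lists matrix representation are my own assumptions for illustration, not anything given in the text.

```python
from collections import deque

def is_absorbing_chain(P):
    """Test Definition 11.1: the chain is absorbing iff (1) it has at
    least one absorbing state (p_ii = 1) and (2) every state can reach
    some absorbing state in finitely many steps."""
    n = len(P)
    # Condition 1: collect the absorbing states.
    absorbing = {i for i in range(n) if P[i][i] == 1}
    if not absorbing:
        return False
    # Condition 2: from every state, breadth-first search along
    # positive-probability transitions must hit an absorbing state.
    for start in range(n):
        seen, queue = {start}, deque([start])
        reached = False
        while queue:
            i = queue.popleft()
            if i in absorbing:
                reached = True
                break
            for j in range(n):
                if P[i][j] > 0 and j not in seen:
                    seen.add(j)
                    queue.append(j)
        if not reached:
            return False
    return True
```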

By this definition, the chain of Example 11.7 is an absorbing Markov chain: state 1 is absorbing because \(p_{11} = 1\), and from every other state it is possible to reach state 1 (not necessarily in one step), so both conditions of Definition 11.1 are satisfied.
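
As a sanity check of the same reasoning, the sketch above can be run on a small matrix whose first state satisfies \(p_{11} = 1\). The matrix below is a hypothetical stand-in chosen only to mirror that structure; it is not the actual transition matrix of Example 11.7.

```python
# Hypothetical 4-state matrix with p_11 = 1 (states numbered from 1 in
# the text, indexed from 0 here); NOT the matrix of Example 11.7.
P = [
    [1,   0,   0,   0  ],  # state 1 is absorbing: p_11 = 1
    [1/2, 0,   1/2, 0  ],
    [0,   1/2, 0,   1/2],
    [0,   0,   1/2, 1/2],
]
print(is_absorbing_chain(P))  # True: every state can reach state 1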