Absorbing Markov Chain
An Absorbing Markov Chain is a Markov Chain with at least one absorbing state, in which every state can reach an absorbing state in a finite number of steps.
- Example(s):
  - a Gambler's Ruin process, in which ruin (and, if present, the target wealth) is an absorbing state.
- Counter-Example(s):
  - an Ergodic Markov Chain, which has no absorbing state.
- See: Markov chain, Probability, State Probability Matrix, Absorbing State, Transient State.
References
2016
- (Wikipedia, 2016) ⇒ http://www.wikiwand.com/en/Absorbing_Markov_chain Retrieved 2016-07-24
- In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that, once entered, cannot be left.
Like general Markov chains, there can be continuous-time absorbing Markov chains with an infinite state space. However, this article concentrates on the discrete-time discrete-state-space case.
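As a small added illustration (this matrix is not part of the quoted source), consider a three-state chain with the transition matrix

```latex
P = \begin{pmatrix}
  \tfrac{1}{2} & \tfrac{1}{2} & 0 \\
  \tfrac{1}{2} & 0            & \tfrac{1}{2} \\
  0            & 0            & 1
\end{pmatrix}
```

State 3 is absorbing because the chain stays there with probability 1 once it arrives, and states 1 and 2 can both reach state 3 in finitely many steps (e.g., 1 → 2 → 3), so the chain is absorbing.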
Formal definition
A Markov chain is an absorbing chain if
- there is at least one absorbing state and
- it is possible to go from any state to at least one absorbing state in a finite number of steps.
- In an absorbing Markov chain, a state that is not absorbing is called transient.
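Both conditions can be checked mechanically for a finite chain. The following is a minimal sketch, assuming a row-stochastic transition matrix; the function name `is_absorbing_chain` and the tolerance parameter are illustrative choices, not part of the source.

```python
import numpy as np

def is_absorbing_chain(P, tol=1e-12):
    """Return True if the row-stochastic matrix P defines an absorbing
    Markov chain: at least one absorbing state exists, and every state
    can reach some absorbing state in a finite number of steps."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    # A state i is absorbing iff P[i, i] = 1 (it transitions only to itself).
    absorbing = {i for i in range(n) if abs(P[i, i] - 1.0) < tol}
    if not absorbing:
        return False
    # Walk backwards from the absorbing states: state i qualifies if some
    # positive-probability transition leads it (possibly via several steps)
    # into a state already known to reach an absorbing state.
    reaches = set(absorbing)
    frontier = list(absorbing)
    while frontier:
        j = frontier.pop()
        for i in range(n):
            if i not in reaches and P[i, j] > tol:
                reaches.add(i)
                frontier.append(i)
    # The chain is absorbing iff every state reaches an absorbing state.
    return len(reaches) == n

# Example: a drunkard's walk on states 0..4 with absorbing endpoints 0 and 4.
P = [[1.0, 0.0, 0.0, 0.0, 0.0],
     [0.5, 0.0, 0.5, 0.0, 0.0],
     [0.0, 0.5, 0.0, 0.5, 0.0],
     [0.0, 0.0, 0.5, 0.0, 0.5],
     [0.0, 0.0, 0.0, 0.0, 1.0]]
print(is_absorbing_chain(P))  # True: every interior state can drift to 0 or 4
```

The backward reachability search visits each state at most once, so the check runs in O(n²) time for an n-state chain.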