Discrete Markov Process
A Discrete Markov Process is a Markov process (a stochastic process in which the probability distribution of the next state depends only on the current state, not on the sequence of states that preceded it) that is also a discrete stochastic process, i.e., one whose state space is discrete.
- Context:
- It can be modeled using Markov Chains (see the transition-matrix sketch after this list).
- It can range from being a Discrete-Time Discrete-Markov Process to a Continuous-Time Discrete-Markov Process.
- …
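As referenced in the Context list above, the following is a minimal sketch of a discrete-time, discrete-state Markov chain represented as a transition matrix. The two-state space and the transition probabilities are hypothetical assumptions for illustration, not values defined in this article.

```python
# Minimal sketch: a discrete-time, discrete-state Markov chain
# represented as a row-stochastic transition matrix.
# The states and probabilities below are hypothetical.
import numpy as np

states = ["A", "B"]                  # assumed two-state space
T = np.array([[0.9, 0.1],            # P(next state | current state = A)
              [0.5, 0.5]])           # P(next state | current state = B)

p = np.array([1.0, 0.0])             # start in state A with certainty

# One-step update: the next-state distribution depends only on the
# current distribution (the Markov property), via p_next = p @ T.
p_next = p @ T
print(dict(zip(states, p_next)))     # {'A': 0.9, 'B': 0.1}
```

The same row-stochastic matrix representation extends directly to a Discrete-Time Discrete-Markov Process with more states; a continuous-time variant would instead be specified by a rate (generator) matrix.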
- Example(s):
- a Weather Forecasting Model, such as:
- Predicting the weather for the next day based on the current weather conditions (can be modeled using a discrete Markov chain; a minimal simulation sketch follows this list).
- a Stock Market Model, such as:
- Predicting the stock price of a specific company based on its current stock price (can be modeled using a discrete Markov chain).
- a Queue Length Model, such as:
- Predicting the number of customers that will arrive at a store in the next hour based on the current number of customers in the store (can be modeled using a discrete Markov chain).
- …
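As a concrete companion to the weather-forecasting example above, the following sketch simulates a short sample path from a hypothetical three-state weather chain. The state names and transition probabilities are illustrative assumptions, not values given in this article.

```python
# Sketch of the weather example: a three-state discrete Markov chain,
# simulated forward one day at a time.
# States and probabilities are hypothetical.
import numpy as np

states = ["sunny", "cloudy", "rainy"]
T = np.array([[0.7, 0.2, 0.1],       # transitions from "sunny"
              [0.3, 0.4, 0.3],       # transitions from "cloudy"
              [0.2, 0.4, 0.4]])      # transitions from "rainy"

rng = np.random.default_rng(seed=0)
current = 0                           # start on a "sunny" day
path = [states[current]]
for _ in range(7):                    # simulate one week ahead
    # Sample tomorrow's state using only today's state (Markov property).
    current = rng.choice(len(states), p=T[current])
    path.append(states[current])
print(" -> ".join(path))
```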
- Counter-Example(s):
- a Continuous Markov Process, where the state space is continuous rather than discrete.
- a Deterministic Process, where future states are entirely determined by current and/or previous states.
- See: Markov Process, Discrete-Time Discrete-Markov Process, Continuous-Time Discrete-Markov Process, Stochastic Process.