## Markov Chain

* **Markov Process** is a type of stochastic process with the **Markov Property**.
* **Markov Chain** is a type of Markov process with a discrete state space.

### Markov Property

The Markov property states that the future evolution of the stochastic process depends only on its current state, not on the past transitions of the process. Otherwise, it is not...
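The definitions above can be sketched in code. Here is a minimal example (not from the post; the weather states and transition probabilities are made up for illustration) of a two-state Markov chain where the next state is sampled from the current state alone, which is exactly the Markov property:

```python
import random

# Hypothetical two-state chain: transition probabilities out of each state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state given only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

def simulate(start, n, seed=0):
    """Run the chain for n steps and return the sequence of visited states."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states

print(simulate("sunny", 5))
```

Note that `step` never looks at earlier states: deleting the history would not change the distribution of the next step, which is the memorylessness the property describes.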
## Stochastic Process

### Definition

When we study the behavior of a random system, we are interested in how the system evolves in time. The evolution of the system is a random...
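As a concrete illustration of a discrete-time stochastic process (an assumed example, not one given in the post), here is a simple symmetric random walk: the state at time `t` is a random variable, and each run of the function produces one realization of how the system evolves in time.

```python
import random

def random_walk(n_steps, seed=0):
    """Return one sample path (X_0, X_1, ..., X_n) of a +/-1 random walk."""
    rng = random.Random(seed)
    path = [0]  # the walk starts at the origin
    for _ in range(n_steps):
        # each step moves up or down by 1 with equal probability
        path.append(path[-1] + rng.choice([-1, 1]))
    return path

print(random_walk(10))
```

Different seeds give different trajectories of the same process; the random walk also happens to be a Markov chain, since each step depends only on the current position.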
2018, Oct 30 — 8 minute(s) read