Stochastic processes (with either discrete or continuous time dependence) on a discrete (finite or countably infinite) state space in which the distribution of the next state depends only on the current state. For Markov processes on continuous state spaces, please use (markov-process) instead.

A Markov chain is a stochastic process on a discrete (finite or countably infinite) state space in which the distribution of the next state depends only on the current state. These objects show up throughout probability and computer science, in both discrete-time and continuous-time models. For Markov processes on continuous state spaces, please use markov-process.

A discrete-time Markov chain is a sequence of random variables $\{X_n\}_{n\geq1}$ with the Markov property, namely that the probability of moving to the next state depends only on the present state and not on the previous states: $$\mathbb P(X_{n+1}=x\mid X_{1}=x_{1},X_{2}=x_{2},\ldots ,X_{n}=x_{n})=\mathbb P(X_{n+1}=x\mid X_{n}=x_{n}),$$ whenever both conditional probabilities are well defined, i.e. whenever $\mathbb P(X_{1}=x_{1},\ldots ,X_{n}=x_{n})>0.$
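The definition above can be illustrated with a minimal simulation. The three-state "weather" chain and its transition probabilities below are illustrative assumptions, not part of the definition; the key point is that `step` samples the next state from a distribution that depends only on the current state.

```python
import random

# Hypothetical 3-state chain (states and probabilities are illustrative):
# each row gives the distribution of the next state given the current one,
# so each row sums to 1.
STATES = ["sunny", "cloudy", "rainy"]
P = {
    "sunny":  {"sunny": 0.7, "cloudy": 0.2, "rainy": 0.1},
    "cloudy": {"sunny": 0.3, "cloudy": 0.4, "rainy": 0.3},
    "rainy":  {"sunny": 0.2, "cloudy": 0.4, "rainy": 0.4},
}

def step(state, rng):
    """Sample the next state using only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

def simulate(n_steps, start="sunny", seed=0):
    """Generate a sample path X_0, X_1, ..., X_{n_steps}."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate(10))
```

Because each call to `step` looks only at `chain[-1]`, the simulated path satisfies the conditional-independence property in the displayed equation by construction.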