Markov process
noun
Definition of Markov process
: a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous
also
: Markov chain
— called also Markoff process
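The definition above names Brownian motion as the canonical example. A minimal Python sketch (function name and parameters are illustrative, not from the source) simulates a discretized Brownian path, showing the defining Markov property: each new state depends only on the current state, not on the history of the path.

```python
import random

def brownian_path(n_steps, dt=0.01, seed=0):
    """Simulate standard Brownian motion, a Markov process whose
    states are continuous (real numbers) rather than the discrete
    states of a Markov chain."""
    rng = random.Random(seed)
    x = 0.0
    path = [x]
    for _ in range(n_steps):
        # Markov property: the next state is drawn from a distribution
        # determined solely by the current value x (here, x plus a
        # Gaussian increment with variance dt), never by earlier states.
        x += rng.gauss(0.0, dt ** 0.5)
        path.append(x)
    return path

path = brownian_path(1000)
```

With a fixed seed the path is reproducible, which makes the sketch easy to test, but any run illustrates the same point: the update rule reads only the current position.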