Markov process

noun

Definition of Markov process

: a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous; also: Markov chain

called also Markoff process
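
As a concrete illustration (an addition, not part of the entry itself): the sketch below simulates Brownian motion, the continuous-state example named in the definition, on a discrete time grid. Each increment depends only on the current state, which is the defining Markov property. The function name and parameter values are illustrative choices.

```python
import random

def simulate_brownian_motion(steps=1000, dt=0.01, sigma=1.0, x0=0.0):
    """Return a sample path of Brownian motion on a discrete time grid."""
    x = x0
    path = [x]
    for _ in range(steps):
        # The next state depends only on the current state x, not on any
        # earlier history: this is the Markov property.
        x += random.gauss(0.0, sigma * dt ** 0.5)
        path.append(x)
    return path

if __name__ == "__main__":
    path = simulate_brownian_motion()
    print(f"final state after {len(path) - 1} steps: {path[-1]:.4f}")
```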

First Known Use of Markov process

1938, in the meaning defined above
