Markov chain

noun

Definition of Markov chain

: a usually discrete stochastic process (such as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved
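
In the notation of probability theory, this defining property (the Markov property) is commonly written, for successive states X_0, X_1, ..., X_n of the process, as

P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)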

called also Markoff chain

First Known Use of Markov chain

1938, in the meaning defined above

History and Etymology for Markov chain

A. A. Markov †1922, Russian mathematician