variants:
or Markov \ˈmär-ˌkȯf, -ˌkȯv\ or less commonly Markoff \ˈmär-ˌkȯf\
Definition of Markovian
: of, relating to, or resembling a Markov process or Markov chain, especially by having probabilities defined in terms of transitions from the possible existing states to other states
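As a rough illustration of that defining property, the minimal Python sketch below (with invented state names and transition probabilities, purely for the example) samples each next state using only the current state's transition row, which is what makes the process Markovian:

```python
import random

# Hypothetical two-state model: the probability of the next state
# depends only on the current state, not on any earlier history.
# State names and probabilities here are invented for illustration.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current: str) -> str:
    """Sample the next state from the current state's transition probabilities."""
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return random.choices(states, weights=weights)[0]

if __name__ == "__main__":
    state = "sunny"
    for _ in range(10):
        state = next_state(state)  # each step forgets everything but the current state
        print(state)
```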