Markov chain
  zanjire-ye Markov (#)

Fr.: chaîne de Markov

A → stochastic process, based on the classical → random walk concept, in which the probability of occurrence of each future state depends only on the current state of the system and not on any earlier states. Also called Markov process.

Named after Andrey Andreyevich Markov (1856-1922), the Russian mathematician who introduced this model in 1906; → chain.
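
The defining property above can be sketched in a short simulation. The two-state "weather" model, its transition probabilities, and all function names below are hypothetical illustrations, not part of the dictionary entry: each step draws the next state from a distribution that depends only on the current state.

```python
import random

# Hypothetical two-state model: each row gives P(next state | current state).
# The Markov property: this distribution depends only on the current state,
# never on the earlier history of the chain.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Draw the next state using only the current state's transition row."""
    r = rng.random()
    cumulative = 0.0
    for next_state, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return next_state
    return TRANSITIONS[state][-1][0]  # guard against floating-point rounding

def walk(start, n, seed=0):
    """Generate a chain of n transitions starting from `start`."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states
```

A call such as `walk("sunny", 10)` returns a list of 11 states; because each transition consults only the last state, the simulation is a random walk on the state space in the sense of the definition.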