Markov chain zanjire-ye Markov (#) Fr.: chaîne de Markov A → stochastic process, based on the classical → random walk concept, in which the probability of occurrence of each future state depends only on the present state of the system and not on any earlier states. Also called Markov process and Markovian principle. Named after Andrey Andreyevich Markov (1856-1922), a Russian mathematician, who introduced this model in 1906; → chain. |
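A minimal sketch of this "memoryless" behavior, using a hypothetical two-state model invented here purely for illustration: the next state is drawn from a transition table indexed only by the current state, never by the earlier history.

```python
import random

# Hypothetical two-state transition table (illustrative values only).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state using only the current state's transition row."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # fallback for floating-point round-off

def simulate(start, n_steps):
    """Generate one realization of the chain of length n_steps + 1."""
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 10))
```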
Markov Chain Monte Carlo (MCMC) raveš-e Monte Carlo bâ zanjire-ye Markov Fr.: Méthode de Monte-Carlo par chaînes de Markov A class of methods for sampling from → probability distributions by constructing → Markov chains whose equilibrium distribution is the target distribution. MCMC methods are widely used in data modeling for → Bayesian inference and numerical integration in physics, chemistry, biology, statistics, and computer science. |
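A compact sketch of one common MCMC algorithm, random-walk Metropolis, given here only as an illustration of the idea: each proposed move is accepted or rejected using the current state alone, so the resulting sequence is a Markov chain that samples the target distribution (a standard normal in this assumed example).

```python
import math
import random

def metropolis(log_prob, x0, n_samples, step_size=1.0):
    """Random-walk Metropolis sampler: propose x' near x, accept with
    probability min(1, p(x')/p(x)); the chain's stationary distribution
    is the target distribution defined by log_prob."""
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step_size)
        # Acceptance depends only on the current state (Markov property).
        if math.log(random.random()) < log_prob(proposal) - log_prob(x):
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: standard normal, known up to a constant.
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=5000)
print(sum(samples) / len(samples))  # sample mean should be near 0
```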