Markov chain

Markov chain In statistics, a sequence of random observations in which the probability that any member of the sequence takes a given value, conditional on all the preceding members, is equal to the probability of that value conditional only on the immediately preceding member. In symbols, P(X(n+1) = x | X(1), …, X(n)) = P(X(n+1) = x | X(n)); this "memoryless" condition is known as the Markov property.
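The definition above can be sketched in code: each new observation is drawn using only the current state. The two-state "weather" chain and its transition probabilities below are illustrative assumptions, not part of the original entry.

```python
import random

# Illustrative transition probabilities (hypothetical, for the sketch only):
# each row gives P(next state | current state) and sums to 1.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Draw the next state; it depends only on the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Generate a chain of n transitions from the given start state."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

Note that `step` never looks at earlier history, only at the current state, which is exactly the conditional-probability condition in the definition.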