Markov source

Markov source A Markov chain whose random variables are regarded as internal states, together with a mapping from these internal states to the symbols of some external alphabet. The mapping need not be a bijection: several states may emit the same symbol. A Markov source is ergodic if and only if its underlying Markov chain is ergodic. See also discrete source.
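The definition can be illustrated with a minimal sketch. The states, transition probabilities, and alphabet below are hypothetical, chosen only to show a chain together with a non-bijective state-to-symbol mapping:

```python
import random

# Hypothetical three-state Markov chain: each state maps to a list of
# (next_state, probability) pairs.
transitions = {
    "S0": [("S1", 0.6), ("S2", 0.4)],
    "S1": [("S0", 0.7), ("S2", 0.3)],
    "S2": [("S0", 1.0)],
}

# Mapping from internal states to an external alphabet {'a', 'b'}.
# Two states emit 'b', so the mapping is not a bijection.
emit = {"S0": "a", "S1": "b", "S2": "b"}

def generate(n, state="S0", seed=0):
    """Emit n symbols from the Markov source, starting in `state`."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        out.append(emit[state])  # emit the symbol for the current state
        next_states, probs = zip(*transitions[state])
        state = rng.choices(next_states, weights=probs)[0]
    return "".join(out)
```

Calling `generate(10)` produces a string of ten symbols drawn from the external alphabet; an observer sees only the symbol stream, not the internal states that produced it.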