Markov process

A Markov process, named after the Russian mathematician Andrey Markov, is a mathematical model for the random evolution of a memoryless system. The 'memoryless' property, known as the Markov property, is often expressed by saying that, conditional on the present state of the system, its future and past are independent.

Mathematically, the Markov property is expressed as follows: for any n and any times t_1 < t_2 < \cdots < t_n,

P\big[X(t_n) \le x_n \,\big|\, X(t)\ \forall\, t \le t_{n-1}\big] = P\big[X(t_n) \le x_n \,\big|\, X(t_{n-1})\big].

Often, the term Markov chain is used to mean a discrete-time Markov process. Also see continuous-time Markov process.
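
As a concrete illustration of a discrete-time Markov chain, the following Python sketch simulates a two-state chain from a one-step transition matrix; the states, probabilities, and chain length are hypothetical choices made only for this example, not part of the definition.

import numpy as np

# Hypothetical two-state chain (states 0 and 1); entry P[i, j] is the
# probability of moving from state i to state j in a single step.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

def simulate_chain(P, x0, n_steps, rng):
    # The next state is drawn using only the current state -- earlier
    # history is never consulted, which is exactly the Markov property.
    states = [x0]
    for _ in range(n_steps):
        states.append(rng.choice(len(P), p=P[states[-1]]))
    return states

rng = np.random.default_rng(0)
print(simulate_chain(P, x0=0, n_steps=20, rng=rng))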

Mathematically, if X(t), t >= 0, is a stochastic process, the Markov property states that

\mathrm{Pr}\big[X(t+h) = y \,\big|\, X(s) = x(s),\ \forall s \le t\big] = \mathrm{Pr}\big[X(t+h) = y \,\big|\, X(t) = x(t)\big], \quad \forall h > 0.
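
For a finite-state, discrete-time chain this conditional-independence statement can be checked numerically. The Python sketch below builds the joint law of three consecutive steps of a hypothetical two-state chain and confirms that conditioning on the whole past (X_0, X_1) gives the same next-step probabilities as conditioning on the present X_1 alone; the initial distribution and transition matrix are illustrative assumptions.

import numpy as np
from itertools import product

pi = np.array([0.5, 0.5])            # assumed distribution of X_0
P = np.array([[0.7, 0.3],
              [0.2, 0.8]])           # assumed one-step transition matrix

# Joint distribution of (X_0, X_1, X_2) implied by the chain.
joint = {(x0, x1, x2): pi[x0] * P[x0, x1] * P[x1, x2]
         for x0, x1, x2 in product(range(2), repeat=3)}

for x0, x1, y in product(range(2), repeat=3):
    # Pr[X_2 = y | X_1 = x1, X_0 = x0]
    p_past = joint[(x0, x1, y)] / sum(joint[(x0, x1, z)] for z in range(2))
    # Pr[X_2 = y | X_1 = x1]
    p_present = (sum(joint[(w, x1, y)] for w in range(2)) /
                 sum(joint[(w, x1, z)] for w in range(2) for z in range(2)))
    assert np.isclose(p_past, p_present)    # both equal P[x1, y]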

Markov processes are typically termed (time-) homogeneous if

\mathrm{Pr}\big[X(t+h) = y \,\big|\, X(t) = x\big] = \mathrm{Pr}\big[X(h) = y \,\big|\, X(0) = x\big], \quad \forall t, h > 0,
and otherwise are termed (time-) inhomogeneous (or (time-) nonhomogeneous). Homogeneous Markov processes, which are usually simpler than inhomogeneous ones, form the most important class of Markov processes.
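
For discrete-time chains the distinction is easy to see in code: a homogeneous chain applies the same transition matrix at every step, while an inhomogeneous chain lets the matrix depend on the step index. The Python sketch below contrasts the two; the matrices and the particular time dependence are invented purely for illustration.

import numpy as np

rng = np.random.default_rng(1)

# Homogeneous: one fixed (hypothetical) transition matrix for every step.
P_fixed = np.array([[0.9, 0.1],
                    [0.4, 0.6]])

# Inhomogeneous: the transition matrix is allowed to vary with the step t.
def P_at(t):
    p = 0.5 + 0.4 * np.sin(t / 10.0)     # illustrative time dependence
    return np.array([[p, 1.0 - p],
                     [1.0 - p, p]])

def step(x, P, rng):
    return rng.choice(len(P), p=P[x])

x_hom = x_inhom = 0
for t in range(20):
    x_hom = step(x_hom, P_fixed, rng)      # step law does not depend on t
    x_inhom = step(x_inhom, P_at(t), rng)  # step law changes with t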

In some cases, apparently non-Markovian processes may still have Markovian representations, constructed by expanding the concept of the 'current' and 'future' states. For example, let X be a non-Markovian process. Then define a process Y such that each state of Y represents a time interval of states of X, i.e., mathematically,

Y(t) = \big\{\, X(s) : s \in [a(t), b(t)] \,\big\}.
If Y has the Markov property, then it is a Markovian representation of X. In this case, X is also called a second-order Markov process. Higher-order Markov processes are defined analogously.

An example of a non-Markovian process with a Markovian representation is a moving average time series.
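
To make this concrete, the Python sketch below (an illustrative construction under assumed parameters, not the only possible representation) generates an MA(1) series X_t = e_t + theta * e_{t-1}: the value X_t alone does not determine the law of X_{t+1}, because the unobserved shock e_t is also needed, but the augmented state Y_t = (X_t, e_t) evolves as a Markov process.

import numpy as np

rng = np.random.default_rng(2)
theta = 0.7                              # assumed MA(1) coefficient

def ma1(n, theta, rng):
    # Non-Markovian observable: X_t = e_t + theta * e_{t-1}.
    e = rng.standard_normal(n + 1)
    x = e[1:] + theta * e[:-1]
    return x, e

def step_markov_rep(y, theta, rng):
    # Markovian representation: with Y_t = (X_t, e_t), the next state
    # Y_{t+1} = (e_{t+1} + theta * e_t, e_{t+1}) depends only on Y_t
    # and fresh noise -- a genuine Markov transition.
    _x_t, e_t = y
    e_next = rng.standard_normal()
    return (e_next + theta * e_t, e_next)

x, e = ma1(100, theta, rng)
y = (x[0], e[1])                         # X_0 is built from shock e[1]
for _ in range(10):
    y = step_markov_rep(y, theta, rng)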
