Additive Markov chain

From Wikipedia, the free encyclopedia

In probability theory, an additive Markov chain is a Markov chain with an additive conditional probability function. Here the process is a discrete-time Markov chain of order m and the transition probability to a state at the next time is a sum of functions, each depending on the next state and one of the m previous states.

Definition

An additive Markov chain of order m is a sequence of random variables X1, X2, X3, ..., possessing the following property: the probability that a random variable Xn has a certain value xn under the condition that the values of all previous variables are fixed depends on the values of the m previous variables only (Markov chain of order m), and the influence of those previous variables on the newly generated one is additive,

$$\Pr(X_n = x_n \mid X_{n-1} = x_{n-1},\, X_{n-2} = x_{n-2},\, \ldots,\, X_{n-m} = x_{n-m}) = \sum_{r=1}^{m} f(x_n, x_{n-r}, r).$$
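In code, additivity means the conditional probability is obtained by summing one contribution per previous state, rather than looking up a joint table over all m states. A minimal Python sketch (the function f passed in and the example values are illustrative, not from the source):

```python
def transition_prob(x_next, history, f):
    """Conditional probability of x_next given the last m states,
    as a sum of pairwise contributions f(x_next, x_{n-r}, r)."""
    m = len(history)
    # history[-r] is the state r steps back, i.e. x_{n-r}
    return sum(f(x_next, history[-r], r) for r in range(1, m + 1))
```

A valid f must make the probabilities over all next states sum to 1 for every history; the additive form only constrains how the history enters.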

Binary case

A binary additive Markov chain is one where the state space of the chain consists of two values only, Xn ∈ { x1, x2 }; for example, Xn ∈ { 0, 1 }. The conditional probability function of a binary additive Markov chain can be represented as

$$\Pr(X_n = 1 \mid X_{n-1} = x_{n-1},\, X_{n-2} = x_{n-2},\, \ldots) = \bar{X} + \sum_{r=1}^{m} F(r)\,(x_{n-r} - \bar{X}),$$
$$\Pr(X_n = 0 \mid X_{n-1} = x_{n-1},\, X_{n-2} = x_{n-2},\, \ldots) = 1 - \Pr(X_n = 1 \mid X_{n-1} = x_{n-1},\, X_{n-2} = x_{n-2},\, \ldots).$$

Here X̄ is the probability of finding Xn = 1 in the sequence, and F(r) is referred to as the memory function. The value of X̄ and the function F(r) contain all the information about the correlation properties of the Markov chain.
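The two formulas above give a direct recipe for generating such a chain step by step: compute the probability of a 1 from the last m states, then draw. A Python sketch, assuming an illustrative exponentially decaying memory function (the clipping of p1 is a practical guard against rounding, not part of the definition):

```python
import random

def simulate_binary_additive(n_steps, xbar, F, m, seed=0):
    """Generate a binary additive Markov chain of order m.
    xbar: target probability of finding 1 in the sequence.
    F: memory function, called as F(r) for r = 1..m."""
    rng = random.Random(seed)
    # seed the history with m i.i.d. draws of mean xbar
    chain = [1 if rng.random() < xbar else 0 for _ in range(m)]
    for _ in range(n_steps):
        p1 = xbar + sum(F(r) * (chain[-r] - xbar) for r in range(1, m + 1))
        p1 = min(max(p1, 0.0), 1.0)  # keep the probability in [0, 1]
        chain.append(1 if rng.random() < p1 else 0)
    return chain
```

Because the update is linear around X̄, the stationary mean of the generated sequence stays at X̄ whenever the memory function is small enough for the probabilities to remain in [0, 1].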

Relation between the memory function and the correlation function

In the binary case, the correlation function between the variables Xn and Xk of the chain depends on the distance n − k only. It is defined as follows:

$$K(r) = \langle (X_n - \bar{X})(X_{n+r} - \bar{X}) \rangle = \langle X_n X_{n+r} \rangle - \bar{X}^2,$$

where the symbol $\langle \cdots \rangle$ denotes averaging over all n. By definition,

$$K(r) = K(-r), \qquad K(0) = \bar{X}(1 - \bar{X}).$$

There is a relation between the memory function and the correlation function of the binary additive Markov chain:[1]

$$K(r) = \sum_{s=1}^{m} K(r - s)\, F(s), \qquad r = 1, 2, \ldots$$
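This relation can be checked numerically: for order m = 1 it reduces to K(r) = K(r − 1)F(1), so the memory value can be recovered from the empirical correlation function as F(1) = K(1)/K(0). A sketch under that assumption (parameter values are illustrative):

```python
import random

def estimate_K(chain, r):
    """Empirical correlation function K(r) = <X_n X_{n+r}> - Xbar^2."""
    n = len(chain) - r
    xbar = sum(chain) / len(chain)
    return sum(chain[i] * chain[i + r] for i in range(n)) / n - xbar ** 2

# simulate an order-1 binary additive chain with Xbar = 0.5, F(1) = 0.5
rng = random.Random(42)
xbar, F1 = 0.5, 0.5
chain = [0]
for _ in range(200_000):
    p1 = xbar + F1 * (chain[-1] - xbar)
    chain.append(1 if rng.random() < p1 else 0)

# recover the memory function from the correlation function
F1_est = estimate_K(chain, 1) / estimate_K(chain, 0)
```

For higher orders the same recurrence, evaluated at r = 1, ..., m, yields an m × m linear system that determines F(1), ..., F(m) from the measured K(r).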

Notes

  1. ^ S.S. Melnyk, O.V. Usatenko, and V.A. Yampol'skii (2006). "Memory functions of the additive Markov chains: applications to complex dynamic systems", Physica A, 361 (2), 405–415.

References

  • A.A. Markov. (1906) "Rasprostranenie zakona bol'shih chisel na velichiny, zavisyaschie drug ot druga" [Extension of the law of large numbers to quantities depending on each other]. Izvestiya Fiziko-matematicheskogo obschestva pri Kazanskom universitete, 2nd series, vol. 15, 135–156
  • A.A. Markov. (1971) "Extension of the limit theorems of probability theory to a sum of variables connected in a chain". reprinted in Appendix B of: R. Howard. Dynamic Probabilistic Systems, volume 1: Markov Chains. John Wiley and Sons
  • Ramakrishnan, S. (1981) "Finitely Additive Markov Chains", Transactions of the American Mathematical Society, 265 (1), 247–272 JSTOR 1998493