In the mathematical theory of [[probability]], the '''entropy rate''' or '''source information rate''' of a [[stochastic process]] is, informally, the time density of the average information in the process. For stochastic processes with a [[countable]] index, the entropy rate ''H''(''X'') is the limit of the [[joint entropy]] of ''n'' members of the process ''X''<sub>''k''</sub> divided by ''n'', as ''n'' [[Limit (mathematics)|tends to]] [[infinity]]:
:<math>H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n)</math>

when the limit exists. An alternative, related quantity is:

:<math>H'(X) = \lim_{n \to \infty} H(X_n | X_{n-1}, X_{n-2}, \dots, X_1)</math>

The entropy rate can be thought of as a general property of stochastic sources; this intuition is made precise by the [[asymptotic equipartition property]]. For [[strongly stationary]] stochastic processes, <math>H(X) = H'(X)</math>.
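This equality follows from the chain rule for [[joint entropy]],

:<math>\frac{1}{n} H(X_1, X_2, \dots, X_n) = \frac{1}{n} \sum_{k=1}^{n} H(X_k | X_{k-1}, \dots, X_1),</math>

since for a stationary process the conditional entropies <math>H(X_n | X_{n-1}, \dots, X_1)</math> are non-increasing in ''n'' and hence converge to <math>H'(X)</math>, and the [[Cesàro summation|Cesàro mean]] of a convergent sequence converges to the same limit.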
== Entropy rates for Markov chains ==

Since a stochastic process defined by a [[Markov chain]] that is [[irreducible]] and [[aperiodic]] has a unique [[stationary distribution]], the entropy rate is independent of the initial distribution.

For example, for such a Markov chain ''Y''<sub>''k''</sub> defined on a [[countable]] number of states, given the [[transition matrix]] ''P''<sub>''ij''</sub>, ''H''(''Y'') is given by:

:<math>\displaystyle H(Y) = - \sum_{ij} \mu_i P_{ij} \log P_{ij},</math>

where ''μ''<sub>''i''</sub> is the [[stationary distribution]] of the chain.

A simple consequence of this definition is that an [[independent and identically distributed|i.i.d.]] [[stochastic process]] has an entropy rate that is the same as the [[entropy]] of any individual member of the process.
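This formula is straightforward to evaluate numerically. The following is a minimal Python sketch (the two-state transition matrix is an arbitrary illustrative choice): it finds the stationary distribution ''μ'' as the left eigenvector of ''P'' with eigenvalue 1, then applies the formula above with base-2 logarithms, giving the rate in bits per step.

<syntaxhighlight lang="python">
import numpy as np

def entropy_rate(P):
    """Entropy rate -sum_{i,j} mu_i P_ij log2(P_ij) of an irreducible,
    aperiodic Markov chain with transition matrix P, in bits per step."""
    # Stationary distribution: the left eigenvector of P with eigenvalue 1,
    # i.e. an eigenvector of P^T, normalized so its entries sum to 1.
    eigenvalues, eigenvectors = np.linalg.eig(P.T)
    mu = np.real(eigenvectors[:, np.argmin(np.abs(eigenvalues - 1.0))])
    mu = mu / mu.sum()
    # Convention: 0 log 0 = 0, so zero-probability transitions contribute nothing.
    log_P = np.where(P > 0, np.log2(np.where(P > 0, P, 1.0)), 0.0)
    return -np.sum(mu[:, None] * P * log_P)

# Arbitrary two-state example: mu = (0.8, 0.2), entropy rate ~ 0.569 bits/step.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
print(entropy_rate(P))
</syntaxhighlight>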
==See also==
* [[Information source (mathematics)]]
* [[Markov information source]]
* [[Asymptotic equipartition property]]
==References==
* Cover, T. and Thomas, J. (1991) ''Elements of Information Theory'', John Wiley and Sons, Inc. ISBN 0-471-06259-6. [http://www3.interscience.wiley.com/cgi-bin/bookhome/110438582?CRETRY=1&SRETRY=0]
== External links ==
* [http://www.eng.ox.ac.uk/samp Systems Analysis, Modelling and Prediction (SAMP), University of Oxford]: [[MATLAB]] code for estimating information-theoretic quantities for stochastic processes.

[[Category:Information theory]]
[[Category:Entropy]]
[[Category:Markov models]]