
From Wikipedia, the free encyclopedia

In the mathematical theory of probability, the entropy rate or source information rate is a function assigning an entropy to a stochastic process.

For a strongly stationary process, the conditional entropy of the latest random variable eventually tends towards this rate value.


Definition

A process $X$ with a countable index gives rise to the sequence of its joint entropies $H_n(X_1, X_2, \dots, X_n)$. If the limit exists, the entropy rate is defined as

$$H(X) := \lim_{n \to \infty} \tfrac{1}{n} H_n.$$

Note that given any sequence $(a_n)_{n \ge 1}$ with $a_0 = 0$ and letting $\Delta a_n := a_n - a_{n-1}$, by telescoping one has $a_n = \sum_{k=1}^{n} \Delta a_k$. The entropy rate thus computes the mean of the first $n$ such entropy changes, with $n$ going to infinity. The behaviour of the joint entropies from one index to the next is also the explicit subject of some characterizations of entropy.
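As a sketch of the definition, the joint block entropy divided by the block length can be computed directly for a small source. Here an i.i.d. fair coin is used, for which every average $H_n/n$ already equals the rate of 1 bit per symbol; the function names are illustrative, not from the source:

```python
import itertools
import math

def joint_entropy(block_probs):
    """Shannon entropy (in bits) of a joint distribution given as {block: probability}."""
    return -sum(p * math.log2(p) for p in block_probs.values() if p > 0)

def coin_blocks(n):
    """Joint distribution of n i.i.d. fair coin flips: each length-n block has probability 2^-n."""
    return {b: 2.0 ** -n for b in itertools.product("HT", repeat=n)}

for n in (1, 2, 4, 8):
    H_n = joint_entropy(coin_blocks(n))
    print(n, H_n / n)  # each average H_n / n is 1.0 bit per symbol
```

For sources with memory the averages $H_n/n$ generally exceed the rate for finite $n$ and only approach it in the limit.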

Discussion

While $X$ may be understood as a sequence of random variables, the entropy rate $H(X)$ represents the average entropy change per random variable, in the long term.

It can be thought of as a general property of stochastic sources - this is the subject of the asymptotic equipartition property.

For strongly stationary processes

A stochastic process also gives rise to a sequence of conditional entropies, comprising more and more random variables. For strongly stationary stochastic processes, the entropy rate equals the limit of that sequence:

$$H(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1).$$

The quantity given by the limit on the right is also denoted $H'(X)$, which is motivated by the fact that it is then again a rate associated with the process, in the above sense.

For Markov chains

Since a stochastic process defined by a Markov chain that is irreducible, aperiodic and positive recurrent has a stationary distribution, the entropy rate is independent of the initial distribution.

For example, consider a Markov chain $Y_k$ defined on a countable number of states. Given its right stochastic transition matrix $P_{ij} = \mathbb{P}(Y_{k+1} = j \mid Y_k = i)$ and an entropy

$$h_i := -\sum_{j} P_{ij} \log P_{ij}$$

associated with each state, one finds

$$H(Y) = \sum_{i} \mu_i h_i,$$

where $\mu_i$ is the asymptotic distribution of the chain.

In particular, it follows that the entropy rate of an i.i.d. stochastic process is the same as the entropy of any individual member in the process.
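A minimal sketch of this computation, assuming NumPy is available: the stationary distribution is obtained as the left eigenvector of the transition matrix for eigenvalue 1, and the rate is the $\mu$-weighted average of the per-state row entropies. The matrix values are illustrative:

```python
import numpy as np

def markov_entropy_rate(P):
    """Entropy rate (bits per step) of an ergodic Markov chain with
    right-stochastic transition matrix P: H = sum_i mu_i * h_i, where
    h_i = -sum_j P_ij log2 P_ij and mu is the stationary distribution."""
    # mu is the left eigenvector of P for eigenvalue 1, normalized to sum to 1.
    vals, vecs = np.linalg.eig(P.T)
    mu = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    mu = mu / mu.sum()
    safe_P = np.where(P > 0, P, 1.0)        # avoid log(0); those terms vanish anyway
    h = -(P * np.log2(safe_P)).sum(axis=1)  # per-state transition entropies h_i
    return float(mu @ h)

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(markov_entropy_rate(P))  # mu-weighted average of the two row entropies
```

With identical rows the chain is i.i.d. and the rate reduces to the entropy of a single row, in line with the remark above: `markov_entropy_rate(np.array([[0.5, 0.5], [0.5, 0.5]]))` gives 1.0.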

For hidden Markov models

The entropy rate of hidden Markov models (HMM) has no known closed-form solution. However, it has known upper and lower bounds. Let the underlying Markov chain $X_{1:n}$ be stationary, and let $Y_{1:n}$ be the observable states; then we have

$$H(Y_n \mid X_1, Y_{1:n-1}) \le H(\mathcal{Y}) \le H(Y_n \mid Y_{1:n-1}),$$

and in the limit of $n \to \infty$, both sides converge to the middle.[1]
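The upper bound can be sketched numerically. Assuming a small hypothetical two-state HMM (the transition and emission matrices below are illustrative, not from the source), the conditional entropies $H(Y_n \mid Y_{1:n-1}) = H(Y_{1:n}) - H(Y_{1:n-1})$ can be computed by brute-force enumeration with the forward algorithm, and they decrease with $n$ toward the entropy rate:

```python
import itertools
import math

# Hypothetical two-state HMM (illustrative numbers).
T = [[0.9, 0.1], [0.2, 0.8]]    # T[i][j] = P(X_{t+1} = j | X_t = i)
E = [[0.95, 0.05], [0.1, 0.9]]  # E[i][y] = P(Y_t = y | X_t = i)
pi = [2 / 3, 1 / 3]             # stationary distribution of T (pi T = pi)

def seq_prob(ys):
    """P(Y_1..Y_n = ys) via the forward algorithm, starting from pi."""
    alpha = [pi[i] * E[i][ys[0]] for i in range(2)]
    for y in ys[1:]:
        alpha = [sum(alpha[i] * T[i][j] for i in range(2)) * E[j][y]
                 for j in range(2)]
    return sum(alpha)

def block_entropy(n):
    """H(Y_1, ..., Y_n) in bits, enumerating all 2^n observation sequences."""
    return -sum(p * math.log2(p)
                for ys in itertools.product((0, 1), repeat=n)
                if (p := seq_prob(ys)) > 0)

# The upper bound H(Y_n | Y_{1:n-1}) = H(Y_{1:n}) - H(Y_{1:n-1})
# is nonincreasing in n and approaches the entropy rate from above.
for n in (2, 4, 6):
    print(n, block_entropy(n) - block_entropy(n - 1))
```

The lower bound can be estimated the same way by additionally conditioning on the initial hidden state in the forward recursion.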

Applications

The entropy rate may be used to estimate the complexity of stochastic processes. It is used in diverse applications, ranging from characterizing the complexity of languages and blind source separation to optimizing quantizers and data compression algorithms. For example, a maximum entropy rate criterion may be used for feature selection in machine learning.[2]

References

  1. ^ Cover, Thomas M.; Thomas, Joy A. (2006). "4.5. Functions of Markov chains". Elements of information theory (2nd ed.). Hoboken, N.J: Wiley-Interscience. ISBN 978-0-471-24195-9.
  2. ^ Einicke, G. A. (2018). "Maximum-Entropy Rate Selection of Features for Classifying Changes in Knee and Ankle Dynamics During Running". IEEE Journal of Biomedical and Health Informatics. 28 (4): 1097–1103. doi:10.1109/JBHI.2017.2711487. PMID 29969403. S2CID 49555941.
  • Cover, T. and Thomas, J. (1991). Elements of Information Theory. John Wiley and Sons, Inc. ISBN 0-471-06259-6.
This page was last edited on 9 March 2024, at 15:22
Basis of this page is in Wikipedia. Text is available under the CC BY-SA 3.0 Unported License. Non-text media are available under their specified licenses.