Definition
The joint Shannon entropy (in bits) of two discrete random variables $X$ and $Y$ with images $\mathcal{X}$ and $\mathcal{Y}$ is defined as

$$\mathrm{H}(X,Y) = -\sum_{x\in\mathcal{X}}\sum_{y\in\mathcal{Y}} P(x,y)\log_2[P(x,y)]$$ (Eq.1)

where $x$ and $y$ are particular values of $X$ and $Y$, respectively, $P(x,y)$ is the joint probability of these values occurring together, and $P(x,y)\log_2[P(x,y)]$ is defined to be 0 if $P(x,y)=0$.
For more than two random variables $X_1,\ldots,X_n$ this expands to

$$\mathrm{H}(X_1,\ldots,X_n) = -\sum_{x_1}\cdots\sum_{x_n} P(x_1,\ldots,x_n)\log_2[P(x_1,\ldots,x_n)]$$ (Eq.2)

where $x_1,\ldots,x_n$ are particular values of $X_1,\ldots,X_n$, respectively, $P(x_1,\ldots,x_n)$ is the probability of these values occurring together, and $P(x_1,\ldots,x_n)\log_2[P(x_1,\ldots,x_n)]$ is defined to be 0 if $P(x_1,\ldots,x_n)=0$.
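As a concrete illustration (an addition, not part of the original article), Eq.1 and Eq.2 can be evaluated directly from a joint probability table. The following Python sketch assumes NumPy is available; the helper name joint_entropy and the example pmf are hypothetical.

    import numpy as np

    def joint_entropy(pmf):
        # Joint Shannon entropy (in bits) of a joint pmf given as an
        # n-dimensional array whose entries sum to 1. Zero-probability
        # cells are skipped, matching the convention 0 * log2(0) := 0.
        p = np.asarray(pmf, dtype=float).ravel()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Example: X and Y each uniform on {0, 1}, perfectly correlated.
    pmf = np.array([[0.5, 0.0],
                    [0.0, 0.5]])
    print(joint_entropy(pmf))  # 1.0 bit: observing X determines Y

Because the array is flattened before summing, the same function covers the n-variable case of Eq.2 unchanged.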
Properties
Nonnegativity
The joint entropy of a set of random variables is a nonnegative number:

$$\mathrm{H}(X,Y) \geq 0, \qquad \mathrm{H}(X_1,\ldots,X_n) \geq 0.$$
Greater than individual entropies
The joint entropy of a set of variables is greater than or equal to the maximum of all of the individual entropies of the variables in the set:

$$\mathrm{H}(X_1,\ldots,X_n) \geq \max_{1\le i\le n} \mathrm{H}(X_i).$$
Less than or equal to the sum of individual entropies
The joint entropy of a set of variables is less than or equal to the sum of the individual entropies of the variables in the set. This is an example of subadditivity:

$$\mathrm{H}(X_1,\ldots,X_n) \leq \mathrm{H}(X_1) + \ldots + \mathrm{H}(X_n).$$

This inequality is an equality if and only if $X_1,\ldots,X_n$ are statistically independent.[3]: 30
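A small numeric check (an added illustration, reusing the hypothetical pmf and joint_entropy from the sketch above) makes the three properties concrete by comparing a dependent joint pmf with the independent product of its marginals.

    # Marginals of the correlated pmf above: each is (0.5, 0.5).
    px = pmf.sum(axis=1)
    py = pmf.sum(axis=0)
    h_x = -np.sum(px[px > 0] * np.log2(px[px > 0]))
    h_y = -np.sum(py[py > 0] * np.log2(py[py > 0]))
    h_xy = joint_entropy(pmf)

    assert h_xy >= 0                  # nonnegativity
    assert h_xy >= max(h_x, h_y)      # at least the largest marginal entropy
    assert h_xy <= h_x + h_y + 1e-12  # subadditivity

    # Equality in subadditivity holds for the independent product pmf:
    print(joint_entropy(np.outer(px, py)))  # 2.0 bits = h_x + h_y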
Joint differential entropy
Definition
The above definition is for discrete random variables and is just as valid in the case of continuous random variables. The continuous version of discrete joint entropy is called joint differential (or continuous) entropy. Let $X$ and $Y$ be continuous random variables with a joint probability density function $f(x,y)$. The differential joint entropy $h(X,Y)$ is defined as[3]: 249
$$h(X,Y) = -\int_{\mathcal{X},\mathcal{Y}} f(x,y)\log f(x,y)\,dx\,dy$$ (Eq.3)
For more than two continuous random variables $X_1,\ldots,X_n$ with joint density $f(x_1,\ldots,x_n)$ the definition is generalized to

$$h(X_1,\ldots,X_n) = -\int f(x_1,\ldots,x_n)\log f(x_1,\ldots,x_n)\,dx_1\cdots dx_n$$ (Eq.4)
The integral is taken over the support of $f$. It is possible that the integral does not exist, in which case we say that the differential entropy is not defined.
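For a bivariate Gaussian the integral in Eq.3 has the closed form $h(X,Y) = \tfrac{1}{2}\log[(2\pi e)^2\det\Sigma]$ (in nats), which gives a convenient sanity check. The sketch below is an added illustration, assuming NumPy and SciPy are available; it compares that formula with a Monte Carlo estimate of $-\mathbb{E}[\log f(X,Y)]$.

    import numpy as np
    from scipy.stats import multivariate_normal

    cov = np.array([[1.0, 0.8],
                    [0.8, 1.0]])
    mvn = multivariate_normal(mean=[0.0, 0.0], cov=cov)

    # Closed form for a bivariate Gaussian (in nats):
    # h(X, Y) = 0.5 * log((2*pi*e)^2 * det(cov))
    h_exact = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(cov))

    # Monte Carlo estimate of -E[log f(X, Y)], sampling from f itself
    samples = mvn.rvs(size=200_000, random_state=0)
    h_mc = -np.mean(mvn.logpdf(samples))

    print(h_exact, h_mc)  # both approximately 2.33 nats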
Properties
As in the discrete case, the joint differential entropy of a set of random variables is less than or equal to the sum of the entropies of the individual random variables:

$$h(X_1,\ldots,X_n) \leq \sum_{i=1}^n h(X_i).$$
The following chain rule holds for two random variables $X$ and $Y$:

$$h(X,Y) = h(X\mid Y) + h(Y).$$
In the case of more than two random variables this generalizes to:[3]: 253

$$h(X_1,\ldots,X_n) = \sum_{i=1}^n h(X_i \mid X_1,\ldots,X_{i-1}).$$
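As a worked instance of the chain rule (added for illustration), take $X$ and $Y$ jointly Gaussian with unit variances and correlation $\rho$. Then $h(Y) = \tfrac{1}{2}\log(2\pi e)$ and, given $Y=y$, $X$ is Gaussian with variance $1-\rho^2$, so

$$h(X\mid Y) + h(Y) = \tfrac{1}{2}\log\left[2\pi e\,(1-\rho^2)\right] + \tfrac{1}{2}\log(2\pi e) = \tfrac{1}{2}\log\left[(2\pi e)^2(1-\rho^2)\right] = h(X,Y),$$

which matches the closed form used in the numerical sketch above, since $\det\Sigma = 1-\rho^2$ for this covariance matrix.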
Joint differential entropy is also used in the definition of the mutual information between continuous random variables:

$$\operatorname{I}(X,Y) = h(X) + h(Y) - h(X,Y).$$
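Continuing the same Gaussian example (an added illustration), this identity yields the familiar closed form

$$\operatorname{I}(X,Y) = \tfrac{1}{2}\log(2\pi e) + \tfrac{1}{2}\log(2\pi e) - \tfrac{1}{2}\log\left[(2\pi e)^2(1-\rho^2)\right] = -\tfrac{1}{2}\log(1-\rho^2),$$

which is nonnegative and vanishes exactly when $\rho = 0$, i.e. when $X$ and $Y$ are independent.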
References
^ D.J.C. MacKay (2003). Information Theory, Inference and Learning Algorithms. Bibcode:2003itil.book.....M.: 141
^ Korn, Theresa M.; Korn, Granino Arthur (January 2000). Mathematical Handbook for Scientists and Engineers: Definitions, Theorems, and Formulas for Reference and Review. New York: Dover Publications. ISBN 0-486-41147-8.
^ Thomas M. Cover; Joy A. Thomas (18 July 2006). Elements of Information Theory. Hoboken, New Jersey: Wiley. ISBN 0-471-24195-4.