Adjusted mutual information

From Wikipedia, the free encyclopedia

In probability theory and information theory, adjusted mutual information (AMI), a variation of mutual information, may be used for comparing clusterings.[1] It corrects for the effect of agreement due solely to chance between clusterings, similar to the way the adjusted Rand index corrects the Rand index. It is closely related to variation of information:[2] when a similar adjustment is made to the VI index, it becomes equivalent to the AMI.[1] The adjusted measure, however, is no longer metrical.[3]


Mutual information of two partitions

Given a set $S$ of $N$ elements, $S = \{s_1, s_2, \ldots, s_N\}$, consider two partitions of $S$, namely $U = \{U_1, U_2, \ldots, U_R\}$ with $R$ clusters, and $V = \{V_1, V_2, \ldots, V_C\}$ with $C$ clusters. It is presumed here that the partitions are so-called hard clusters; the partitions are pairwise disjoint:

    $U_i \cap U_j = V_i \cap V_j = \varnothing$

for all $i \neq j$, and complete:

    $\bigcup_{i=1}^{R} U_i = \bigcup_{j=1}^{C} V_j = S$

The mutual information of cluster overlap between $U$ and $V$ can be summarized in the form of an $R \times C$ contingency table $M = [n_{ij}]$, where $n_{ij}$ denotes the number of objects that are common to clusters $U_i$ and $V_j$. That is,

    $n_{ij} = |U_i \cap V_j|$
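The contingency table can be built directly from two flat label assignments. A minimal sketch in Python (the helper name and example labels are illustrative, not part of the reference definition):

```python
from collections import Counter

def contingency_table(U, V):
    """Count n_ij = |U_i intersect V_j| from two label sequences.

    U[k] and V[k] give the cluster labels of element s_k under the two
    partitions. Returns a dict mapping (i, j) -> n_ij; zero cells are omitted.
    """
    assert len(U) == len(V), "both partitions must label the same N elements"
    return Counter(zip(U, V))

# Two hard partitions of a set S of N = 6 elements:
U = [0, 0, 1, 1, 2, 2]   # R = 3 clusters
V = [0, 0, 0, 1, 1, 1]   # C = 2 clusters
table = contingency_table(U, V)
# table[(0, 0)] == 2: both elements of U_0 fall into V_0
```

Because the partitions are hard (disjoint and complete), the counts in the table sum to $N$.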

Suppose an object is picked at random from $S$; the probability that the object falls into cluster $U_i$ is:

    $P_U(i) = \frac{|U_i|}{N}$

The entropy associated with the partitioning $U$ is:

    $H(U) = -\sum_{i=1}^{R} P_U(i) \log P_U(i)$

$H(U)$ is non-negative and takes the value 0 only when there is no uncertainty determining an object's cluster membership, i.e., when there is only one cluster. Similarly, the entropy of the clustering $V$ can be calculated as:

    $H(V) = -\sum_{j=1}^{C} P_V(j) \log P_V(j)$

where $P_V(j) = |V_j| / N$. The mutual information (MI) between the two partitions is:

    $MI(U, V) = \sum_{i=1}^{R} \sum_{j=1}^{C} P_{UV}(i, j) \log \frac{P_{UV}(i, j)}{P_U(i) P_V(j)}$

where $P_{UV}(i, j)$ denotes the probability that a point belongs to both the cluster $U_i$ in $U$ and cluster $V_j$ in $V$:

    $P_{UV}(i, j) = \frac{|U_i \cap V_j|}{N} = \frac{n_{ij}}{N}$

MI is a non-negative quantity upper bounded by the entropies $H(U)$ and $H(V)$. It quantifies the information shared by the two clusterings and thus can be employed as a clustering similarity measure.
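The entropy and mutual-information formulas above translate directly into code. A stdlib-only sketch (function names and example labels are illustrative):

```python
import math
from collections import Counter

def entropy(labels):
    """H(U) = -sum_i P_U(i) * log(P_U(i)), with P_U(i) = |U_i| / N."""
    N = len(labels)
    return -sum((c / N) * math.log(c / N) for c in Counter(labels).values())

def mutual_information(U, V):
    """MI(U, V) = sum_ij P_UV(i, j) * log(P_UV(i, j) / (P_U(i) * P_V(j)))."""
    N = len(U)
    a, b = Counter(U), Counter(V)   # cluster sizes |U_i| and |V_j|
    n = Counter(zip(U, V))          # contingency counts n_ij
    # Only nonzero cells contribute; P_UV(i, j) = n_ij / N.
    return sum((nij / N) * math.log(N * nij / (a[i] * b[j]))
               for (i, j), nij in n.items())

U = [0, 0, 1, 1, 2, 2]
V = [0, 0, 0, 1, 1, 1]
mi = mutual_information(U, V)
# MI is bounded: 0 <= MI(U, V) <= min(H(U), H(V))
assert 0.0 <= mi <= min(entropy(U), entropy(V)) + 1e-12
```

For this example $H(U) = \log 3$, $H(V) = \log 2$, and $MI(U, V) = \tfrac{2}{3}\log 2$, which respects the bound.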

Adjustment for chance

Like the Rand index, the baseline value of mutual information between two random clusterings does not take on a constant value, and tends to be larger when the two partitions have a larger number of clusters (with a fixed number of set elements $N$). By adopting a hypergeometric model of randomness, it can be shown that the expected mutual information between two random clusterings is:

    $E\{MI(U, V)\} = \sum_{i=1}^{R} \sum_{j=1}^{C} \sum_{n_{ij} = (a_i + b_j - N)^+}^{\min(a_i, b_j)} \frac{n_{ij}}{N} \log\left(\frac{N \cdot n_{ij}}{a_i b_j}\right) \frac{a_i! \, b_j! \, (N - a_i)! \, (N - b_j)!}{N! \, n_{ij}! \, (a_i - n_{ij})! \, (b_j - n_{ij})! \, (N - a_i - b_j + n_{ij})!}$

where $(a_i + b_j - N)^+$ denotes $\max(1, a_i + b_j - N)$. The variables $a_i$ and $b_j$ are partial sums of the contingency table; that is,

    $a_i = \sum_{j=1}^{C} n_{ij}$

and

    $b_j = \sum_{i=1}^{R} n_{ij}$

The adjusted measure[1] for the mutual information may then be defined to be:

    $AMI(U, V) = \frac{MI(U, V) - E\{MI(U, V)\}}{\max\{H(U), H(V)\} - E\{MI(U, V)\}}$

The AMI takes a value of 1 when the two partitions are identical and 0 when the MI between two partitions equals the value expected due to chance alone.
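The expected-MI sum and the AMI definition above can be implemented directly, iterating the inner sum over the hypergeometric range of each cell. A stdlib-only sketch (it re-derives the entropy and MI helpers so it is self-contained; the max normalizer in the denominator follows the definition given above, while some libraries default to other normalizers such as the arithmetic mean):

```python
import math
from collections import Counter

def _entropy(labels):
    N = len(labels)
    return -sum((c / N) * math.log(c / N) for c in Counter(labels).values())

def _mi(U, V):
    N = len(U)
    a, b = Counter(U), Counter(V)
    return sum((n / N) * math.log(N * n / (a[i] * b[j]))
               for (i, j), n in Counter(zip(U, V)).items())

def expected_mi(U, V):
    """E{MI(U, V)} under the hypergeometric model of randomness."""
    N = len(U)
    a, b = Counter(U), Counter(V)               # partial sums a_i, b_j
    lf = [math.lgamma(k + 1) for k in range(N + 1)]  # lf[k] = log(k!)
    emi = 0.0
    for ai in a.values():
        for bj in b.values():
            # Inner sum runs from (a_i + b_j - N)^+ to min(a_i, b_j).
            for nij in range(max(1, ai + bj - N), min(ai, bj) + 1):
                # Log of the hypergeometric probability of the cell count n_ij.
                log_p = (lf[ai] + lf[bj] + lf[N - ai] + lf[N - bj]
                         - lf[N] - lf[nij] - lf[ai - nij]
                         - lf[bj - nij] - lf[N - ai - bj + nij])
                emi += (nij / N) * math.log(N * nij / (ai * bj)) * math.exp(log_p)
    return emi

def ami(U, V):
    """AMI = (MI - E{MI}) / (max(H(U), H(V)) - E{MI})."""
    e = expected_mi(U, V)
    return (_mi(U, V) - e) / (max(_entropy(U), _entropy(V)) - e)

# Identical partitions score exactly 1, regardless of label names:
U = [0, 0, 1, 1, 2, 2]
V = [0, 0, 0, 1, 1, 1]
```

Since $MI(U, U) = H(U)$, substituting into the definition gives $AMI(U, U) = 1$, which makes identical partitions a convenient sanity check.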

References

  1. ^ a b c Vinh, N. X.; Epps, J.; Bailey, J. (2009). "Information theoretic measures for clusterings comparison". Proceedings of the 26th Annual International Conference on Machine Learning - ICML '09. p. 1. doi:10.1145/1553374.1553511. ISBN 9781605585161.
  2. ^ Meila, M. (2007). "Comparing clusterings—an information based distance". Journal of Multivariate Analysis. 98 (5): 873–895. doi:10.1016/j.jmva.2006.11.013.
  3. ^ Vinh, Nguyen Xuan; Epps, Julien; Bailey, James (2010). "Information Theoretic Measures for Clusterings Comparison: Variants, Properties, Normalization and Correction for Chance" (PDF). The Journal of Machine Learning Research. 11: 2837–2854.


This page was last edited on 4 March 2024, at 19:20
Basis of this page is in Wikipedia. Text is available under the CC BY-SA 3.0 Unported License. Non-text media are available under their specified licenses. Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc. WIKI 2 is an independent company and has no affiliation with Wikimedia Foundation.