# Divergence (statistics)

## From Wikipedia, the free encyclopedia

In statistics and information geometry, a divergence or contrast function is a function that establishes the "distance" of one probability distribution from another on a statistical manifold. Divergence is a weaker notion than distance: in particular, a divergence need not be symmetric (in general, the divergence from p to q does not equal the divergence from q to p), and it need not satisfy the triangle inequality.

## Definition

Suppose S is the space of all probability distributions with common support. Then a divergence on S is a function D(· || ·): S × S → R satisfying [1]

1. D(p || q) ≥ 0 for all p, q ∈ S,
2. D(p || q) = 0 if and only if p = q.

The dual divergence D* is defined as

${\displaystyle D^{*}(p\parallel q)=D(q\parallel p).}$
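The two defining properties and the dual can be illustrated with a short numeric sketch, using the Kullback–Leibler divergence (introduced in the Examples section) on discrete distributions. The helper names `kl` and `kl_dual` are illustrative, not a standard API:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence between two discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

def kl_dual(p, q):
    """Dual divergence: D*(p || q) = D(q || p)."""
    return kl(q, p)

p = [0.2, 0.3, 0.5]
q = [0.4, 0.4, 0.2]

assert kl(p, q) >= 0 and kl(q, p) >= 0   # property 1: non-negativity
assert abs(kl(p, p)) < 1e-12             # property 2: D(p || p) = 0
assert kl(p, q) != kl(q, p)              # asymmetry: D need not be symmetric
assert kl_dual(p, q) == kl(q, p)         # definition of the dual divergence
```

The asymmetry check shows concretely why a divergence is weaker than a metric distance.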

## Geometrical properties

Many properties of divergences can be derived if we restrict S to be a statistical manifold, meaning that it can be parametrized with a finite-dimensional coordinate system θ, so that for a distribution pS we can write p = p(θ).

For a pair of points p, qS with coordinates θp and θq, denote the partial derivatives of D(p || q) as

${\displaystyle {\begin{aligned}D((\partial _{i})_{p}\parallel q)\ \ &{\stackrel {\mathrm {def} }{=}}\ \ {\tfrac {\partial }{\partial \theta _{p}^{i}}}D(p\parallel q),\\D((\partial _{i}\partial _{j})_{p}\parallel (\partial _{k})_{q})\ \ &{\stackrel {\mathrm {def} }{=}}\ \ {\tfrac {\partial }{\partial \theta _{p}^{i}}}{\tfrac {\partial }{\partial \theta _{p}^{j}}}{\tfrac {\partial }{\partial \theta _{q}^{k}}}D(p\parallel q),\ \ \mathrm {etc.} \end{aligned}}}$

Now we restrict these functions to a diagonal p = q, and denote [2]

${\displaystyle {\begin{aligned}D[\partial _{i}\parallel \cdot ]\ &:\ p\mapsto D((\partial _{i})_{p}\parallel p),\\D[\partial _{i}\parallel \partial _{j}]\ &:\ p\mapsto D((\partial _{i})_{p}\parallel (\partial _{j})_{p}),\ \ \mathrm {etc.} \end{aligned}}}$

By definition, the function D(p || q) is minimized at p = q, and therefore

${\displaystyle {\begin{aligned}&D[\partial _{i}\parallel \cdot ]=D[\cdot \parallel \partial _{i}]=0,\\&D[\partial _{i}\partial _{j}\parallel \cdot ]=D[\cdot \parallel \partial _{i}\partial _{j}]=-D[\partial _{i}\parallel \partial _{j}]\ \equiv \ g_{ij}^{(D)},\end{aligned}}}$

where the matrix g(D) is positive semi-definite and defines a unique Riemannian metric on the manifold S.

A divergence D(· || ·) also defines a unique torsion-free affine connection ∇(D) with coefficients

${\displaystyle \Gamma _{ij,k}^{(D)}=-D[\partial _{i}\partial _{j}\parallel \partial _{k}],}$

and the dual to this connection ∇* is generated by the dual divergence D*.

Thus, a divergence D(· || ·) generates on a statistical manifold a unique dualistic structure (g(D), ∇(D), ∇(D*)). The converse is also true: every torsion-free dualistic structure on a statistical manifold is induced from some globally defined divergence function (which however need not be unique).[3]

For example, when D is an f-divergence for some function ƒ(·), then it generates the metric g(Df) = c·g and the connection ∇(Df) = ∇(α), where g is the canonical Fisher information metric, ∇(α) is the α-connection, c = ƒ′′(1), and α = 3 + 2ƒ′′′(1)/ƒ′′(1).
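The relation between the induced metric g(D) and the Fisher information metric can be spot-checked numerically. A minimal sketch, assuming the Bernoulli family parametrized by θ, whose Fisher information is 1/(θ(1−θ)); for the KL divergence, ƒ(u) = −ln u gives c = ƒ′′(1) = 1, so the induced metric should equal the Fisher metric. The mixed second derivative −∂²D/∂θp∂θq on the diagonal is approximated by central differences (helper names are illustrative):

```python
import numpy as np

def kl_bern(a, b):
    """KL divergence between Bernoulli(a) and Bernoulli(b)."""
    return a * np.log(a / b) + (1 - a) * np.log((1 - a) / (1 - b))

def induced_metric(theta, h=1e-4):
    """g = -d^2 D / (d theta_p d theta_q) on the diagonal, by central differences."""
    return -(kl_bern(theta + h, theta + h) - kl_bern(theta + h, theta - h)
             - kl_bern(theta - h, theta + h) + kl_bern(theta - h, theta - h)) / (4 * h * h)

theta = 0.3
fisher = 1.0 / (theta * (1 - theta))   # Fisher information of Bernoulli(theta)
assert abs(induced_metric(theta) - fisher) < 1e-3
```

The same finite-difference check applied to another divergence would recover that divergence's constant c times the Fisher metric.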

## Examples

The two most important divergences are the relative entropy (Kullback–Leibler divergence, KL divergence), which is central to information theory and statistics, and the squared Euclidean distance (SED). Minimizing these two divergences is the main way that linear inverse problems are solved, via the principle of maximum entropy and least squares, notably in logistic regression and linear regression.[4]

The two most important classes of divergences are the f-divergences and Bregman divergences; however, other types of divergence functions are also encountered in the literature. The only divergence that is both an f-divergence and a Bregman divergence is the Kullback–Leibler divergence; the squared Euclidean divergence is a Bregman divergence (corresponding to the function ${\displaystyle x^{2}}$), but not an f-divergence.
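That the Kullback–Leibler divergence belongs to both families can be verified directly on a small discrete example, computing it once from the f-divergence formula and once from the Bregman formula (both defined in the following subsections); the variable names here are illustrative:

```python
import numpy as np

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])

# As an f-divergence: D_f(p || q) = sum_x p(x) f(q(x)/p(x)) with f(u) = -log u.
kl_as_f = float(np.sum(p * -np.log(q / p)))

# As a Bregman divergence with generator F(x) = sum_i x_i log x_i (negative entropy),
# so grad F(x)_i = log x_i + 1 and D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>.
F = lambda x: float(np.sum(x * np.log(x)))
grad_F = lambda x: np.log(x) + 1
kl_as_bregman = F(p) - F(q) - float(np.dot(grad_F(q), p - q))

assert np.isclose(kl_as_f, kl_as_bregman)   # the two formulas agree
```

The agreement is exact in theory: the linear terms from grad F cancel because p and q both sum to 1, leaving Σ p ln(p/q).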

### f-divergences

This family of divergences is generated by a function f(u), convex on u > 0 and satisfying f(1) = 0. An f-divergence is then defined as

${\displaystyle D_{f}(p\parallel q)=\int p(x)f{\bigg (}{\frac {q(x)}{p(x)}}{\bigg )}dx}$
Examples include:

• Kullback–Leibler divergence: ${\displaystyle D_{\mathrm {KL} }(p\parallel q)=\int p(x)\ln \left({\frac {p(x)}{q(x)}}\right)dx}$
• squared Hellinger distance: ${\displaystyle H^{2}(p,\,q)=2\int {\Big (}{\sqrt {p(x)}}-{\sqrt {q(x)}}\,{\Big )}^{2}dx}$
• Jeffreys divergence: ${\displaystyle D_{J}(p\parallel q)=\int (p(x)-q(x)){\big (}\ln p(x)-\ln q(x){\big )}dx}$
• Chernoff's α-divergence: ${\displaystyle D^{(\alpha )}(p\parallel q)={\frac {4}{1-\alpha ^{2}}}{\bigg (}1-\int p(x)^{\frac {1-\alpha }{2}}q(x)^{\frac {1+\alpha }{2}}dx{\bigg )}}$
• exponential divergence: ${\displaystyle D_{e}(p\parallel q)=\int p(x){\big (}\ln p(x)-\ln q(x){\big )}^{2}dx}$
• Kagan's divergence: ${\displaystyle D_{\chi ^{2}}(p\parallel q)={\frac {1}{2}}\int {\frac {(p(x)-q(x))^{2}}{p(x)}}dx}$
• (α,β)-product divergence: ${\displaystyle D_{\alpha ,\beta }(p\parallel q)={\frac {2}{(1-\alpha )(1-\beta )}}\int {\Big (}1-{\Big (}{\tfrac {q(x)}{p(x)}}{\Big )}^{\!\!{\frac {1-\alpha }{2}}}{\Big )}{\Big (}1-{\Big (}{\tfrac {q(x)}{p(x)}}{\Big )}^{\!\!{\frac {1-\beta }{2}}}{\Big )}p(x)dx}$
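For discrete distributions, the integral becomes a sum, and a single generic routine covers the whole family. A minimal sketch (the function name `f_divergence` is illustrative), instantiating two entries above and cross-checking the squared Hellinger distance against its direct formula:

```python
import numpy as np

def f_divergence(f, p, q):
    """D_f(p || q) = sum_x p(x) * f(q(x)/p(x)) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * f(q / p)))

p = [0.2, 0.3, 0.5]
q = [0.4, 0.4, 0.2]

# f(u) = -log u is convex with f(1) = 0 and yields the KL divergence.
kl = f_divergence(lambda u: -np.log(u), p, q)

# f(u) = 2*(sqrt(u) - 1)^2 yields the squared Hellinger distance:
# p * 2*(sqrt(q/p) - 1)^2 = 2*(sqrt(q) - sqrt(p))^2.
h2 = f_divergence(lambda u: 2 * (np.sqrt(u) - 1) ** 2, p, q)
direct_h2 = 2 * sum((np.sqrt(a) - np.sqrt(b)) ** 2 for a, b in zip(p, q))
assert np.isclose(h2, direct_h2)
```

Any other entry in the list can be obtained the same way by passing its generating f(u).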

### Bregman divergences

Bregman divergences correspond to convex functions on convex sets. Given a strictly convex, continuously differentiable function F on a convex set, known as the Bregman generator, the Bregman divergence measures the error of the linear approximation of F taken at q as an approximation of the value at p:

${\displaystyle D_{F}(p,q)=F(p)-F(q)-\langle \nabla F(q),p-q\rangle .}$

The dual divergence to a Bregman divergence is the divergence generated by the convex conjugate F* of the Bregman generator of the original divergence. For example, for the squared Euclidean distance, the generator is ${\displaystyle x^{2}}$, while for the relative entropy the generator is the negative entropy ${\displaystyle x\log x}$.
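The Bregman formula is easy to evaluate once the generator and its gradient are supplied. A minimal sketch (the helper name `bregman` is illustrative), verifying that the generator ‖x‖² yields the squared Euclidean distance:

```python
import numpy as np

def bregman(F, grad_F, p, q):
    """D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(F(p) - F(q) - np.dot(grad_F(q), p - q))

p = np.array([1.0, 2.0])
q = np.array([3.0, 0.5])

# Generator F(x) = ||x||^2 with grad F(x) = 2x gives the squared Euclidean distance:
# ||p||^2 - ||q||^2 - 2 q.(p - q) = ||p - q||^2.
sq = bregman(lambda x: np.dot(x, x), lambda x: 2 * x, p, q)
assert np.isclose(sq, np.sum((p - q) ** 2))
```

Swapping in the negative entropy generator Σ xᵢ log xᵢ recovers the relative entropy, as noted above.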

## History

The term "divergence" for a statistical distance was used informally in various contexts from c. 1910 to c. 1940. Its formal use dates at least to Bhattacharyya (1943), entitled "On a measure of divergence between two statistical populations defined by their probability distributions", which defined the Bhattacharyya distance, and Bhattacharyya (1946), entitled "On a Measure of Divergence between Two Multinomial Populations", which defined the Bhattacharyya angle. The term was popularized by its use for the Kullback–Leibler divergence in Kullback & Leibler (1951), its use in the textbook Kullback (1959), and then by Ali & Silvey (1966) generally, for the class of f-divergences. The term "Bregman distance" is still found, but "Bregman divergence" is now preferred. In information geometry, alternative terms were initially used, including "quasi-distance" Amari (1982, p. 369) and "contrast function" Eguchi (1985), though "divergence" was used in Amari (1985) for the α-divergence, and has become standard (e.g., Amari & Cichocki (2010)).

## References

• Amari, Shun-ichi; Nagaoka, Hiroshi (2000). Methods of information geometry. Oxford University Press. ISBN 0-8218-0531-2.
• Eguchi, Shinto (1985). "A differential geometric approach to statistical inference on the basis of contrast functionals". Hiroshima Mathematical Journal. 15 (2): 341–391.
• Eguchi, Shinto (1992). "Geometry of minimum contrast". Hiroshima Mathematical Journal. 22 (3): 631–647.
• Matumoto, Takao (1993). "Any statistical manifold has a contrast function — on the C³-functions taking the minimum at the diagonal of the product manifold". Hiroshima Mathematical Journal. 23 (2): 327–332.
This page was last edited on 9 April 2019, at 20:36
This page is based on a Wikipedia article. Text is available under the CC BY-SA 3.0 Unported License; non-text media are available under their specified licenses. Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc. WIKI 2 is an independent company with no affiliation with the Wikimedia Foundation.