In information theory, the Bretagnolle–Huber inequality bounds the total variation distance between two probability distributions $P$ and $Q$ by a concave and bounded function of the Kullback–Leibler divergence $D_{\mathrm{KL}}(P \parallel Q)$:

$$d_{\mathrm{TV}}(P, Q) \le \sqrt{1 - e^{-D_{\mathrm{KL}}(P \parallel Q)}} \le 1 - \frac{1}{2} e^{-D_{\mathrm{KL}}(P \parallel Q)}.$$

The bound can be viewed as an alternative to the well-known Pinsker's inequality $d_{\mathrm{TV}}(P, Q) \le \sqrt{D_{\mathrm{KL}}(P \parallel Q)/2}$: when $D_{\mathrm{KL}}(P \parallel Q)$ is large (larger than 2, for instance[1]), Pinsker's inequality is vacuous, while Bretagnolle–Huber remains bounded and hence non-vacuous. It is used in statistics and machine learning to prove information-theoretic lower bounds that rely on hypothesis testing.[2]
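As a quick numerical illustration (a minimal sketch; the Bernoulli parameters below are illustrative choices, not from the source), the following compares the two bounds in the large-divergence regime where Pinsker's inequality becomes vacuous:

```python
import math

def kl_bernoulli(p, q):
    """Kullback-Leibler divergence D(Ber(p) || Ber(q)) in nats."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def pinsker_bound(kl):
    """Pinsker's upper bound on the total variation distance."""
    return math.sqrt(kl / 2)

def bretagnolle_huber_bound(kl):
    """Bretagnolle-Huber upper bound on the total variation distance."""
    return math.sqrt(1 - math.exp(-kl))

p, q = 0.5, 0.999          # strongly separated coins, so the KL divergence is large
kl = kl_bernoulli(p, q)
tv = abs(p - q)            # total variation between Ber(p) and Ber(q)

# Pinsker's bound exceeds 1 here (vacuous for a distance capped at 1),
# while Bretagnolle-Huber stays below 1 and still bounds the true TV.
print(tv, pinsker_bound(kl), bretagnolle_huber_bound(kl))
```

Here the divergence is about 2.76 nats, so Pinsker's bound evaluates to roughly 1.17 while Bretagnolle–Huber gives about 0.97, still a valid bound on the true total variation of 0.499.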
The following version is directly implied by the bound above, but some authors[2] prefer stating it this way. Let $A$ be any event. Then

$$P(A) + Q(A^c) \ge \frac{1}{2} e^{-D_{\mathrm{KL}}(P \parallel Q)},$$

where $A^c$ is the complement of $A$.
Indeed, by definition of the total variation, for any event $A$,

$$Q(A) - P(A) \le d_{\mathrm{TV}}(P, Q) \le 1 - \frac{1}{2} e^{-D_{\mathrm{KL}}(P \parallel Q)}.$$

Using $Q(A) = 1 - Q(A^c)$ and rearranging, we obtain the claimed lower bound on $P(A) + Q(A^c)$.
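This event form can be checked numerically. The sketch below (using two illustrative distributions on a three-point space, chosen arbitrarily) verifies $P(A) + Q(A^c) \ge \frac{1}{2} e^{-D_{\mathrm{KL}}(P \parallel Q)}$ over every possible event $A$:

```python
import itertools
import math

# Two illustrative distributions on the sample space {0, 1, 2}.
P = [0.7, 0.2, 0.1]
Q = [0.1, 0.3, 0.6]

kl = sum(p * math.log(p / q) for p, q in zip(P, Q))
lower = 0.5 * math.exp(-kl)  # (1/2) * exp(-KL(P || Q))

# Enumerate every event A (every subset of the sample space) and take
# the worst case of P(A) + Q(A^c).
worst = min(
    sum(P[i] for i in A) + sum(Q[i] for i in range(3) if i not in A)
    for r in range(4)
    for A in itertools.combinations(range(3), r)
)
print(worst, lower)  # the worst-case sum still dominates the bound
```

Even the minimizing event keeps $P(A) + Q(A^c)$ above the exponential lower bound, as the inequality guarantees.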
Proof
We prove the main statement following the ideas in Tsybakov's book (Lemma 2.6, page 89),[3] which differ from the original proof[4] (see C. Canonne's note[1] for a modernized transcription of their argument).
The proof is in two steps; throughout, $p$ and $q$ denote the densities of $P$ and $Q$ with respect to a common dominating measure $\mu$.

1. Prove, using Cauchy–Schwarz, that the total variation is related to the Bhattacharyya coefficient (the right-hand side of the inequality):

$$1 - d_{\mathrm{TV}}(P, Q)^2 \ge \left( \int \sqrt{pq} \, d\mu \right)^2.$$

2. Prove, by a clever application of Jensen's inequality, that

$$\left( \int \sqrt{pq} \, d\mu \right)^2 \ge e^{-D_{\mathrm{KL}}(P \parallel Q)}.$$
Step 1:

First notice that

$$d_{\mathrm{TV}}(P, Q) = 1 - \int \min(p, q) \, d\mu = \int \max(p, q) \, d\mu - 1.$$

To see this, denote $A^* = \arg\max_A |P(A) - Q(A)|$ and, without loss of generality, assume that $P(A^*) > Q(A^*)$, so that $d_{\mathrm{TV}}(P, Q) = P(A^*) - Q(A^*)$. Then we can rewrite

$$d_{\mathrm{TV}}(P, Q) = \int_{A^*} \max(p, q) \, d\mu - \int_{A^*} \min(p, q) \, d\mu,$$

and then, adding and removing $\int_{(A^*)^c} \max(p, q) \, d\mu$ or $\int_{(A^*)^c} \min(p, q) \, d\mu$, we obtain both identities. Combining them with the Cauchy–Schwarz inequality,

$$1 - d_{\mathrm{TV}}(P, Q)^2 = \left( \int \min(p, q) \, d\mu \right) \left( \int \max(p, q) \, d\mu \right) \ge \left( \int \sqrt{\min(p, q) \max(p, q)} \, d\mu \right)^2 = \left( \int \sqrt{pq} \, d\mu \right)^2.$$

Step 2:

Since $\sqrt{pq} = p \sqrt{q/p}$, Jensen's inequality applied to the concave function $\log$ gives

$$2 \log \int \sqrt{pq} \, d\mu = 2 \log \int p \sqrt{\frac{q}{p}} \, d\mu \ge 2 \int p \log \sqrt{\frac{q}{p}} \, d\mu = -\int p \log \frac{p}{q} \, d\mu = -D_{\mathrm{KL}}(P \parallel Q),$$

and exponentiating yields $\left( \int \sqrt{pq} \, d\mu \right)^2 \ge e^{-D_{\mathrm{KL}}(P \parallel Q)}$. Combining the two steps and taking a square root proves the main statement.
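Both steps can be checked numerically for discrete distributions, where the integrals become sums. In the sketch below (illustrative distributions, chosen arbitrarily), `bc` is the Bhattacharyya coefficient $\int \sqrt{pq} \, d\mu$:

```python
import math

# Illustrative discrete distributions on a common three-point support.
P = [0.5, 0.3, 0.2]
Q = [0.2, 0.2, 0.6]

tv = 0.5 * sum(abs(p - q) for p, q in zip(P, Q))     # total variation distance
bc = sum(math.sqrt(p * q) for p, q in zip(P, Q))     # Bhattacharyya coefficient
kl = sum(p * math.log(p / q) for p, q in zip(P, Q))  # KL divergence D(P || Q)

# Step 1: 1 - TV^2 >= BC^2   (Cauchy-Schwarz)
# Step 2: BC^2 >= exp(-KL)   (Jensen)
print(1 - tv**2, bc**2, math.exp(-kl))
```

The printed values decrease left to right, matching the two inequalities of the proof and hence the full bound $d_{\mathrm{TV}}^2 \le 1 - e^{-D_{\mathrm{KL}}}$.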
The question is: how many coin tosses do I need to distinguish a fair coin from a biased one?
Assume you have two coins, a fair coin (Bernoulli distributed with mean $p_1 = 1/2$) and an $\varepsilon$-biased coin ($p_2 = 1/2 + \varepsilon$). Then, in order to identify the biased coin with probability at least $1 - \delta$ (for some $\delta < 1/2$), at least

$$n \ge \frac{1}{4\varepsilon^2} \log \frac{1}{4\delta}$$

samples are needed.
In order to obtain this lower bound we impose that the total variation distance between the two sequences of $n$ samples is at least $1 - 2\delta$. This is because the total variation upper bounds the probability of under- or over-estimating the coins' means. Denote by $P_1^{\otimes n}$ and $P_2^{\otimes n}$ the respective joint distributions of the $n$ coin tosses for each coin. Then we have

$$1 - 2\delta \le d_{\mathrm{TV}}\left(P_1^{\otimes n}, P_2^{\otimes n}\right) \le \sqrt{1 - e^{-D_{\mathrm{KL}}\left(P_2^{\otimes n} \parallel P_1^{\otimes n}\right)}} = \sqrt{1 - e^{-n D_{\mathrm{KL}}(P_2 \parallel P_1)}} \le \sqrt{1 - e^{-4n\varepsilon^2}},$$

where the equality uses the chain rule for product distributions and the last step uses $D_{\mathrm{KL}}(P_2 \parallel P_1) \le 4\varepsilon^2$, which follows from $\log(1 + x) \le x$. The result is obtained by rearranging the terms: squaring both sides gives $(1 - 2\delta)^2 \le 1 - e^{-4n\varepsilon^2}$, hence $e^{-4n\varepsilon^2} \le 4\delta(1 - \delta) \le 4\delta$, and taking logarithms yields the claimed bound on $n$.
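The resulting sample-complexity bound is easy to evaluate. The sketch below (function names are my own) computes the lower bound on $n$ for one choice of $\varepsilon$ and $\delta$, and sanity-checks the single-toss divergence bound $D_{\mathrm{KL}}(P_2 \parallel P_1) \le 4\varepsilon^2$ used in the derivation:

```python
import math

def sample_lower_bound(eps, delta):
    """Minimum n implied by Bretagnolle-Huber for telling Ber(1/2)
    from Ber(1/2 + eps) apart with error probability at most delta."""
    return math.log(1 / (4 * delta)) / (4 * eps ** 2)

def kl_coin(eps):
    """Single-toss KL divergence D(Ber(1/2 + eps) || Ber(1/2))."""
    p = 0.5 + eps
    return p * math.log(2 * p) + (1 - p) * math.log(2 * (1 - p))

eps, delta = 0.05, 0.01
n = sample_lower_bound(eps, delta)
print(math.ceil(n))  # minimum number of tosses implied by the bound

# Sanity check: the single-toss KL is at most 4*eps^2, as used above.
assert kl_coin(eps) <= 4 * eps ** 2
```

For $\varepsilon = 0.05$ and $\delta = 0.01$ this gives $n \ge 322$ tosses; the quadratic dependence on $1/\varepsilon$ means halving the bias quadruples the required sample size.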
Information-theoretic lower bound for k-armed bandit games
In multi-armed bandit problems, a lower bound on the minimax regret of any bandit algorithm can be proved using Bretagnolle–Huber and its consequence for hypothesis testing (see Chapter 15 of Bandit Algorithms[2]).
History
The result was first proved in 1979 by Jean Bretagnolle and Catherine Huber, and published in the proceedings of the Strasbourg Probability Seminar.[4] Alexandre Tsybakov's book[3] features an early republication of the inequality and its attribution to Bretagnolle and Huber, where it is presented as an early and less general version of Assouad's lemma (see notes 2.8). A constant-factor improvement on Bretagnolle–Huber was proved in 2014 as a consequence of an extension of Fano's inequality.[5]