Gauss–Kuzmin distribution

Parameters: (none)
Support: ${\displaystyle k\in \{1,2,\ldots \}}$
PMF: ${\displaystyle -\log _{2}\left[1-{\frac {1}{(k+1)^{2}}}\right]}$
CDF: ${\displaystyle 1-\log _{2}\left({\frac {k+2}{k+1}}\right)}$
Mean: ${\displaystyle +\infty }$
Median: ${\displaystyle 2\,}$
Mode: ${\displaystyle 1\,}$
Variance: ${\displaystyle +\infty }$
Skewness: (not defined)
Excess kurtosis: (not defined)
Entropy: 3.432527514776...[1][2][3]
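The entropy value above is ${\displaystyle -\sum _{k}p(k)\log _{2}p(k)}$, and since the tail of that series decays roughly like $(\log K)/K$, a direct partial sum already recovers it to a few digits. A minimal Python sketch (truncation depth chosen for speed, not precision):

```python
import math

def pmf(k):
    """Gauss-Kuzmin probability mass function p(k)."""
    return -math.log2(1.0 - 1.0 / (k + 1.0) ** 2)

# Partial sum of H = -sum_k p(k) log2 p(k); the tail beyond K decays
# like (log K)/K, so half a million terms give about four correct digits.
H = 0.0
for k in range(1, 500000):
    p = pmf(k)
    H -= p * math.log2(p)
print(round(H, 3))   # ≈ 3.432 (exact value 3.432527514776...)
```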

In mathematics, the Gauss–Kuzmin distribution is a discrete probability distribution that arises as the limit probability distribution of the coefficients in the continued fraction expansion of a random variable uniformly distributed in (0, 1).[4] The distribution is named after Carl Friedrich Gauss, who derived it around 1800,[5] and Rodion Kuzmin, who gave a bound on the rate of convergence in 1928.[6][7] It is given by the probability mass function

${\displaystyle p(k)=-\log _{2}\left(1-{\frac {1}{(1+k)^{2}}}\right)~.}$
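The mass function can be checked empirically: the continued-fraction coefficients of uniformly random numbers should occur with approximately these frequencies, e.g. the value 1 with probability $2-\log _{2}3\approx 0.415$. A rough Python sketch (function names such as `cf_coefficients` are illustrative; early coefficients are not yet at the limit distribution, so the match is only approximate):

```python
import math
import random

def gauss_kuzmin_pmf(k):
    """Limit probability that a continued-fraction coefficient equals k."""
    return -math.log2(1.0 - 1.0 / (1.0 + k) ** 2)

def cf_coefficients(x, n):
    """First n continued-fraction coefficients of x in (0, 1)."""
    coeffs = []
    for _ in range(n):
        x = 1.0 / x
        a = int(x)
        coeffs.append(a)
        x -= a
        if x == 0.0:      # rational (or precision exhausted): stop early
            break
    return coeffs

random.seed(1)
# Empirical frequency of the coefficient value 1 over many random x;
# 10 coefficients per number keeps double precision trustworthy.
samples = [cf_coefficients(random.random(), 10) for _ in range(2000)]
flat = [a for cs in samples for a in cs]
freq1 = flat.count(1) / len(flat)
print(round(gauss_kuzmin_pmf(1), 5))   # 2 - log2(3) ≈ 0.41504
print(round(freq1, 2))                 # close to the limit value
```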

Gauss–Kuzmin theorem

Let

${\displaystyle x={\frac {1}{k_{1}+{\frac {1}{k_{2}+\cdots }}}}}$

be the continued fraction expansion of a random number x uniformly distributed in (0, 1). Then

${\displaystyle \lim _{n\to \infty }\mathbb {P} \left\{k_{n}=k\right\}=-\log _{2}\left(1-{\frac {1}{(k+1)^{2}}}\right)~.}$

Equivalently, let

${\displaystyle x_{n}={\frac {1}{k_{n+1}+{\frac {1}{k_{n+2}+\cdots }}}}~;}$

then

${\displaystyle \Delta _{n}(s)=\mathbb {P} \left\{x_{n}\leq s\right\}-\log _{2}(1+s)}$

tends to zero as n tends to infinity.
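Since ${\displaystyle x_{n}}$ is obtained from x by iterating the Gauss map ${\displaystyle G(x)=1/x-\lfloor 1/x\rfloor }$, the statement can be illustrated by Monte Carlo: estimate ${\displaystyle \mathbb {P} \{x_{n}\leq s\}}$ from many random starting points and subtract ${\displaystyle \log _{2}(1+s)}$. A rough Python sketch (sample sizes and names are illustrative):

```python
import math
import random

def gauss_map(x):
    """One step of the Gauss map G(x) = 1/x mod 1."""
    y = 1.0 / x
    return y - int(y)

def delta_n(s, n, trials=5000, seed=0):
    """Monte Carlo estimate of Delta_n(s) = P{x_n <= s} - log2(1 + s)."""
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        x = rng.random()
        for _ in range(n):
            if x == 0.0:          # guard against a rational iterate
                break
            x = gauss_map(x)
        if x <= s:
            count += 1
    return count / trials - math.log2(1.0 + s)

# Delta_0(0.5) = 0.5 - log2(1.5) ≈ -0.085; the estimates shrink toward 0
# (up to Monte Carlo noise) as n grows.
for n in (0, 1, 3, 5):
    print(n, round(delta_n(0.5, n), 3))
```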

Rate of convergence

In 1928, Kuzmin gave the bound

${\displaystyle |\Delta _{n}(s)|\leq C\exp(-\alpha {\sqrt {n}})~.}$

In 1929, Paul Lévy[8] improved it to

${\displaystyle |\Delta _{n}(s)|\leq C\,0.7^{n}~.}$

Later, Eduard Wirsing showed[9] that, for λ = 0.30366... (the Gauss–Kuzmin–Wirsing constant), the limit

${\displaystyle \Psi (s)=\lim _{n\to \infty }{\frac {\Delta _{n}(s)}{(-\lambda )^{n}}}}$

exists for every s in [0, 1], and the function Ψ(s) is analytic and satisfies Ψ(0) = Ψ(1) = 0. Further bounds were proved by K. I. Babenko.[10]
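The constant λ = 0.30366... is the modulus of the second-largest eigenvalue of the Gauss–Kuzmin–Wirsing transfer operator ${\displaystyle (Lf)(x)=\sum _{k\geq 1}(k+x)^{-2}f(1/(k+x))}$; its leading eigenvalue is 1, with eigenfunction the invariant density ${\displaystyle 1/((1+x)\ln 2)}$. A crude discretization plus power iteration, with the eigenvalue-1 mode deflated away, recovers −λ numerically. A rough Python sketch (grid size, truncation depth, and names are illustrative, and accuracy is only a few digits):

```python
import math

N = 101          # grid points on [0, 1]
KMAX = 300       # explicit terms kept in the sum over k
grid = [i / (N - 1) for i in range(N)]

def apply_L(f):
    """Apply the discretized transfer operator to grid values f."""
    def interp(x):
        # linear interpolation of f on the uniform grid
        t = x * (N - 1)
        i = min(int(t), N - 2)
        w = t - i
        return (1 - w) * f[i] + w * f[i + 1]
    out = []
    for x in grid:
        s = sum(interp(1.0 / (k + x)) / (k + x) ** 2
                for k in range(1, KMAX + 1))
        s += f[0] / (KMAX + x + 0.5)   # tail: f(1/(k+x)) ~ f(0) for large k
        out.append(s)
    return out

def integral(f):
    """Trapezoid rule on the uniform grid."""
    return (sum(f) - 0.5 * (f[0] + f[-1])) / (N - 1)

# Invariant density h(x) = 1/((1+x) ln 2): the eigenvalue-1 eigenfunction.
h = [1.0 / ((1.0 + x) * math.log(2)) for x in grid]

g = [x - 0.5 for x in grid]        # arbitrary start vector
lam = 0.0
for _ in range(20):
    new = apply_L(g)
    c = integral(new)                          # deflate the eigenvalue-1 mode
    new = [v - c * hv for v, hv in zip(new, h)]
    j = max(range(N), key=lambda i: abs(g[i]))
    lam = new[j] / g[j]                        # ratio estimate of the eigenvalue
    m = max(abs(v) for v in new)
    g = [v / m for v in new]                   # renormalize
print(round(lam, 3))   # close to -0.304, i.e. the eigenvalue -λ
```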

References

1. ^ Blachman, N. (1984). "The continued fraction as an information source (Corresp.)". IEEE Transactions on Information Theory. 30 (4): 671–674. doi:10.1109/TIT.1984.1056924.
2. ^ Kornerup, Peter; Matula, David W. (July 1995). LCF: A lexicographic binary representation of the rationals. Journal of Universal Computer Science. 1. pp. 484–503. CiteSeerX 10.1.1.108.5117. doi:10.1007/978-3-642-80350-5_41. ISBN 978-3-642-80352-9.
3. ^ Vepstas, L. (2008), Entropy of Continued Fractions (Gauss-Kuzmin Entropy) (PDF)
4. ^
5. ^ Gauss, Johann Carl Friedrich. Werke Sammlung. 10/1. pp. 552–556.
6. ^ Kuzmin, R. O. (1928). "On a problem of Gauss". Dokl. Akad. Nauk SSSR: 375–380.
7. ^ Kuzmin, R. O. (1932). "On a problem of Gauss". Atti del Congresso Internazionale dei Matematici, Bologna. 6: 83–89.
8. ^
9. ^ Wirsing, E. (1974). "On the theorem of Gauss–Kusmin–Lévy and a Frobenius-type theorem for function spaces". Acta Arithmetica. 24 (5): 507–528. doi:10.4064/aa-24-5-507-528.
10. ^ Babenko, K. I. (1978). "On a problem of Gauss". Soviet Math. Dokl. 19: 136–140.