# Wrapped normal distribution

[Figures: probability density function and cumulative distribution function; in both plots the support is chosen to be $[-\pi ,\pi ]$ with $\mu =0$.]

| | |
|---|---|
| Parameters | $\mu$ real, $\sigma >0$ |
| Support | $\theta \in$ any interval of length $2\pi$ |
| PDF | ${\frac {1}{2\pi }}\vartheta \left({\frac {\theta -\mu }{2\pi }},{\frac {i\sigma ^{2}}{2\pi }}\right)$ |
| Mean | $\mu$ if support is on interval $\mu \pm \pi$ |
| Median | $\mu$ if support is on interval $\mu \pm \pi$ |
| Mode | $\mu$ |
| Variance | $1-e^{-\sigma ^{2}/2}$ (circular) |
| Entropy | (see text) |
| CF | $e^{-\sigma ^{2}n^{2}/2+in\mu }$ |

In probability theory and directional statistics, a wrapped normal distribution is a wrapped probability distribution that results from the "wrapping" of the normal distribution around the unit circle. It finds application in the theory of Brownian motion and is a solution to the heat equation for periodic boundary conditions. It is closely approximated by the von Mises distribution, which, due to its mathematical simplicity and tractability, is the most commonly used distribution in directional statistics.


## Definition

The probability density function of the wrapped normal distribution is

$f_{WN}(\theta ;\mu ,\sigma )={\frac {1}{\sigma {\sqrt {2\pi }}}}\sum _{k=-\infty }^{\infty }\exp \left[{\frac {-(\theta -\mu +2\pi k)^{2}}{2\sigma ^{2}}}\right],$ where μ and σ are the mean and standard deviation of the unwrapped distribution, respectively. Expressing the above density function in terms of the characteristic function of the normal distribution yields:
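The wrapping sum converges rapidly, so a truncated version is practical in computation. The following Python sketch (the truncation limit `k_max` and the parameter values are illustrative choices, not part of the article) evaluates the density and checks its normalization over one period:

```python
import math

def wrapped_normal_pdf(theta, mu=0.0, sigma=1.0, k_max=10):
    """Wrapped normal density via the wrapping sum, truncated at +/- k_max terms."""
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * sum(
        math.exp(-((theta - mu + 2.0 * math.pi * k) ** 2) / (2.0 * sigma ** 2))
        for k in range(-k_max, k_max + 1)
    )

# Sanity check: the density should integrate to 1 over any interval of length 2*pi.
n = 10_000
width = 2.0 * math.pi / n
total = sum(
    wrapped_normal_pdf(-math.pi + (i + 0.5) * width, mu=0.3, sigma=0.8) * width
    for i in range(n)
)
print(round(total, 6))
```

The Gaussian terms decay like $e^{-2\pi ^{2}k^{2}/\sigma ^{2}}$, so even a small `k_max` suffices for moderate $\sigma$.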

$f_{WN}(\theta ;\mu ,\sigma )={\frac {1}{2\pi }}\sum _{n=-\infty }^{\infty }e^{-\sigma ^{2}n^{2}/2+in(\theta -\mu )}={\frac {1}{2\pi }}\vartheta \left({\frac {\theta -\mu }{2\pi }},{\frac {i\sigma ^{2}}{2\pi }}\right),$ where $\vartheta (\theta ,\tau )$ is the Jacobi theta function, given by

$\vartheta (\theta ,\tau )=\sum _{n=-\infty }^{\infty }(w^{2})^{n}q^{n^{2}}{\text{ where }}w\equiv e^{i\pi \theta }$ and $q\equiv e^{i\pi \tau }.$ The wrapped normal distribution may also be expressed in terms of the Jacobi triple product:

$f_{WN}(\theta ;\mu ,\sigma )={\frac {1}{2\pi }}\prod _{n=1}^{\infty }(1-q^{n})(1+q^{n-1/2}z)(1+q^{n-1/2}/z),$ where $z=e^{i(\theta -\mu )}\,$ and $q=e^{-\sigma ^{2}}.$

## Moments

In terms of the circular variable $z=e^{i\theta }$ the circular moments of the wrapped normal distribution are the characteristic function of the normal distribution evaluated at integer arguments:

$\langle z^{n}\rangle =\int _{\Gamma }e^{in\theta }\,f_{WN}(\theta ;\mu ,\sigma )\,d\theta =e^{in\mu -n^{2}\sigma ^{2}/2},$ where $\Gamma \,$ is some interval of length $2\pi$ . The first moment is then the average value of z, also known as the mean resultant, or mean resultant vector:

$\langle z\rangle =e^{i\mu -\sigma ^{2}/2}$ The mean angle is

$\theta _{\mu }=\mathrm {Arg} \langle z\rangle =\mu$ and the length of the mean resultant is

$R=|\langle z\rangle |=e^{-\sigma ^{2}/2}$ The circular standard deviation, which is a useful measure of dispersion for the wrapped normal distribution and its close relative, the von Mises distribution, is given by:

$s=\left(\ln(R^{-2})\right)^{1/2}=\sigma$

## Estimation of parameters

A series of N measurements $z_{n}=e^{i\theta _{n}}$ drawn from a wrapped normal distribution may be used to estimate certain parameters of the distribution. The average of the series ${\overline {z}}$ is defined as

${\overline {z}}={\frac {1}{N}}\sum _{n=1}^{N}z_{n}$ and its expectation value will be just the first moment:

$\langle {\overline {z}}\rangle =e^{i\mu -\sigma ^{2}/2}.\,$ In other words, ${\overline {z}}$ is an unbiased estimator of the first moment. If we assume that the mean $\mu$ lies in the interval $[-\pi ,\pi )$, then $\mathrm {Arg} \,{\overline {z}}$ will be a (biased) estimator of the mean $\mu$.

Viewing the $z_{n}$ as a set of vectors in the complex plane, the ${\overline {R}}^{2}$ statistic is the square of the length of the averaged vector:

${\overline {R}}^{2}={\overline {z}}\,{\overline {z^{*}}}=\left({\frac {1}{N}}\sum _{n=1}^{N}\cos \theta _{n}\right)^{2}+\left({\frac {1}{N}}\sum _{n=1}^{N}\sin \theta _{n}\right)^{2}\,$ and its expected value is:

$\left\langle {\overline {R}}^{2}\right\rangle ={\frac {1}{N}}+{\frac {N-1}{N}}\,e^{-\sigma ^{2}}\,$ In other words, the statistic

$R_{e}^{2}={\frac {N}{N-1}}\left({\overline {R}}^{2}-{\frac {1}{N}}\right)$ will be an unbiased estimator of $e^{-\sigma ^{2}}$, and $\ln(1/R_{e}^{2})$ will be a (biased) estimator of $\sigma ^{2}$.
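These estimators can be exercised on simulated data. The sketch below (sample size, seed, and true parameters are illustrative choices) draws normal deviates, wraps them onto the circle as $z_{n}=e^{i\theta _{n}}$, and recovers $\mu$ and $\sigma ^{2}$ from the first moment and the $R_{e}^{2}$ statistic:

```python
import cmath
import math
import random

random.seed(42)

# Illustrative experiment: draw angles from an unwrapped normal distribution,
# wrap them onto the unit circle, then estimate mu and sigma^2 from the z_n.
mu_true, sigma_true, N = 0.7, 0.5, 200_000
z = [cmath.exp(1j * random.gauss(mu_true, sigma_true)) for _ in range(N)]

z_bar = sum(z) / N                        # sample first moment (mean resultant)
mu_hat = cmath.phase(z_bar)               # (biased) estimator of mu
R2_bar = abs(z_bar) ** 2                  # squared length of the averaged vector
Re2 = (N / (N - 1)) * (R2_bar - 1.0 / N)  # unbiased estimator of exp(-sigma^2)
sigma2_hat = math.log(1.0 / Re2)          # (biased) estimator of sigma^2

print(mu_hat, sigma2_hat)
```

With a large sample both estimates land close to the true values $\mu =0.7$ and $\sigma ^{2}=0.25$.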

## Entropy

The information entropy of the wrapped normal distribution is defined as:

$H=-\int _{\Gamma }f_{WN}(\theta ;\mu ,\sigma )\,\ln(f_{WN}(\theta ;\mu ,\sigma ))\,d\theta$ where $\Gamma$ is any interval of length $2\pi$ . Defining $z=e^{i(\theta -\mu )}$ and $q=e^{-\sigma ^{2}}$ , the Jacobi triple product representation for the wrapped normal is:

$f_{WN}(\theta ;\mu ,\sigma )={\frac {\phi (q)}{2\pi }}\prod _{m=1}^{\infty }(1+q^{m-1/2}z)(1+q^{m-1/2}z^{-1})$ where $\phi (q)\,$ is the Euler function. The logarithm of the density of the wrapped normal distribution may be written:

$\ln(f_{WN}(\theta ;\mu ,\sigma ))=\ln \left({\frac {\phi (q)}{2\pi }}\right)+\sum _{m=1}^{\infty }\ln(1+q^{m-1/2}z)+\sum _{m=1}^{\infty }\ln(1+q^{m-1/2}z^{-1})$ Using the series expansion for the logarithm:

$\ln(1+x)=-\sum _{k=1}^{\infty }{\frac {(-1)^{k}}{k}}\,x^{k}$ the logarithmic sums may be written as:

$\sum _{m=1}^{\infty }\ln(1+q^{m-1/2}z^{\pm 1})=-\sum _{m=1}^{\infty }\sum _{k=1}^{\infty }{\frac {(-1)^{k}}{k}}\,q^{mk-k/2}z^{\pm k}=-\sum _{k=1}^{\infty }{\frac {(-1)^{k}}{k}}\,{\frac {q^{k/2}}{1-q^{k}}}\,z^{\pm k}$ so that the logarithm of density of the wrapped normal distribution may be written as:

$\ln(f_{WN}(\theta ;\mu ,\sigma ))=\ln \left({\frac {\phi (q)}{2\pi }}\right)-\sum _{k=1}^{\infty }{\frac {(-1)^{k}}{k}}{\frac {q^{k/2}}{1-q^{k}}}\,(z^{k}+z^{-k})$ which is essentially a Fourier series in $\theta \,$ . Substituting the characteristic function representation for the wrapped normal distribution into the integrand:

$f_{WN}(\theta ;\mu ,\sigma )={\frac {1}{2\pi }}\sum _{n=-\infty }^{\infty }q^{n^{2}/2}\,z^{n}$ the entropy may be written:

$H=-\ln \left({\frac {\phi (q)}{2\pi }}\right)+{\frac {1}{2\pi }}\int _{\Gamma }\left(\sum _{n=-\infty }^{\infty }\sum _{k=1}^{\infty }{\frac {(-1)^{k}}{k}}{\frac {q^{(n^{2}+k)/2}}{1-q^{k}}}\left(z^{n+k}+z^{n-k}\right)\right)\,d\theta$ which may be integrated to yield:

$H=-\ln \left({\frac {\phi (q)}{2\pi }}\right)+2\sum _{k=1}^{\infty }{\frac {(-1)^{k}}{k}}\,{\frac {q^{(k^{2}+k)/2}}{1-q^{k}}}$ 
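This closed-form series can be checked against direct numerical integration of $-\int _{\Gamma }f\ln f\,d\theta$. In the sketch below, the truncation limits (`terms`, `k_max`, `n`) and the test values of $\sigma$ are illustrative choices:

```python
import math

def wn_pdf(theta, mu=0.0, sigma=1.0, k_max=20):
    """Wrapped normal density via the truncated wrapping sum."""
    c = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return c * sum(
        math.exp(-((theta - mu + 2.0 * math.pi * k) ** 2) / (2.0 * sigma ** 2))
        for k in range(-k_max, k_max + 1)
    )

def entropy_series(sigma, terms=200):
    """Entropy from the series above, with phi(q) as a truncated Euler product."""
    q = math.exp(-sigma ** 2)
    euler_phi = 1.0
    for m in range(1, terms + 1):
        euler_phi *= 1.0 - q ** m
    s = sum(
        ((-1) ** k / k) * q ** ((k * k + k) / 2) / (1.0 - q ** k)
        for k in range(1, terms + 1)
    )
    return -math.log(euler_phi / (2.0 * math.pi)) + 2.0 * s

def entropy_numeric(sigma, n=4000):
    """Midpoint-rule integration of -f*ln(f) over [-pi, pi)."""
    w = 2.0 * math.pi / n
    h = 0.0
    for i in range(n):
        f = wn_pdf(-math.pi + (i + 0.5) * w, 0.0, sigma)
        h -= f * math.log(f) * w
    return h

results = {s: (entropy_series(s), entropy_numeric(s)) for s in (0.5, 1.0, 2.0)}
for s, (hs, hn) in results.items():
    print(s, round(hs, 6), round(hn, 6))
```

The two computations agree to high precision; as $\sigma$ grows the entropy approaches $\ln(2\pi )$, the entropy of the uniform distribution on the circle, from below.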