
Variance-stabilizing transformation

From Wikipedia, the free encyclopedia

In applied statistics, a variance-stabilizing transformation is a data transformation that is specifically chosen either to simplify considerations in graphical exploratory data analysis or to allow the application of simple regression-based or analysis of variance techniques.[1]


Overview

The aim behind the choice of a variance-stabilizing transformation is to find a simple function ƒ to apply to values x in a data set to create new values y = ƒ(x) such that the variability of the values y is not related to their mean value. For example, suppose that the values x are realizations from different Poisson distributions: i.e. the distributions each have different mean values μ. Then, because for the Poisson distribution the variance is identical to the mean, the variance varies with the mean. However, if the simple variance-stabilizing transformation

y = √x

is applied, the sampling variance associated with each observation will be nearly constant: see Anscombe transform for details and some alternative transformations.
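As an illustrative numerical sketch (using NumPy; not part of the original article), the raw variance of Poisson samples grows with the mean, while the variance of the square-root-transformed values settles near a constant (about 1/4):

```python
import numpy as np

rng = np.random.default_rng(0)

# Poisson samples at several means: the raw variance equals the mean,
# so it grows with mu, while sqrt(x) has nearly constant variance.
for mu in [1, 4, 16, 64]:
    x = rng.poisson(mu, size=100_000)
    y = np.sqrt(x)  # variance-stabilizing transformation
    print(mu, round(x.var(), 2), round(y.var(), 3))
```

For large μ the variance of y approaches 1/4, independent of μ.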

While variance-stabilizing transformations are well known for certain parametric families of distributions, such as the Poisson and the binomial distribution, some types of data analysis proceed more empirically: for example by searching among power transformations to find a suitable fixed transformation. Alternatively, if data analysis suggests a functional form for the relation between variance and mean, this can be used to deduce a variance-stabilizing transformation.[2] Thus if, for a mean μ,

var(X) = h(μ),

a suitable basis for a variance-stabilizing transformation would be

y ∝ ∫ˣ 1/√h(μ) dμ,

where the arbitrary constant of integration and an arbitrary scaling factor can be chosen for convenience.
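This integral can be evaluated symbolically. As a sketch (using SymPy; not part of the original article), taking the Poisson variance function h(μ) = μ recovers the square-root transformation, up to the arbitrary scale and constant of integration:

```python
import sympy as sp

mu = sp.symbols('mu', positive=True)

# For the Poisson family the variance function is h(mu) = mu;
# integrating 1/sqrt(h(mu)) recovers the square-root transformation
# (up to an arbitrary scale factor and constant of integration).
h = mu
y = sp.integrate(1 / sp.sqrt(h), mu)
print(y)  # -> 2*sqrt(mu)
```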

Example: relative variance

If X is a positive random variable and the variance is given as h(μ) = s²μ², then the standard deviation is proportional to the mean, which is called fixed relative error. In this case, the variance-stabilizing transformation is

y = ∫ˣ dμ/√(s²μ²) = (1/s) ln(x) ∝ log(x).

That is, the variance-stabilizing transformation is the logarithmic transformation.
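A short numerical sketch of this case (using NumPy; not part of the original article): when the standard deviation is proportional to the mean, the standard deviation of log(x) is approximately the relative error s, whatever the mean:

```python
import numpy as np

rng = np.random.default_rng(1)
s = 0.1  # fixed relative error: sd = s * mean

for mu in [1.0, 10.0, 100.0]:
    x = rng.normal(mu, s * mu, size=200_000)  # sd proportional to mean
    # log stabilizes the spread at roughly s, independent of mu
    print(mu, round(x.std(), 3), round(np.log(x).std(), 4))
```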

Example: absolute plus relative variance

If the variance is given as h(μ) = σ² + s²μ², then the variance is dominated by a fixed variance σ² when |μ| is small enough and is dominated by the relative variance s²μ² when |μ| is large enough. In this case, the variance-stabilizing transformation is

y = ∫ˣ dμ/√(σ² + s²μ²) = (1/s) asinh(x/(σ/s)) = (1/s) asinh(x/λ),

with λ = σ/s. That is, the variance-stabilizing transformation is the inverse hyperbolic sine of the scaled value x/λ for λ = σ/s.
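A numerical sketch of this case (using NumPy; not part of the original article): under the absolute-plus-relative variance model, arcsinh(x/λ) has a standard deviation close to s across several orders of magnitude of the mean:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, s = 1.0, 0.1      # variance model: sigma**2 + (s * mu)**2
lam = sigma / s

for mu in [0.0, 5.0, 50.0, 500.0]:
    sd = np.sqrt(sigma**2 + (s * mu)**2)
    x = rng.normal(mu, sd, size=200_000)
    y = np.arcsinh(x / lam)  # inverse hyperbolic sine transform
    # the transformed spread stays near s = 0.1 for every mu
    print(mu, round(x.std(), 2), round(y.std(), 4))
```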

Relationship to the delta method

Here, the delta method is presented in a rough way, but it is enough to see the relation with variance-stabilizing transformations. For a more formal treatment, see delta method.

Let X be a random variable, with E[X] = μ and Var(X) = σ². Define Y = g(X), where g is a regular function. A first-order Taylor approximation for Y = g(X) is:

Y = g(X) ≈ g(μ) + g′(μ)(X − μ)

From the equation above, we obtain:

E[Y] ≈ g(μ)

and

Var[Y] ≈ σ²g′(μ)²

This approximation method is called the delta method.
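The delta method's variance formula can be checked by simulation. A sketch (using NumPy; not part of the original article) with g(x) = exp(x), for which g′(μ) = exp(μ):

```python
import numpy as np

rng = np.random.default_rng(3)

# Delta method check: for Y = g(X) with X ~ N(mu, sigma**2),
# Var(Y) is approximately sigma**2 * g'(mu)**2.
mu, sigma = 5.0, 0.1
x = rng.normal(mu, sigma, size=500_000)
y = np.exp(x)                        # g(x) = exp(x), g'(mu) = exp(mu)

approx = sigma**2 * np.exp(mu)**2    # delta-method prediction
print(round(y.var() / approx, 3))    # ratio close to 1
```

The approximation improves as σ shrinks, since the first-order Taylor expansion becomes more accurate.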

Consider now a random variable X such that E[X] = μ and Var(X) = h(μ). Notice the relation between the variance and the mean, which implies, for example, heteroscedasticity in a linear model. Therefore, the goal is to find a function g such that Y = g(X) has a variance independent (at least approximately) of its expectation.

Imposing the condition Var[Y] ≈ h(μ)g′(μ)² = constant, this equality implies the differential equation:

dg/dμ = C/√h(μ)

This ordinary differential equation has, by separation of variables, the following solution:

g(μ) = ∫ C/√h(μ) dμ + k

This last expression appeared for the first time in an M. S. Bartlett paper.[3]
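The differential equation above can also be solved symbolically. A sketch (using SymPy; not part of the original article) for the relative-variance model h(μ) = s²μ² with s = 1, which reproduces the logarithmic transformation from the earlier example:

```python
import sympy as sp

mu = sp.symbols('mu', positive=True)
C = sp.symbols('C', positive=True)
g = sp.Function('g')

# Solve dg/dmu = C / sqrt(h(mu)) for h(mu) = mu**2 (i.e. s = 1):
# the solution is logarithmic, matching the relative-variance example.
ode = sp.Eq(g(mu).diff(mu), C / sp.sqrt(mu**2))
sol = sp.dsolve(ode, g(mu))
print(sol)  # right-hand side is C*log(mu) plus a constant of integration
```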

References

  1. ^ Everitt, B. S. (2002). The Cambridge Dictionary of Statistics (2nd ed.). CUP. ISBN 0-521-81099-X.
  2. ^ Dodge, Y. (2003). The Oxford Dictionary of Statistical Terms. OUP. ISBN 0-19-920613-9.
  3. ^ Bartlett, M. S. (1947). "The Use of Transformations". Biometrics. 3: 39–52. doi:10.2307/3001536.
This page was last edited on 31 October 2023, at 11:39
Basis of this page is in Wikipedia. Text is available under the CC BY-SA 3.0 Unported License. Non-text media are available under their specified licenses. Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc. WIKI 2 is an independent company and has no affiliation with Wikimedia Foundation.