# Method of moments (statistics)

In statistics, the method of moments is a method of estimation of population parameters.

It starts by expressing the population moments (i.e., the expected values of powers of the random variable under consideration) as functions of the parameters of interest. Those expressions are then set equal to the sample moments. The number of such equations is the same as the number of parameters to be estimated. Those equations are then solved for the parameters of interest. The solutions are estimates of those parameters. The method of moments was introduced by Pafnuty Chebyshev in 1887.

#### Transcription

Hey guys, in this video I'm going to do this statistics and probability problem. We are given a continuous PDF, and we're asked to find the method of moments estimator for theta, which is theta tilde. From that, we're supposed to determine whether the estimator is unbiased or biased. For these types of method of moments problems, sometimes you're asked to match a specific moment, using something like x squared for example, in which case you would use the second moment. But in this particular problem we're not asked for any particular moment, so we can just use the first moment. For these problems we're basically going to set the sample mean equal to the expected value. We're going to find the expected value of x given our PDF, which we're also treating as the population. Once we have the expected value, we set the sample mean equal to it and solve for theta, and that gives our method of moments estimator: something with the sample mean on one side and theta tilde on the other. Then for the second part, we take the expected value of the estimator, and if it equals theta, then it's unbiased.

Let's go ahead and do the problem. The first thing we need to find is the expected value of x, which is the integral from 0 to theta of x times the PDF. Since we're doing the first moment, the factor in front is just x to the first power. The PDF is a over theta to the a, times x to the a minus 1. So the integral is from 0 to theta of x to the a, times a over theta to the a. Integrating gives x to the a plus 1 over a plus 1, times a over theta to the a, evaluated from 0 to theta. Plugging in theta gives theta to the a plus 1 times a, over a plus 1 times theta to the a. We can cancel a factor of theta to the a, which leaves a theta over a plus 1.

In this case we want to think of this in terms of the sample, so we set the sample mean equal to the expected value we just computed. Solving for theta gives a plus 1 times the sample mean over a, and that's our method of moments estimator for theta. The second part of the problem asks whether it's biased or unbiased, and if it's biased, whether it overestimates or underestimates. The estimator is unbiased if the expected value of the estimate equals the population parameter, so we take the expected value of a plus 1 times the sample mean over a. We can pull out a plus 1 over a because those are constants, and the expected value of the sample mean is mu, which from the first part is a theta over a plus 1. The a plus 1 and the a cancel, so the expected value of theta tilde is equal to theta. In this case theta tilde is unbiased, and we are done with this problem.
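The transcript's conclusion can be checked numerically. Below is a minimal Monte Carlo sketch (the names `a`, `theta`, and the inverse-CDF sampler are my additions, derived from the PDF f(x) = a·x^(a−1)/θ^a on [0, θ] worked above): repeated samples are drawn, the estimator (a+1)·x̄/a is computed each time, and its average should hover near the true θ if the estimator is unbiased.

```python
import random

# Monte Carlo check of the transcript's estimator; `a` and `theta` are
# illustrative values, not from the text.
random.seed(42)
a, theta = 2.0, 5.0          # shape a and the parameter of interest theta
n_samples, n_trials = 500, 2000

estimates = []
for _ in range(n_trials):
    # Inverse-CDF sampling: F(x) = (x/theta)**a, so x = theta * U**(1/a)
    xs = [theta * random.random() ** (1.0 / a) for _ in range(n_samples)]
    sample_mean = sum(xs) / n_samples
    # Method of moments estimator from the transcript: (a+1) * x_bar / a
    estimates.append((a + 1) * sample_mean / a)

avg = sum(estimates) / n_trials
print(avg)  # should land close to theta = 5.0 if the estimator is unbiased
```

With 2000 trials of 500 samples each, the average of the estimates typically agrees with θ to within a few hundredths, consistent with the unbiasedness shown in the transcript.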

## Method

Suppose that the problem is to estimate ${\displaystyle k}$ unknown parameters ${\displaystyle \theta _{1},\theta _{2},\dots ,\theta _{k}}$ characterizing the distribution ${\displaystyle f_{W}(w;\theta )}$ of the random variable ${\displaystyle W}$.[1] Suppose the first ${\displaystyle k}$ moments of the true distribution (the "population moments") can be expressed as functions of the ${\displaystyle \theta }$s:

{\displaystyle {\begin{aligned}\mu _{1}&\equiv \operatorname {E} [W]=g_{1}(\theta _{1},\theta _{2},\ldots ,\theta _{k}),\\[4pt]\mu _{2}&\equiv \operatorname {E} [W^{2}]=g_{2}(\theta _{1},\theta _{2},\ldots ,\theta _{k}),\\&\,\,\,\vdots \\\mu _{k}&\equiv \operatorname {E} [W^{k}]=g_{k}(\theta _{1},\theta _{2},\ldots ,\theta _{k}).\end{aligned}}}

Suppose a sample of size ${\displaystyle n}$ is drawn, resulting in the values ${\displaystyle w_{1},\dots ,w_{n}}$. For ${\displaystyle j=1,\dots ,k}$, let

${\displaystyle {\widehat {\mu }}_{j}={\frac {1}{n}}\sum _{i=1}^{n}w_{i}^{j}}$

be the j-th sample moment, an estimate of ${\displaystyle \mu _{j}}$. The method of moments estimator for ${\displaystyle \theta _{1},\theta _{2},\ldots ,\theta _{k}}$, denoted by ${\displaystyle {\widehat {\theta }}_{1},{\widehat {\theta }}_{2},\dots ,{\widehat {\theta }}_{k}}$, is defined as the solution (if there is one) to the equations:

{\displaystyle {\begin{aligned}{\widehat {\mu }}_{1}&=g_{1}({\widehat {\theta }}_{1},{\widehat {\theta }}_{2},\ldots ,{\widehat {\theta }}_{k}),\\[4pt]{\widehat {\mu }}_{2}&=g_{2}({\widehat {\theta }}_{1},{\widehat {\theta }}_{2},\ldots ,{\widehat {\theta }}_{k}),\\&\,\,\,\vdots \\{\widehat {\mu }}_{k}&=g_{k}({\widehat {\theta }}_{1},{\widehat {\theta }}_{2},\ldots ,{\widehat {\theta }}_{k}).\end{aligned}}}
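As a minimal concrete instance of this recipe for k = 2, consider the normal distribution N(μ, σ²) (my choice of example, not from the text): the population moments are μ₁ = μ and μ₂ = μ² + σ², so equating them to the sample moments and solving gives μ̂ = m₁ and σ̂ = √(m₂ − m₁²).

```python
import random

# Sketch of the k = 2 recipe for N(mu, sigma^2):
#   mu_1 = E[W]   = mu
#   mu_2 = E[W^2] = mu**2 + sigma**2
# Equating sample moments to these and solving yields the estimators below.
random.seed(0)
mu, sigma, n = 3.0, 2.0, 100_000
w = [random.gauss(mu, sigma) for _ in range(n)]

m1 = sum(w) / n                  # first sample moment
m2 = sum(x * x for x in w) / n   # second sample moment

mu_hat = m1
sigma_hat = (m2 - m1 ** 2) ** 0.5
print(mu_hat, sigma_hat)  # both should be near the true values 3.0 and 2.0
```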

The method of moments is fairly simple and yields consistent estimators (under very weak assumptions), though these estimators are often biased.

In some respects, when estimating parameters of a known family of probability distributions, this method was superseded by Fisher's method of maximum likelihood, because maximum likelihood estimators have a higher probability of being close to the quantities to be estimated and are more often unbiased.

However, in some cases the likelihood equations may be intractable without computers, whereas the method-of-moments estimators can be quickly and easily calculated by hand.

Estimates by the method of moments may be used as the first approximation to the solutions of the likelihood equations, and successive improved approximations may then be found by the Newton–Raphson method. In this way the method of moments can assist in finding maximum likelihood estimates.
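This workflow can be sketched for the gamma distribution (the gamma example and the finite-difference stand-ins for the digamma and trigamma functions are my assumptions, not from the text): method-of-moments estimates k̂ = x̄²/s² and θ̂ = s²/x̄ seed a Newton–Raphson iteration on the likelihood equation ln k − ψ(k) = ln x̄ − mean(ln x) for the shape.

```python
import math
import random

# MoM estimates as the starting point for Newton-Raphson on the gamma
# likelihood equation; k_true and scale_true are illustrative values.
random.seed(1)
k_true, scale_true, n = 3.0, 2.0, 50_000
x = [random.gammavariate(k_true, scale_true) for _ in range(n)]

mean = sum(x) / n
var = sum((xi - mean) ** 2 for xi in x) / n

# Method-of-moments starting value: mean = k*theta, var = k*theta**2
k = mean ** 2 / var
# The shape MLE solves  ln(k) - digamma(k) = ln(mean) - mean(ln x)
s = math.log(mean) - sum(math.log(xi) for xi in x) / n

def digamma(t, h=1e-5):
    # central finite difference of lgamma (numeric stand-in for psi)
    return (math.lgamma(t + h) - math.lgamma(t - h)) / (2 * h)

def trigamma(t, h=1e-4):
    # second finite difference of lgamma (numeric stand-in for psi')
    return (math.lgamma(t + h) - 2 * math.lgamma(t) + math.lgamma(t - h)) / h ** 2

for _ in range(20):  # Newton-Raphson refinement of the shape parameter
    g = math.log(k) - digamma(k) - s
    k -= g / (1.0 / k - trigamma(k))
theta = mean / k     # scale MLE given the fitted shape
print(k, theta)      # should be near the true values 3.0 and 2.0
```

Because the method-of-moments start is already close to the maximizer, the Newton iteration converges in a handful of steps.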

In some cases, infrequent with large samples but not so infrequent with small samples, the estimates given by the method of moments fall outside the parameter space (as shown in the example below), and it then does not make sense to rely on them. That problem never arises in the method of maximum likelihood. Also, estimates by the method of moments are not necessarily sufficient statistics, i.e., they sometimes fail to take into account all relevant information in the sample.

When estimating other structural parameters (e.g., parameters of a utility function, instead of parameters of a known probability distribution), appropriate probability distributions may not be known, and moment-based estimates may be preferred to maximum likelihood estimation.

## Examples

An example application of the method of moments is to estimate polynomial probability density distributions. In this case, an approximate polynomial of order ${\displaystyle N}$ is defined on an interval ${\displaystyle [a,b]}$. The method of moments then yields a system of equations, whose solution involves the inversion of a Hankel matrix.[2]

### Uniform distribution

Consider the uniform distribution on the interval ${\displaystyle [a,b]}$, ${\displaystyle U(a,b)}$. If ${\displaystyle W\sim U(a,b)}$ then we have

${\displaystyle \mu _{1}=\operatorname {E} [W]={\frac {1}{2}}(a+b)}$
${\displaystyle \mu _{2}=\operatorname {E} [W^{2}]={\frac {1}{3}}(a^{2}+ab+b^{2})}$

Solving these equations (using ${\displaystyle a+b=2\mu _{1}}$ and ${\displaystyle (b-a)^{2}=12(\mu _{2}-\mu _{1}^{2})}$, and taking ${\displaystyle {\widehat {a}}\leq {\widehat {b}}}$) gives

${\displaystyle {\widehat {a}}=\mu _{1}-{\sqrt {3\left(\mu _{2}-\mu _{1}^{2}\right)}}}$
${\displaystyle {\widehat {b}}=\mu _{1}+{\sqrt {3\left(\mu _{2}-\mu _{1}^{2}\right)}}}$

Given a set of samples ${\displaystyle \{w_{i}\}}$ we can use the sample moments ${\displaystyle {\widehat {\mu }}_{1}}$ and ${\displaystyle {\widehat {\mu }}_{2}}$ in these formulae in order to estimate ${\displaystyle a}$ and ${\displaystyle b}$.

Note, however, that this method can produce inconsistent results in some cases. For example, the set of samples ${\displaystyle \{0,0,0,0,1\}}$ results in the estimate ${\displaystyle {\widehat {a}}={\frac {1}{5}}-{\frac {2{\sqrt {3}}}{5}},{\widehat {b}}={\frac {1}{5}}+{\frac {2{\sqrt {3}}}{5}}}$ even though ${\displaystyle {\widehat {b}}<1}$, so it is impossible for the set ${\displaystyle \{0,0,0,0,1\}}$ to have been drawn from ${\displaystyle U({\widehat {a}},{\widehat {b}})}$ in this case.
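The inconsistency example above can be reproduced directly from the formulas in this section (taking the root with the minus sign for â, since a ≤ b):

```python
import math

# Reproducing the {0, 0, 0, 0, 1} inconsistency example for U(a, b).
samples = [0, 0, 0, 0, 1]
n = len(samples)
m1 = sum(samples) / n                  # first sample moment  = 0.2
m2 = sum(s * s for s in samples) / n   # second sample moment = 0.2

half_width = math.sqrt(3 * (m2 - m1 ** 2))
a_hat = m1 - half_width
b_hat = m1 + half_width
print(a_hat, b_hat)  # roughly -0.493 and 0.893

# b_hat < 1 even though the sample contains the value 1, so the fitted
# U(a_hat, b_hat) could not have generated this sample.
assert max(samples) > b_hat
```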