In statistics, the method of moments is a method of estimating population parameters.
It starts by expressing the population moments (i.e., the expected values of powers of the random variable under consideration) as functions of the parameters of interest. Those expressions are then set equal to the sample moments. The number of such equations is the same as the number of parameters to be estimated. Those equations are then solved for the parameters of interest. The solutions are estimates of those parameters. The method of moments was introduced by Pafnuty Chebyshev in 1887.
Method
Suppose that the problem is to estimate $k$ unknown parameters $\theta_1, \theta_2, \dots, \theta_k$ characterizing the distribution $f_W(w; \theta)$ of the random variable $W$.^{[1]} Suppose the first $k$ moments of the true distribution (the "population moments") can be expressed as functions of the $\theta$s:

$$\mu_j \equiv \operatorname{E}[W^j] = g_j(\theta_1, \theta_2, \dots, \theta_k), \qquad j = 1, \dots, k.$$

Suppose a sample of size $n$ is drawn, resulting in the values $w_1, \dots, w_n$. For $j = 1, \dots, k$, let

$$\hat\mu_j = \frac{1}{n} \sum_{i=1}^{n} w_i^j$$

be the $j$-th sample moment, an estimate of $\mu_j$. The method of moments estimator for $\theta_1, \theta_2, \dots, \theta_k$, denoted by $\hat\theta_1, \hat\theta_2, \dots, \hat\theta_k$, is defined as the solution (if there is one) to the equations:^{[citation needed]}

$$\hat\mu_j = g_j(\hat\theta_1, \hat\theta_2, \dots, \hat\theta_k), \qquad j = 1, \dots, k.$$
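As a concrete illustration, the recipe above can be sketched in Python for a normal distribution, whose first two population moments are $\mu_1 = \mu$ and $\mu_2 = \sigma^2 + \mu^2$. The data-generating parameters below are arbitrary choices made only for the demonstration:

```python
import random

random.seed(42)

# Draw a sample from N(mu=5, sigma=2); these values are arbitrary and
# serve only to check the estimator afterwards.
data = [random.gauss(5.0, 2.0) for _ in range(100_000)]
n = len(data)

# Sample moments: mu_hat_j = (1/n) * sum(w_i ** j)
m1 = sum(data) / n
m2 = sum(w * w for w in data) / n

# Population moments of N(mu, sigma): mu_1 = mu, mu_2 = sigma^2 + mu^2.
# Setting the sample moments equal to these and solving for the
# parameters gives the method-of-moments estimates:
mu_hat = m1
sigma_hat = (m2 - m1 ** 2) ** 0.5

print(mu_hat, sigma_hat)
```

With a sample this large the estimates land close to the true values 5 and 2.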
Advantages and disadvantages
The method of moments is fairly simple and yields consistent estimators (under very weak assumptions), though these estimators are often biased.
In some respects, when estimating parameters of a known family of probability distributions, this method was superseded by Fisher's method of maximum likelihood, because maximum likelihood estimators have higher probability of being close to the quantities to be estimated and are more often unbiased^{[citation needed]}.
However, in some cases the likelihood equations may be intractable without computers, whereas the method-of-moments estimators can be quickly and easily calculated by hand.
Estimates by the method of moments may be used as the first approximation to the solutions of the likelihood equations, and successive improved approximations may then be found by the Newton–Raphson method. In this way the method of moments can assist in finding maximum likelihood estimates.
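For instance, the shape parameter of the gamma distribution has no closed-form maximum likelihood estimate, but the method-of-moments estimate $\hat k = \bar w^2 / s^2$ is a good starting point for Newton–Raphson on the likelihood equation $\ln k - \psi(k) = \ln \bar w - \overline{\ln w}$. The sketch below approximates the digamma and trigamma functions by finite differences of `math.lgamma`, a simplification (an assumption of this sketch, not part of the method) chosen to keep the example free of external libraries:

```python
import math
import random

def digamma(x, h=1e-5):
    # Numerical first derivative of log-gamma; a rough stand-in
    # for a library digamma implementation.
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

def trigamma(x, h=1e-4):
    # Numerical second derivative of log-gamma.
    return (math.lgamma(x + h) - 2 * math.lgamma(x) + math.lgamma(x - h)) / h ** 2

random.seed(0)
# Sample from a gamma distribution with true shape k=2, scale theta=3.
data = [random.gammavariate(2.0, 3.0) for _ in range(20_000)]
n = len(data)
mean = sum(data) / n
var = sum((w - mean) ** 2 for w in data) / n

# Method-of-moments starting value for the shape: k = mean^2 / var.
k = mean ** 2 / var

# Refine k by Newton-Raphson on f(k) = ln(k) - psi(k) - s = 0,
# where s = ln(mean) - mean of ln(w_i).
s = math.log(mean) - sum(math.log(w) for w in data) / n
for _ in range(10):
    f = math.log(k) - digamma(k) - s
    fprime = 1.0 / k - trigamma(k)
    k -= f / fprime

theta = mean / k  # maximum likelihood scale given the shape
print(k, theta)
```

The Newton iteration converges in a few steps precisely because the moment estimate already sits near the root.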
In some cases, infrequent with large samples but not so infrequent with small samples, the estimates given by the method of moments are outside of the parameter space (as shown in the example below); it does not make sense to rely on them then. That problem never arises in the method of maximum likelihood^{[citation needed]}. Also, estimates by the method of moments are not necessarily sufficient statistics, i.e., they sometimes fail to take into account all relevant information in the sample.
When estimating other structural parameters (e.g., parameters of a utility function, instead of parameters of a known probability distribution), appropriate probability distributions may not be known, and moment-based estimates may be preferred to maximum likelihood estimation.
Examples
An example application of the method of moments is to estimate polynomial probability density distributions. In this case, an approximate polynomial of order $n$ is defined on an interval $[a, b]$. The method of moments then yields a system of $n+1$ equations, whose solution involves the inversion of a Hankel matrix.^{[2]}
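A minimal sketch of this idea (an illustration under simplifying assumptions, not the exact algorithm of the cited paper): if the density is modelled as $p(x) = \sum_k c_k x^k$ on $[a, b]$, matching the moments $\int_a^b x^j p(x)\,dx = \mu_j$ for $j = 0, \dots, n$ gives a linear system whose matrix has entries $(b^{j+k+1} - a^{j+k+1})/(j+k+1)$; these depend only on $j + k$, which is what makes the matrix Hankel:

```python
def hankel_moment_fit(moments, a, b):
    """Coefficients c_k of p(x) = sum_k c_k x^k whose first len(moments)
    moments on [a, b] match the given moments."""
    n = len(moments)
    # Hankel matrix: entry (j, k) = integral of x^(j+k) over [a, b],
    # a function of j + k only.
    H = [[(b ** (j + k + 1) - a ** (j + k + 1)) / (j + k + 1)
          for k in range(n)] for j in range(n)]
    return gauss_solve(H, list(moments))

def gauss_solve(A, rhs):
    """Plain Gaussian elimination with partial pivoting."""
    n = len(rhs)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Moments 1, 1/2, 1/3 on [0, 1] belong to the uniform density p(x) = 1,
# so the fit should recover coefficients close to (1, 0, 0).
coeffs = hankel_moment_fit([1.0, 0.5, 1.0 / 3.0], 0.0, 1.0)
print(coeffs)
```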
Uniform distribution
Consider the uniform distribution on the interval $[a, b]$, $U(a, b)$. If $W \sim U(a, b)$ then we have

$$\mu_1 = \operatorname{E}[W] = \tfrac{1}{2}(a + b),$$
$$\mu_2 = \operatorname{E}[W^2] = \tfrac{1}{3}(a^2 + ab + b^2).$$

Solving these equations gives

$$\hat a = \mu_1 - \sqrt{3(\mu_2 - \mu_1^2)}, \qquad \hat b = \mu_1 + \sqrt{3(\mu_2 - \mu_1^2)}.$$

Given a set of samples $\{w_i\}$ we can use the sample moments $\hat\mu_1$ and $\hat\mu_2$ in these formulae in order to estimate $a$ and $b$.
Note, however, that this method can produce inconsistent results in some cases. For example, the set of samples $\{0, 0, 0, 0, 1\}$ results in the estimate $\hat b = \tfrac{1}{5} + \sqrt{12/25} \approx 0.893$ even though $\max(w_i) = 1$, and so it is impossible for the set to have been drawn from $U(\hat a, \hat b)$ in this case.
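The estimator and its failure mode can be checked directly. In this small sketch, `uniform_mom` is an illustrative helper written for this example, not a library function:

```python
import math

def uniform_mom(samples):
    """Method-of-moments estimates (a_hat, b_hat) for U(a, b)."""
    n = len(samples)
    m1 = sum(samples) / n
    m2 = sum(w * w for w in samples) / n
    # a_hat, b_hat = m1 -/+ sqrt(3 * (m2 - m1^2))
    half_width = math.sqrt(3.0 * (m2 - m1 * m1))
    return m1 - half_width, m1 + half_width

a_hat, b_hat = uniform_mom([0, 0, 0, 0, 1])
print(a_hat, b_hat)  # b_hat falls below the observed maximum of 1
```

Here the estimated upper endpoint is about 0.89, strictly less than the sample value 1, so the sample could not have come from the estimated distribution.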
References
 ^ K. O. Bowman and L. R. Shenton, "Estimator: Method of Moments", pp 2092–2098, Encyclopedia of statistical sciences, Wiley (1998).
 ^ J. Munkhammar, L. Mattsson, J. Rydén (2017) "Polynomial probability distribution estimation using the method of moments". PLoS ONE 12(4): e0174573. https://doi.org/10.1371/journal.pone.0174573