Mean squared prediction error

From Wikipedia, the free encyclopedia

In statistics, the mean squared prediction error (MSPE), also known as the mean squared error of the predictions, of a smoothing, curve fitting, or regression procedure is the expected value of the squared prediction errors (PE), the squared difference between the fitted values implied by the predictive function $\hat{g}$ and the values of the (unobservable) true function $g$. It is an inverse measure of the explanatory power of $\hat{g}$, and can be used in the process of cross-validation of an estimated model. Knowledge of $g$ would be required in order to calculate the MSPE exactly; in practice, the MSPE is estimated.[1]


Formulation

If the smoothing or fitting procedure has projection matrix (i.e., hat matrix) $L$, which maps the observed values vector $y$ to the predicted values vector $\hat{y}$ via $\hat{y} = Ly$, then PE and MSPE are formulated as

$$\operatorname{PE}_i = g(x_i) - \hat{g}(x_i),$$

$$\operatorname{MSPE} = \frac{1}{n} \sum_{i=1}^{n} \operatorname{E}\!\left[\operatorname{PE}_i^2\right].$$

The MSPE can be decomposed into two terms: the squared bias (mean error) of the fitted values and the variance of the fitted values:

$$n \cdot \operatorname{MSPE} = \sum_{i=1}^{n} \left(\operatorname{E}\!\left[\hat{g}(x_i)\right] - g(x_i)\right)^2 + \sum_{i=1}^{n} \operatorname{Var}\!\left[\hat{g}(x_i)\right].$$

The quantity $\operatorname{SSPE} = n \cdot \operatorname{MSPE}$ is called the sum squared prediction error. The root mean squared prediction error is the square root of the MSPE: $\operatorname{RMSPE} = \sqrt{\operatorname{MSPE}}$.
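
To make these definitions concrete, the following is a minimal simulation sketch in Python with NumPy (the sine truth, the quadratic smoother, and all variable names are illustrative assumptions, not part of the article). It builds an explicit hat matrix $L$, draws repeated noisy samples from a known $g$, and checks the MSPE, its bias-variance decomposition, and the RMSPE by Monte Carlo:

    import numpy as np

    rng = np.random.default_rng(0)

    # Known truth (unobservable in practice): g(x) = sin(x), noise level sigma.
    n, sigma = 50, 0.3
    x = np.linspace(0.0, np.pi, n)
    g = np.sin(x)

    # A linear smoother: least squares on (1, x, x^2), with explicit hat matrix
    # L = X (X'X)^{-1} X', so that g_hat = L y.
    X = np.column_stack([np.ones(n), x, x**2])
    L = X @ np.linalg.solve(X.T @ X, X.T)

    # Monte Carlo over repeated noise draws: one sample y per row.
    reps = 20_000
    Y = g + sigma * rng.standard_normal((reps, n))
    fits = Y @ L.T                         # g_hat = L y for every draw

    pe = g - fits                          # prediction errors PE_i
    mspe = np.mean(pe ** 2)                # MSPE = (1/n) sum_i E[PE_i^2]
    bias2 = np.mean((fits.mean(axis=0) - g) ** 2)  # average squared bias
    var = np.mean(fits.var(axis=0))                # average variance of fits
    print(mspe, bias2 + var)               # the decomposition: these agree
    print(np.sqrt(mspe))                   # RMSPE

Because $g$ is known in this simulation, the MSPE can be computed directly; in practice it must be estimated, as the following sections describe.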

Computation of MSPE over out-of-sample data

The mean squared prediction error can be computed exactly in two contexts. First, with a data sample of length n, the data analyst may run the regression over only q of the data points (with q < n), holding back the other n − q data points with the specific purpose of using them to compute the estimated model's MSPE out of sample (i.e., not using data that were used in the model estimation process; a sketch of this holdout procedure appears below). Since the regression process is tailored to the q in-sample points, normally the in-sample MSPE will be smaller than the out-of-sample MSPE computed over the n − q held-back points. If the out-of-sample MSPE is only slightly larger than the in-sample MSPE, the model is viewed favorably. If two models are to be compared, the one with the lower MSPE over the n − q out-of-sample data points is viewed more favorably, regardless of the models' relative in-sample performances. The out-of-sample MSPE in this context is exact for the out-of-sample data points that it was computed over, but is merely an estimate of the model's MSPE for the mostly unobserved population from which the data were drawn.

Second, as time goes on more data may become available to the data analyst, and then the MSPE can be computed over these new data.
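
The following is a sketch of the first (held-back data) context, again in Python with NumPy; the split q = 70 of n = 100 points, the two candidate models, and all names are illustrative assumptions. Prediction errors are measured against the observed held-back values, since the true $g$ is unobservable:

    import numpy as np

    rng = np.random.default_rng(1)

    # Simulated observations; in practice only (x, y) would be available.
    n, q, sigma = 100, 70, 0.3
    x = rng.uniform(0.0, np.pi, n)
    y = np.sin(x) + sigma * rng.standard_normal(n)

    # Hold back n - q points; estimate each candidate model on the first q only.
    idx = rng.permutation(n)
    train, test = idx[:q], idx[q:]

    def mspe_in_out(design):
        """In-sample and out-of-sample MSPE for one candidate design matrix."""
        beta = np.linalg.lstsq(design(x[train]), y[train], rcond=None)[0]
        mspe_in = np.mean((y[train] - design(x[train]) @ beta) ** 2)
        mspe_out = np.mean((y[test] - design(x[test]) @ beta) ** 2)
        return mspe_in, mspe_out

    # Two candidate models to compare: a line and a quadratic.
    linear = lambda x: np.column_stack([np.ones_like(x), x])
    quadratic = lambda x: np.column_stack([np.ones_like(x), x, x**2])

    print("linear    in/out:", mspe_in_out(linear))
    print("quadratic in/out:", mspe_in_out(quadratic))

On typical draws the out-of-sample figure exceeds the in-sample one, and the model with the lower out-of-sample MSPE is the one favored, as described above.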

Estimation of MSPE over the population

When the model has been estimated over all available data with none held back, the MSPE of the model over the entire population of mostly unobserved data can be estimated as follows.

For the model $y_i = g(x_i) + \sigma \varepsilon_i$, where $\varepsilon_i \sim \mathcal{N}(0, 1)$, one may write

$$n \cdot \operatorname{MSPE}(L) = g^{\mathsf{T}} (I - L)^{\mathsf{T}} (I - L) g + \sigma^2 \operatorname{tr}\!\left[L^{\mathsf{T}} L\right].$$

Using in-sample data values, the first term on the right side is equivalent to

$$g^{\mathsf{T}} (I - L)^{\mathsf{T}} (I - L) g = \operatorname{E}\!\left[\sum_{i=1}^{n} \left(y_i - \hat{g}(x_i)\right)^2\right] - \sigma^2 \operatorname{tr}\!\left[(I - L)^{\mathsf{T}} (I - L)\right].$$

Thus, since $\operatorname{tr}\!\left[(I - L)^{\mathsf{T}} (I - L)\right] = n - 2\operatorname{tr}\!\left[L\right] + \operatorname{tr}\!\left[L^{\mathsf{T}} L\right]$, so that the $\operatorname{tr}\!\left[L^{\mathsf{T}} L\right]$ terms cancel,

$$n \cdot \operatorname{MSPE}(L) = \operatorname{E}\!\left[\sum_{i=1}^{n} \left(y_i - \hat{g}(x_i)\right)^2\right] - \sigma^2 \left(n - 2 \operatorname{tr}\!\left[L\right]\right).$$

If $\sigma^2$ is known or well-estimated by $\widehat{\sigma}^2$, it becomes possible to estimate MSPE by

$$n \cdot \widehat{\operatorname{MSPE}}(L) = \sum_{i=1}^{n} \left(y_i - \hat{g}(x_i)\right)^2 - \widehat{\sigma}^2 \left(n - 2 \operatorname{tr}\!\left[L\right]\right).$$

Colin Mallows advocated this method in the construction of his model selection statistic $C_p$, which is a normalized version of the estimated MSPE, $C_p = n \cdot \widehat{\operatorname{MSPE}}(L) / \widehat{\sigma}^2$:

$$C_p = \frac{\sum_{i=1}^{n} \left(y_i - \hat{g}(x_i)\right)^2}{\widehat{\sigma}^2} - n + 2p,$$

where $p$ is the number of estimated parameters (so that $\operatorname{tr}\!\left[L\right] = p$ for a linear regression fit) and $\widehat{\sigma}^2$ is computed from the version of the model that includes all possible regressors.
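
The following is a sketch of this estimator for an ordinary least squares fit, where $\operatorname{tr}\!\left[L\right]$ equals the number of fitted parameters $p$; the simulated design, the coefficient values, and the nested-submodel setup are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(2)

    # Simulated data from a known linear truth plus noise; only the first
    # three columns of the full design actually matter.
    n, sigma = 200, 1.0
    X_full = np.column_stack([np.ones(n), rng.standard_normal((n, 4))])
    beta_true = np.array([1.0, 2.0, -1.0, 0.0, 0.0])
    y = X_full @ beta_true + sigma * rng.standard_normal(n)

    def rss(X, y):
        """Residual sum of squares of an OLS fit of y on X."""
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        r = y - X @ beta
        return r @ r

    # sigma^2 estimated from the model with all candidate regressors.
    p_full = X_full.shape[1]
    sigma2_hat = rss(X_full, y) / (n - p_full)

    # For each nested submodel (first p columns), tr(L) = p, so
    # n * MSPE_hat = RSS_p - sigma2_hat * (n - 2p) and Cp = n * MSPE_hat / sigma2_hat.
    for p in range(1, p_full + 1):
        rss_p = rss(X_full[:, :p], y)
        mspe_hat = (rss_p - sigma2_hat * (n - 2 * p)) / n
        cp = rss_p / sigma2_hat - n + 2 * p
        print(p, round(mspe_hat, 4), round(cp, 1))
    # Cp near p flags a model with small estimated MSPE; here from p = 3 on.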

See also

References

  1. ^ Pindyck, Robert S.; Rubinfeld, Daniel L. (1991). "Forecasting with Time-Series Models". Econometric Models & Economic Forecasts (3rd ed.). New York: McGraw-Hill. pp. 516–535. ISBN 0-07-050098-3.