Residual sum of squares

From Wikipedia, the free encyclopedia

In statistics, the residual sum of squares (RSS), also known as the sum of squared residuals (SSR) or the sum of squared estimate of errors (SSE), is the sum of the squares of residuals (deviations of predicted values from the actual empirical values of the data). It is a measure of the discrepancy between the data and an estimation model, such as a linear regression. A small RSS indicates a tight fit of the model to the data. It is used as an optimality criterion in parameter selection and model selection.

In general, total sum of squares = explained sum of squares + residual sum of squares. For a proof of this in the multivariate ordinary least squares (OLS) case, see partitioning in the general OLS model.
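As a quick numerical illustration of this decomposition, the following sketch (using small made-up data, not from the article, and NumPy's polyfit for an ordinary least squares line with an intercept) computes the three sums of squares and checks that ESS + RSS equals TSS:

    import numpy as np

    # Hypothetical data for illustration only.
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

    # Fit a simple OLS line y = a*x + b.
    a, b = np.polyfit(x, y, 1)
    y_hat = a * x + b

    rss = np.sum((y - y_hat) ** 2)          # residual sum of squares
    ess = np.sum((y_hat - y.mean()) ** 2)   # explained sum of squares
    tss = np.sum((y - y.mean()) ** 2)       # total sum of squares

    print(rss, ess, tss, ess + rss)  # ESS + RSS equals TSS (up to rounding)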

One explanatory variable

In a model with a single explanatory variable, RSS is given by:[1]

    \mathrm{RSS} = \sum_{i=1}^{n} \left( y_i - f(x_i) \right)^2

where y_i is the i-th value of the variable to be predicted, x_i is the i-th value of the explanatory variable, and f(x_i) is the predicted value of y_i (also termed \hat{y}_i). In a standard linear simple regression model, y_i = \alpha + \beta x_i + \varepsilon_i, where \alpha and \beta are coefficients, y and x are the regressand and the regressor, respectively, and \varepsilon is the error term. The sum of squares of residuals is the sum of squares of \hat{\varepsilon}_i = y_i - \hat{\alpha} - \hat{\beta} x_i; that is

    \mathrm{RSS} = \sum_{i=1}^{n} \hat{\varepsilon}_i^{\,2} = \sum_{i=1}^{n} \left( y_i - \hat{\alpha} - \hat{\beta} x_i \right)^2

where \hat{\alpha} is the estimated value of the constant term \alpha and \hat{\beta} is the estimated value of the slope coefficient \beta.
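Since RSS is the criterion that ordinary least squares minimizes, a brief sketch (with made-up data; the rss helper below is hypothetical) can illustrate the formula: RSS evaluated at the closed-form estimates \hat{\alpha} and \hat{\beta} is smaller than at slightly perturbed coefficients.

    import numpy as np

    # Hypothetical data for illustration only.
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([1.1, 2.0, 2.8, 4.2, 4.9])

    def rss(alpha, beta):
        """Residual sum of squares for the line alpha + beta * x."""
        return np.sum((y - alpha - beta * x) ** 2)

    # Closed-form OLS estimates for the simple regression model.
    beta_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    alpha_hat = y.mean() - beta_hat * x.mean()

    # Perturbing either coefficient raises RSS above its minimum.
    print(rss(alpha_hat, beta_hat))
    print(rss(alpha_hat + 0.1, beta_hat), rss(alpha_hat, beta_hat + 0.1))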

Matrix expression for the OLS residual sum of squares

The general regression model with n observations and k explanators, the first of which is a constant unit vector whose coefficient is the regression intercept, is

    y = X \beta + e

where y is an n × 1 vector of dependent variable observations, each column of the n × k matrix X is a vector of observations on one of the k explanators, \beta is a k × 1 vector of true coefficients, and e is an n × 1 vector of the true underlying errors. The ordinary least squares estimator for \beta is

    \hat{\beta} = (X^{\mathsf T} X)^{-1} X^{\mathsf T} y.

The residual vector is \hat{e} = y - X \hat{\beta} = y - X (X^{\mathsf T} X)^{-1} X^{\mathsf T} y, so the residual sum of squares is

    \mathrm{RSS} = \hat{e}^{\mathsf T} \hat{e} = \| \hat{e} \|^2

(equivalent to the square of the norm of residuals). In full:

    \mathrm{RSS} = y^{\mathsf T} y - y^{\mathsf T} X (X^{\mathsf T} X)^{-1} X^{\mathsf T} y = y^{\mathsf T} \left[ I - X (X^{\mathsf T} X)^{-1} X^{\mathsf T} \right] y = y^{\mathsf T} [I - H] y,

where H is the hat matrix, or the projection matrix in linear regression.
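A minimal NumPy sketch of these matrix expressions, assuming randomly generated data with a constant column in X (none of it from the article), computes RSS both as the squared norm of the residual vector and through the hat matrix; the two agree up to rounding.

    import numpy as np

    # Hypothetical data: the first column of X is the constant unit vector,
    # so its coefficient is the intercept.
    rng = np.random.default_rng(0)
    n, k = 50, 3
    X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
    beta_true = np.array([1.0, 2.0, -0.5])
    y = X @ beta_true + rng.normal(scale=0.3, size=n)

    # OLS estimator and residual vector.
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
    e_hat = y - X @ beta_hat

    # RSS as the squared norm of the residuals ...
    rss_direct = e_hat @ e_hat

    # ... and via the hat (projection) matrix H = X (X'X)^{-1} X'.
    H = X @ np.linalg.solve(X.T @ X, X.T)
    rss_hat = y @ (np.eye(n) - H) @ y

    print(rss_direct, rss_hat)  # the two expressions agree up to rounding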

Relation with Pearson's product-moment correlation

The least-squares regression line is given by

    y = a x + b,

where b = \bar{y} - a \bar{x} and a = \frac{S_{xy}}{S_{xx}}, with S_{xy} = \sum_{i=1}^{n} (\bar{x} - x_i)(\bar{y} - y_i) and S_{xx} = \sum_{i=1}^{n} (\bar{x} - x_i)^2.

Therefore,

    \mathrm{RSS} = \sum_{i=1}^{n} \left( y_i - f(x_i) \right)^2 = \sum_{i=1}^{n} \left( y_i - (a x_i + b) \right)^2 = \sum_{i=1}^{n} \left( a(\bar{x} - x_i) - (\bar{y} - y_i) \right)^2
                = a^2 S_{xx} - 2 a S_{xy} + S_{yy} = S_{yy} - a S_{xy} = S_{yy} \left( 1 - \frac{S_{xy}^2}{S_{xx} S_{yy}} \right),

where S_{yy} = \sum_{i=1}^{n} (\bar{y} - y_i)^2.

The Pearson product-moment correlation is given by r = \frac{S_{xy}}{\sqrt{S_{xx} S_{yy}}}; therefore, \mathrm{RSS} = S_{yy} (1 - r^2).
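The identity RSS = S_{yy}(1 - r^2) can also be checked numerically; the sketch below, again using made-up data, computes both sides directly.

    import numpy as np

    # Hypothetical data for illustration only.
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    y = np.array([1.2, 2.3, 2.9, 4.4, 4.9, 6.3])

    x_bar, y_bar = x.mean(), y.mean()
    S_xx = np.sum((x_bar - x) ** 2)
    S_yy = np.sum((y_bar - y) ** 2)
    S_xy = np.sum((x_bar - x) * (y_bar - y))

    # Least-squares line y = a*x + b.
    a = S_xy / S_xx
    b = y_bar - a * x_bar

    rss = np.sum((y - (a * x + b)) ** 2)
    r = S_xy / np.sqrt(S_xx * S_yy)      # Pearson correlation

    print(rss, S_yy * (1 - r ** 2))      # the two values agree up to rounding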

See also

References

  1. ^ Archdeacon, Thomas J. (1994). Correlation and Regression Analysis: A Historian's Guide. University of Wisconsin Press. pp. 161–162. ISBN 0-299-13650-7. OCLC 27266095.
  • Draper, N.R.; Smith, H. (1998). Applied Regression Analysis (3rd ed.). John Wiley. ISBN 0-471-17082-8.