Least-angle regression


Figure: standardized coefficients shown as a function of the proportion of shrinkage.

In statistics, least-angle regression (LARS) is an algorithm for fitting linear regression models to high-dimensional data, developed by Bradley Efron, Trevor Hastie, Iain Johnstone and Robert Tibshirani.[1]

Suppose we expect a response variable to be determined by a linear combination of a subset of potential covariates. Then the LARS algorithm provides a means of producing an estimate of which variables to include, as well as their coefficients.

Instead of giving a single vector result, the LARS solution consists of a curve denoting the solution for each value of the L1 norm of the parameter vector. The algorithm is similar to forward stepwise regression, but instead of including variables one at a time at each step, the estimated parameters are increased in a direction equiangular to each predictor's correlation with the residual.
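To make the path view concrete, the following is a minimal sketch, assuming scikit-learn is available; the synthetic data and variable names are illustrative and not from the original article. It computes the piecewise-linear coefficient path and reports the growing L1 norm of the coefficients at each breakpoint.

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import lars_path

    # Synthetic data: 50 samples, 10 candidate predictors, 4 truly informative.
    X, y = make_regression(n_samples=50, n_features=10, n_informative=4,
                           noise=1.0, random_state=0)

    # method="lar" gives the pure least-angle path (method="lasso" gives the lasso variant).
    alphas, active, coefs = lars_path(X, y, method="lar")

    # coefs has one column per breakpoint of the piecewise-linear path;
    # the L1 norm of the coefficients grows as predictors enter the model.
    for k in range(coefs.shape[1]):
        print(f"step {k}: active so far = {active[:k]}, "
              f"||beta||_1 = {np.abs(coefs[:, k]).sum():.3f}")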


Pros and cons

The advantages of the LARS method are:

  1. It is computationally just as fast as forward selection.
  2. It produces a full piecewise linear solution path, which is useful in cross-validation or similar attempts to tune the model (see the sketch following this list).
  3. If two variables are almost equally correlated with the response, then their coefficients should increase at approximately the same rate. The algorithm thus behaves as intuition would suggest, and is also more stable.
  4. It is easily modified to produce efficient algorithms for other methods producing similar results, like the lasso and forward stagewise regression.
  5. It is effective in contexts where p ≫ n (i.e., when the number of predictors p is significantly greater than the number of points n).[2]
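Because the whole solution path is available from a single fit (advantage 2 above), choosing where to stop along the path folds naturally into cross-validation. A minimal sketch, assuming scikit-learn's LarsCV estimator and illustrative synthetic data:

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import LarsCV

    X, y = make_regression(n_samples=100, n_features=20, n_informative=5,
                           noise=2.0, random_state=1)

    # LarsCV fits the LARS path on each training fold and picks the stopping
    # point (alpha) with the best held-out error.
    model = LarsCV(cv=5).fit(X, y)
    print("chosen alpha:", model.alpha_)
    print("number of nonzero coefficients:", np.count_nonzero(model.coef_))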

The disadvantages of the LARS method include:

  1. With any amount of noise in the dependent variable and with high dimensional multicollinear independent variables, there is no reason to believe that the selected variables will have a high probability of being the actual underlying causal variables. This problem is not unique to LARS, as it is a general problem with variable selection approaches that seek to find underlying deterministic components. Yet, because LARS is based upon an iterative refitting of the residuals, it appears to be especially sensitive to the effects of noise. This problem is discussed in detail by Weisberg in the discussion section of the Efron et al. (2004) Annals of Statistics article.[3] Weisberg provides an empirical example, based upon a re-analysis of data originally used to validate LARS, showing that the variable selection appears to have problems with highly correlated variables.
  2. Since almost all high dimensional data in the real world will just by chance exhibit some degree of collinearity across at least some variables, the problem that LARS has with correlated variables may limit its application to high dimensional data.

Algorithm

The basic steps of the Least-angle regression algorithm are:

  • Start with all coefficients equal to zero.
  • Find the predictor x_j most correlated with the response y.
  • Increase the coefficient β_j in the direction of the sign of its correlation with y, taking residuals r = y − ŷ along the way. Stop when some other predictor x_k has as much correlation with r as x_j has.
  • Increase (β_j, β_k) in their joint least squares direction, until some other predictor x_m has as much correlation with the residual r.
  • Increase (β_j, β_k, β_m) in their joint least squares direction, until some other predictor has as much correlation with the residual r.
  • Continue until all predictors are in the model.[4]
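These steps translate fairly directly into code. The following is a minimal NumPy sketch of the pure LARS path, not a reference implementation: it assumes the columns of X are centered and scaled to unit norm, y is centered, and no degenerate ties occur; the function name and structure are illustrative.

    import numpy as np

    def lars_path_sketch(X, y, max_steps=None):
        """Minimal sketch of the pure LARS path (illustrative only).
        Assumes centered, unit-norm columns of X and centered y.
        Returns the coefficient vector at each breakpoint of the path."""
        n, p = X.shape
        max_steps = p if max_steps is None else max_steps
        mu = np.zeros(n)              # current fitted vector
        beta = np.zeros(p)            # current coefficients
        active = []                   # indices of predictors in the model
        path = [beta.copy()]

        for _ in range(max_steps):
            c = X.T @ (y - mu)        # current correlations with the residual
            C = np.max(np.abs(c))
            if not active:            # first step: most correlated predictor
                active.append(int(np.argmax(np.abs(c))))
            s = np.sign(c[active])    # signs of the active correlations
            XA = X[:, active] * s     # sign-adjusted active columns
            G = XA.T @ XA
            Ginv1 = np.linalg.solve(G, np.ones(len(active)))
            AA = 1.0 / np.sqrt(Ginv1.sum())
            w = AA * Ginv1
            u = XA @ w                # equiangular direction
            a = X.T @ u

            if len(active) < p:
                # Smallest positive step at which an inactive predictor ties
                # the active set in correlation with the residual.
                candidates = []
                for j in range(p):
                    if j in active:
                        continue
                    for g in ((C - c[j]) / (AA - a[j]), (C + c[j]) / (AA + a[j])):
                        if g > 1e-12:
                            candidates.append((g, j))
                gamma, j_new = min(candidates)
            else:
                gamma, j_new = C / AA, None   # final step: full least squares fit

            mu += gamma * u
            beta[active] += gamma * s * w
            path.append(beta.copy())
            if j_new is not None:
                active.append(j_new)

        return path

Each entry of the returned path is the coefficient vector at one breakpoint of the piecewise-linear path; the final entry coincides with the ordinary least squares fit on all predictors.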

Software implementation

Least-angle regression is implemented in R via the lars package, in Python with the scikit-learn package, and in SAS via the GLMSELECT procedure.
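As an illustration, a minimal fit with scikit-learn's Lars estimator (the toy data and parameter choices below are assumptions, not taken from the article):

    import numpy as np
    from sklearn.linear_model import Lars

    # Toy data: the response depends on the first two of four predictors.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))
    y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)

    # Stop the path after two predictors have entered the model.
    model = Lars(n_nonzero_coefs=2).fit(X, y)
    print("coefficients:", np.round(model.coef_, 2))   # approximately [3, -2, 0, 0]
    print("path breakpoints (one column per step):")
    print(np.round(model.coef_path_, 2))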

References

  1. ^ Efron, Bradley; Hastie, Trevor; Johnstone, Iain; Tibshirani, Robert (2004). "Least Angle Regression" (PDF). Annals of Statistics. 32 (2): 407–499. arXiv:math/0406456. doi:10.1214/009053604000000067. MR 2060166. S2CID 204004121.
  2. ^ Hastie, Trevor; Tibshirani, Robert; Friedman, Jerome (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction (PDF) (2nd ed.). Springer Series in Statistics. Springer New York. p. 76. doi:10.1007/978-0-387-84858-7. ISBN 978-0-387-84857-0.
  3. ^ See the discussion by Weisberg following Efron, Bradley; Hastie, Trevor; Johnstone, Iain; Tibshirani, Robert (2004). "Least Angle Regression" (PDF). Annals of Statistics. 32 (2): 407–499. arXiv:math/0406456. doi:10.1214/009053604000000067. MR 2060166. S2CID 204004121.
  4. ^ "A simple explanation of the Lasso and Least Angle Regression". Archived from the original on 2015-06-21.