Least-Angle Regression Algorithm
A Least-Angle Regression Algorithm is a linear regression model selection algorithm that ...
- AKA: LARS Algorithm.
- Context:
- It can be implemented in a LARS System (to solve a LARS task); a minimal sketch follows the See: list below.
- See: Lasso Regression, Chi-Squared Statistic, Linear Regression, Model Selection Algorithm, L1 Norm, Stepwise Regression, Cross-Validation (Statistics), Selective Inference.
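As one concrete illustration of such a system, the sketch below fits a LARS model with scikit-learn's `Lars` estimator. This is a minimal sketch: the synthetic data, coefficient values, and stopping point are illustrative assumptions, not part of the original page.

```python
import numpy as np
from sklearn.linear_model import Lars

# Illustrative synthetic data (not from the article): 100 samples,
# 8 covariates, response driven by a sparse linear combination.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 8))
y = 3.0 * X[:, 1] - 2.0 * X[:, 5] + 0.5 * rng.standard_normal(100)

# Stop the LARS path once two coefficients are nonzero, i.e. use the
# algorithm as a model selection procedure.
model = Lars(n_nonzero_coefs=2)
model.fit(X, y)
print("estimated coefficients:", model.coef_)
```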
References
2015
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/least-angle_regression Retrieved:2015-7-20.
- In statistics, least-angle regression (LARS) is an algorithm for fitting linear regression models to high-dimensional data, developed by Bradley Efron, Trevor Hastie, Iain Johnstone and Robert Tibshirani. Suppose we expect a response variable to be determined by a linear combination of a subset of potential covariates. Then the LARS algorithm provides a means of producing an estimate of which variables to include, as well as their coefficients. Instead of giving a vector result, the LARS solution consists of a curve denoting the solution for each value of the L1 norm of the parameter vector. The algorithm is similar to forward stepwise regression, but instead of including variables at each step, the estimated parameters are increased in a direction equiangular to each one's correlations with the residual. The advantages of the LARS method are (see the sketch following this list):
1. It is computationally just as fast as forward selection.
2. It produces a full piecewise linear solution path, which is useful in cross-validation or similar attempts to tune the model.
3. If two variables are almost equally correlated with the response, then their coefficients should increase at approximately the same rate. The algorithm thus behaves as intuition would expect, and is also more stable.
4. It is easily modified to produce solutions for other estimators, like the lasso.
5. It is effective in contexts where p >> n (i.e., when the number of dimensions is significantly greater than the number of data points).
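To illustrate advantages 2 and 4, the sketch below uses scikit-learn's `lars_path`, which returns the breakpoints of the solution path; this is a hedged example assuming scikit-learn is available, with synthetic data standing in for a real dataset.

```python
import numpy as np
from sklearn.linear_model import lars_path

# Illustrative synthetic data: 50 samples, 10 covariates, sparse signal.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
true_coef = np.zeros(10)
true_coef[[0, 3, 7]] = [3.0, -2.0, 1.5]
y = X @ true_coef + 0.5 * rng.standard_normal(50)

# method='lar' computes the plain LARS path; method='lasso' gives the
# lasso modification mentioned in advantage 4.
alphas, active, coefs = lars_path(X, y, method='lar')

# coefs holds one column per breakpoint; between breakpoints the
# coefficients move linearly, so these vertices determine the whole
# piecewise linear solution path (advantage 2).
print("order in which variables entered:", active)
print("L1 norm of coefficients at each breakpoint:", np.abs(coefs).sum(axis=0))
```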
- The disadvantages of the LARS method include:
- With any amount of noise in the dependent variable and with high dimensional multicollinear independent variables, there is no reason to believe that the selected variables will have a high probability of being the actual underlying causal variables. This problem is not unique to LARS, as it is a general problem with variable selection approaches that seek to find underlying deterministic components. Yet, because LARS is based upon an iterative refitting of the residuals, it would appear to be especially sensitive to the effects of noise. This problem is discussed in detail by Weisberg in the discussion section of the Efron et al. (2004) Annals of Statistics article. [1] Weisberg provides an empirical example, based upon a re-analysis of data originally used to validate LARS, showing that the variable selection appears to have problems with highly correlated variables.
- Since almost all high dimensional real-world data will, just by chance, exhibit some fair degree of collinearity across at least some variables, the problem that LARS has with correlated variables may limit its application to high dimensional data (a toy demonstration follows this list).
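The following toy construction (my own minimal sketch, not Weisberg's re-analysis) shows the sensitivity empirically: duplicate a causal predictor up to a tiny perturbation and observe which copy enters the LARS path first across noise realizations.

```python
import numpy as np
from sklearn.linear_model import lars_path

# Two nearly collinear predictors (x1 is x0 plus a tiny perturbation)
# and one irrelevant predictor; the response depends only on x0.
rng = np.random.default_rng(1)
first_entries = []
for _ in range(20):
    x0 = rng.standard_normal(100)
    x1 = x0 + 0.01 * rng.standard_normal(100)  # near-duplicate of x0
    x2 = rng.standard_normal(100)
    X = np.column_stack([x0, x1, x2])
    y = 2.0 * x0 + 0.5 * rng.standard_normal(100)
    _, active, _ = lars_path(X, y, method='lar')
    first_entries.append(active[0])

# Even though only column 0 is causal, the first variable to enter the
# path can flip between columns 0 and 1 across noise realizations.
print("first variable selected in each run:", first_entries)
```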
- ↑ See the discussion by Weisberg following the main article in Efron et al. (2004).
2011
- http://www.quora.com/What-is-Least-Angle-Regression-and-when-should-it-be-used
- QUOTE: Least Angle Regression (aka LARS) is a model selection method for linear regression (when you're worried about overfitting or want your model to be easily interpretable).
2004
- (Efron et al., 2004) ⇒ Bradley Efron, Trevor Hastie, Iain Johnstone, and Robert Tibshirani. (2004). “Least Angle Regression." In: The Annals of Statistics, 32(2).