sklearn.linear_model.LassoLarsIC
A sklearn.linear_model.LassoLarsIC is a LASSO-LARS Information Criteria System class within the sklearn.linear_model module.
- Context:
- Usage:
- 1) Import the LassoLarsIC model from scikit-learn:
from sklearn.linear_model import LassoLarsIC
- 2) Create design matrix X and response vector y
- 3) Create LassoLarsIC object:
model=LassoLarsIC([criterion='aic', fit_intercept=True, verbose=False, normalize=True, precompute='auto', max_iter=500, eps=2.2204460492503131e-16, copy_X=True, positive=False, ...])
- 4) Choose method(s):
fit(X, y[, copy_X]), fits the model using X, y as training data.
get_params([deep]), gets parameters for this estimator.
predict(X), predicts using the linear model.
score(X, y[, sample_weight]), returns the coefficient of determination R^2 of the prediction.
set_params(**params), sets the parameters of this estimator.
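The steps above can be sketched end-to-end; the data here is synthetic and chosen only for illustration:

```python
import numpy as np
from sklearn.linear_model import LassoLarsIC

# Step 2 (illustrative data): design matrix X and response vector y
rng = np.random.RandomState(0)
X = rng.randn(100, 5)
y = X[:, 0] * 3.0 + rng.randn(100) * 0.1

# Step 3: create the LassoLarsIC object (AIC is the default criterion)
model = LassoLarsIC(criterion='aic')

# Step 4: apply the methods
model.fit(X, y)              # fit(X, y)
preds = model.predict(X)     # predict(X)
r2 = model.score(X, y)       # coefficient of determination R^2
params = model.get_params()  # estimator parameters as a dict
```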
- Example(s):
- Counter-Example(s):
- See: Regression System, Regressor, Regularization Task, Cross-Validation Task, Akaike Information Criterion, Bayes Information Criterion, Bayesian Analysis.
References
2017A
- (Scikit-Learn) ⇒ http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LassoLarsIC.html
- QUOTE:
class sklearn.linear_model.LassoLarsIC(criterion='aic', fit_intercept=True, verbose=False, normalize=True, precompute='auto', max_iter=500, eps=2.2204460492503131e-16, copy_X=True, positive=False)
- Lasso model fit with Lars using BIC or AIC for model selection
- The optimization objective for Lasso is:
(1 / (2 * n_samples)) * ||y - Xw||^2_2 + alpha * ||w||_1
- AIC is the Akaike information criterion and BIC is the Bayes Information criterion. Such criteria are useful to select the value of the regularization parameter by making a trade-off between the goodness of fit and the complexity of the model. A good model should explain well the data while being simple.
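The criterion-based selection described above can be sketched as follows; the synthetic data is an assumption for illustration:

```python
import numpy as np
from sklearn.linear_model import LassoLarsIC

# Illustrative data: only two features truly matter
rng = np.random.RandomState(42)
X = rng.randn(200, 10)
y = X[:, 0] - 2.0 * X[:, 3] + rng.randn(200) * 0.5

# Fit once per criterion; each picks its own regularization strength alpha_
aic_model = LassoLarsIC(criterion='aic').fit(X, y)
bic_model = LassoLarsIC(criterion='bic').fit(X, y)

# BIC penalizes model complexity more strongly than AIC, so it tends to
# select a sparser model (a larger or equal alpha)
print(aic_model.alpha_, bic_model.alpha_)
```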
2017B
- (Scikit Learn) ⇒ "1.1.8. LARS Lasso" http://scikit-learn.org/stable/modules/linear_model.html#lars-lasso Retrieved: 2017-11-05
- QUOTE:
LassoLars is a lasso model implemented using the LARS algorithm, and unlike the implementation based on coordinate descent, this yields the exact solution, which is piecewise linear as a function of the norm of its coefficients.
- (...)
- The algorithm is similar to forward stepwise regression, but instead of including variables at each step, the estimated parameters are increased in a direction equiangular to each one’s correlations with the residual.
- Instead of giving a vector result, the LARS solution consists of a curve denoting the solution for each value of the L1 norm of the parameter vector. The full coefficients path is stored in the array coef_path_, which has size (n_features, max_features + 1). The first column is always zero.
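The coef_path_ array described above can be inspected directly; this sketch fits LassoLars on assumed synthetic data, where each column of the array is the coefficient vector at one breakpoint of the piecewise-linear LARS path:

```python
import numpy as np
from sklearn.linear_model import LassoLars

# Illustrative data: one strongly informative feature
rng = np.random.RandomState(0)
X = rng.randn(50, 4)
y = X[:, 1] * 2.0 + rng.randn(50) * 0.1

model = LassoLars(alpha=0.01).fit(X, y)

# Each column is one breakpoint of the piecewise-linear solution path
path = model.coef_path_
print(path.shape)

# The first column is the all-zero start of the path
print(path[:, 0])
```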