sklearn.linear_model.LassoLarsCV
A sklearn.linear_model.LassoLarsCV is a LASSO-LARS Cross-Validation System implemented as a class within the sklearn.linear_model module.
- Context:
- Usage:
- 1) Import the LassoLarsCV model from scikit-learn:
from sklearn.linear_model import LassoLarsCV
- 2) Create a design matrix X and a response vector y.
- 3) Create the LassoLarsCV object:
model=LassoLarsCV([fit_intercept=True, verbose=False, max_iter=500, normalize=True, precompute='auto', cv=None, ...])
- 4) Choose method(s) (see the sketch under Example(s) below):
:: fit(X, y), fits the model using X, y as training data.
:: get_params([deep]), gets the parameters of this estimator.
:: predict(X), predicts using the linear model.
:: score(X, y[, sample_weight]), returns the coefficient of determination R^2 of the prediction.
:: set_params(**params), sets the parameters of this estimator.
- Example(s):
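:: A minimal runnable sketch of steps 1)-4) above (the synthetic data from make_regression and all parameter values are illustrative assumptions, not taken from the cited documentation):
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoLarsCV

# Synthetic design matrix X and response vector y, for illustration only.
X, y = make_regression(n_samples=100, n_features=20, noise=5.0, random_state=0)

# Fit with 5-fold cross-validation; the regularization strength alpha
# is selected automatically along the LARS path.
model = LassoLarsCV(cv=5)
model.fit(X, y)

print(model.alpha_)          # alpha chosen by cross-validation
print(model.score(X, y))     # coefficient of determination R^2 on the training data
print(model.predict(X[:3]))  # predictions for the first three samples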
- Counter-Example(s):
- See: Regression System, Regressor, Cross-Validation Task, Ridge Regression Task, Bayesian Analysis.
References
2017A
- (Scikit Learn, 2017) ⇒ http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LassoLarsCV.html
- QUOTE:
class sklearn.linear_model.LassoLarsCV(fit_intercept=True, verbose=False, max_iter=500, normalize=True, precompute='auto', cv=None, max_n_alphas=1000, n_jobs=1, eps=2.2204460492503131e-16, copy_X=True, positive=False)
- Cross-validated Lasso, using the LARS algorithm
- The optimization objective for Lasso is:
- (...)
- The object solves the same problem as the LassoCV object. However, unlike LassoCV, it finds the relevant alpha values by itself. In general, because of this property, it will be more stable. However, it is more fragile to heavily multicollinear datasets.
- It is more efficient than LassoCV if only a small number of features are selected compared to the total number, for instance if there are very few samples compared to the number of features.
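:: A small sketch to make the contrast concrete (the data shape and all parameter values are illustrative assumptions): on a problem with few samples and many features, LassoLarsCV derives its candidate alphas from the LARS path itself, while LassoCV evaluates a fixed grid of alphas.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV, LassoLarsCV

# Few samples, many features: the regime the quote describes as favorable
# to LassoLarsCV (synthetic data, for illustration only).
X, y = make_regression(n_samples=40, n_features=500, n_informative=5, noise=1.0, random_state=0)

lars_cv = LassoLarsCV(cv=5).fit(X, y)  # candidate alphas come from the LARS path
lasso_cv = LassoCV(cv=5).fit(X, y)     # candidate alphas laid out on a fixed grid

print(lars_cv.alpha_, np.count_nonzero(lars_cv.coef_))   # selected alpha, number of nonzero coefficients
print(lasso_cv.alpha_, np.count_nonzero(lasso_cv.coef_))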
2017B
- (Scikit Learn, 2017) ⇒ http://scikit-learn.org/stable/modules/grid_search.html
- QUOTE: 3.2.4.1. Model specific cross-validation
Some models can fit data for a range of values of some parameter almost as efficiently as fitting the estimator for a single value of the parameter. This feature can be leveraged to perform a more efficient cross-validation used for model selection of this parameter.
The most common parameter amenable to this strategy is the parameter encoding the strength of the regularizer. In this case we say that we compute the regularization path of the estimator.
- Here is the list of such models:
:: linear_model.ElasticNetCV([l1_ratio, eps, ...]), Elastic Net model with iterative fitting along a regularization path.
:: linear_model.LarsCV([fit_intercept, ...]), Cross-validated Least Angle Regression model.
:: linear_model.LassoCV([eps, n_alphas, ...]), Lasso linear model with iterative fitting along a regularization path.
:: linear_model.LassoLarsCV([fit_intercept, ...]), Cross-validated Lasso, using the LARS algorithm.
:: linear_model.LogisticRegressionCV([Cs, ...]), Logistic Regression CV (aka logit, MaxEnt) classifier.
:: linear_model.MultiTaskElasticNetCV([...]), Multi-task L1/L2 ElasticNet with built-in cross-validation.
:: linear_model.MultiTaskLassoCV([eps, ...]), Multi-task L1/L2 Lasso with built-in cross-validation.
:: linear_model.OrthogonalMatchingPursuitCV([...]), Cross-validated Orthogonal Matching Pursuit model (OMP).
:: linear_model.RidgeCV([alphas, ...]), Ridge regression with built-in cross-validation.
:: linear_model.RidgeClassifierCV([alphas, ...]), Ridge classifier with built-in cross-validation.
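:: The regularization-path idea behind these estimators can be sketched with lasso_path, the path routine that LassoCV builds on (the synthetic data and the n_alphas value are illustrative assumptions):
from sklearn.datasets import make_regression
from sklearn.linear_model import lasso_path

X, y = make_regression(n_samples=50, n_features=10, noise=2.0, random_state=0)

# One call evaluates the model over a whole range of alpha values (the path)
# instead of refitting the estimator from scratch for each alpha.
alphas, coefs, _ = lasso_path(X, y, n_alphas=20)
print(alphas.shape, coefs.shape)  # (20,) alphas and a (10, 20) coefficient path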