sklearn.linear_model.LassoCV
A sklearn.linear_model.LassoCV is a LASSO Cross-Validation System within the sklearn.linear_model
module.
- Context:
- Usage:
- 1) Import LassoCV model from scikit-learn:
from sklearn.linear_model import LassoCV
- 2) Create a design matrix X and a response vector Y
- 3) Create LassoCV object:
model = LassoCV([eps=0.001, n_alphas=100, alphas=None, fit_intercept=True, normalize=False, precompute='auto', max_iter=1000, tol=0.0001, ...])
- 4) Choose method(s):
fit(X, y), fits linear model with coordinate descent;
get_params([deep]), gets parameters for this estimator;
path(X, y[, eps, n_alphas, alphas, ...]), computes Lasso path with coordinate descent;
predict(X), predicts using the linear model;
score(X, y[, sample_weight]), returns the coefficient of determination R^2 of the prediction;
set_params(**params), sets the parameters of this estimator.
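The usage steps above can be sketched end-to-end as follows; the dataset is synthetic (generated with sklearn.datasets.make_regression, a hypothetical example, not part of the LassoCV API itself):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV

# Step 2: create a design matrix X and response vector y
# (synthetic data: 200 samples, 20 features, only 5 informative).
X, y = make_regression(n_samples=200, n_features=20, n_informative=5,
                       noise=1.0, random_state=0)

# Step 3: create the LassoCV object; it evaluates n_alphas=100 values
# along the regularization path using 5-fold cross-validation.
model = LassoCV(n_alphas=100, cv=5, random_state=0)

# Step 4: fit, then inspect the CV-selected alpha and the fit quality.
model.fit(X, y)
print("Selected alpha:", model.alpha_)
print("Training R^2:", model.score(X, y))
print("Nonzero coefficients:", np.sum(model.coef_ != 0))
```

Because the penalty drives uninformative coefficients to exactly zero, the fitted model typically retains only a subset of the 20 features.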
- Example(s):
- Counter-Example(s):
- See: Regression System, Regressor, Cross-Validation Task, Ridge Regression Task, Bayesian Analysis.
References
2017A
- http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LassoCV.html
- QUOTE:
class sklearn.linear_model.LassoCV(eps=0.001, n_alphas=100, alphas=None, fit_intercept=True, normalize=False, precompute='auto', max_iter=1000, tol=0.0001, copy_X=True, cv=None, verbose=False, n_jobs=1, positive=False, random_state=None, selection='cyclic')
- QUOTE:
- Lasso linear model with iterative fitting along a regularization path
- The best model is selected by cross-validation.
- The optimization objective for Lasso is:
(1 / (2 * n_samples)) * ||y - Xw||^2_2 + alpha * ||w||_1
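The quoted objective can be checked numerically: fit a plain Lasso at a fixed alpha and evaluate the expression directly on the fitted coefficients (a minimal sketch on synthetic data; the intercept is excluded from the L1 penalty, matching scikit-learn's behavior):

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic data with a known sparse coefficient vector.
rng = np.random.RandomState(0)
X = rng.randn(50, 3)
y = X @ np.array([1.5, 0.0, -2.0]) + 0.1 * rng.randn(50)

alpha = 0.1
lasso = Lasso(alpha=alpha).fit(X, y)
w, b = lasso.coef_, lasso.intercept_

# Evaluate (1 / (2 * n_samples)) * ||y - Xw||^2_2 + alpha * ||w||_1
n_samples = X.shape[0]
residual = y - (X @ w + b)
objective = (1 / (2 * n_samples)) * np.sum(residual ** 2) + alpha * np.sum(np.abs(w))
print("Objective at the fitted coefficients:", objective)
```

Since coordinate descent minimizes exactly this quantity, perturbing the fitted coefficients should only increase the objective value.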
2017B
- (Scikit Learn, 2017) ⇒ http://scikit-learn.org/stable/modules/grid_search.html
- QUOTE: 3.2.4.1. Model specific cross-validation
Some models can fit data for a range of values of some parameter almost as efficiently as fitting the estimator for a single value of the parameter. This feature can be leveraged to perform a more efficient cross-validation used for model selection of this parameter.
The most common parameter amenable to this strategy is the parameter encoding the strength of the regularizer. In this case we say that we compute the regularization path of the estimator.
- Here is the list of such models:
:: linear_model.ElasticNetCV([l1_ratio, eps, ...]), Elastic Net model with iterative fitting along a regularization path;
linear_model.LarsCV([fit_intercept, ...]), Cross-validated Least Angle Regression model;
linear_model.LassoCV([eps, n_alphas, ...]), Lasso linear model with iterative fitting along a regularization path;
linear_model.LassoLarsCV([fit_intercept, ...]), Cross-validated Lasso, using the LARS algorithm;
linear_model.LogisticRegressionCV([Cs, ...]), Logistic Regression CV (aka logit, MaxEnt) classifier;
linear_model.MultiTaskElasticNetCV([...]), Multi-task L1/L2 ElasticNet with built-in cross-validation;
linear_model.MultiTaskLassoCV([eps, ...]), Multi-task L1/L2 Lasso with built-in cross-validation;
linear_model.OrthogonalMatchingPursuitCV([...]), Cross-validated Orthogonal Matching Pursuit model (OMP);
linear_model.RidgeCV([alphas, ...]), Ridge regression with built-in cross-validation;
linear_model.RidgeClassifierCV([alphas, ...]), Ridge classifier with built-in cross-validation.
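The estimators in this family all follow the same pattern as LassoCV: fit once, and the CV-selected regularization strength is exposed as an attribute. A brief sketch using two of the listed models on synthetic data (the alpha grid and l1_ratio below are illustrative choices, not defaults):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNetCV, RidgeCV

# Synthetic regression problem for illustration.
X, y = make_regression(n_samples=150, n_features=10, noise=1.0,
                       random_state=0)

# RidgeCV selects among an explicit grid of alphas via cross-validation.
ridge = RidgeCV(alphas=[0.1, 1.0, 10.0]).fit(X, y)

# ElasticNetCV builds its own regularization path, like LassoCV.
enet = ElasticNetCV(l1_ratio=0.5, n_alphas=50, cv=5).fit(X, y)

print("RidgeCV chose alpha:", ridge.alpha_)
print("ElasticNetCV chose alpha:", enet.alpha_)
```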