sklearn.linear_model.LassoLars


A sklearn.linear_model.LassoLars is a LASSO-LARS System within the sklearn.linear_model module.

  • Context:
    • Usage:
1) Import the LassoLars model from scikit-learn: from sklearn.linear_model import LassoLars
2) Create a design matrix X and a response vector y
3) Create a LassoLars object: model=LassoLars([alpha=1.0, fit_intercept=True, verbose=False, normalize=True, precompute='auto',...])
4) Choose method(s):
  • fit(X, y[, Xy]), fits the model using X, y as training data.
  • get_params([deep]), gets parameters for this estimator.
  • predict(X), predicts using the linear model.
  • score(X, y[, sample_weight]), returns the coefficient of determination R^2 of the prediction.
  • set_params(**params), sets the parameters of this estimator.
Input:
from sklearn import linear_model
reg = linear_model.LassoLars(alpha=.1)
reg.fit([[0, 0], [1, 1]], [0, 1])
reg.coef_

Output:
LassoLars(alpha=0.1, copy_X=True, eps=..., fit_intercept=True, fit_path=True, max_iter=500, normalize=True, positive=False, precompute='auto', verbose=False)
array([ 0.717157..., 0. ])
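A slightly fuller sketch of steps 1)–4) above, using illustrative toy data (the exact parameter defaults depend on the scikit-learn version):

from sklearn.linear_model import LassoLars

# 1)-2) design matrix X and response vector y (toy data for illustration)
X = [[0, 0], [1, 1], [2, 2]]
y = [0, 1, 2]

# 3) create the LassoLars object
model = LassoLars(alpha=0.1)

# 4) fit, then use the other estimator methods
model.fit(X, y)
print(model.coef_)                 # fitted coefficients
print(model.predict([[3, 3]]))     # prediction for a new sample
print(model.score(X, y))           # R^2 on the training data
print(model.get_params())          # estimator parameters as a dict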


References

2017A

Lasso model fit with Least Angle Regression a.k.a. Lars
It is a Linear Model trained with an L1 prior as regularizer.
The optimization objective for Lasso is:
(1 / (2 * n_samples)) * ||y - Xw||^2_2 + alpha * ||w||_1
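As an illustration, this objective can be evaluated directly with NumPy for a given coefficient vector (a minimal sketch; X, y, w, and alpha are illustrative values, not a fitted solution):

import numpy as np

X = np.array([[0.0, 0.0], [1.0, 1.0]])   # design matrix
y = np.array([0.0, 1.0])                 # response vector
w = np.array([0.7, 0.0])                 # candidate coefficient vector
alpha = 0.1

n_samples = X.shape[0]
data_term = np.sum((y - X @ w) ** 2) / (2 * n_samples)   # (1 / (2 * n_samples)) * ||y - Xw||^2_2
l1_term = alpha * np.sum(np.abs(w))                      # alpha * ||w||_1
objective = data_term + l1_term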

2017B

(...)
The algorithm is similar to forward stepwise regression, but instead of including variables at each step, the estimated parameters are increased in a direction equiangular to each one’s correlations with the residual.
Instead of giving a vector result, the LARS solution consists of a curve denoting the solution for each value of the L1 norm of the parameter vector. The full coefficients path is stored in the array coef_path_, which has size (n_features, max_features+1). The first column is always zero.
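A minimal sketch of inspecting this stored path (assuming the default fit_path=True; the toy data matches the example above):

from sklearn.linear_model import LassoLars

reg = LassoLars(alpha=0.1, fit_path=True)
reg.fit([[0, 0], [1, 1]], [0, 1])

# coef_path_ holds one column of coefficients per step along the path;
# the first column is always zero, as noted above.
print(reg.coef_path_.shape)    # (n_features, number of path steps)
print(reg.coef_path_[:, 0])    # all zeros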