Linear Least-Squares Regression System
A Linear Least-Squares Regression System is a linear regression system (and a least-squares regression system) that implements a linear least-squares algorithm to solve a linear least-squares regression task.
- AKA: Linear Least-Squares Optimizer/Estimator.
- Context:
- It can range from being a Simple Linear Least-Squares Regression System to being a Multivariate Least-Squares Regression System, depending on the number of numeric predictors.
- It can range from being an Ordinary Linear Least-Squares Regression System to being a Regularized Linear Least-Squares Regression System.
- It can solve a Linear Least-Squares Regression with Bounds on the Variables Task.
- It can solve a Linear Least-Squares Regression with Non-Negativity Constraints Task.
- Example(s):
- sklearn.linear_model [1], a GLM system.
- statsmodels.api [2], which implements:
  - statsmodels.api.OLS(), an ordinary least squares for i.i.d. errors. Examples: [3]
  - statsmodels.api.GLS(), a generalized least squares for arbitrary covariance. Examples: [4]
  - statsmodels.api.WLS(), a weighted least squares for heteroskedastic errors. Examples: [5]
- numpy.linalg.lstsq [6], a NumPy routine that solves the equation a x = b by computing a vector x that minimizes the Euclidean 2-norm || b - a x ||.
- scipy.optimize.lsq_linear, a SciPy routine that solves a linear least-squares problem with bounds on the variables.
- scipy.optimize.nnls, a SciPy routine that solves a linear least-squares problem with a non-negativity constraint.
- scipy.sparse.linalg.lsmr, a SciPy routine that provides an iterative solver for least-squares problems. (All of these systems minimize the same residual 2-norm; see the sketch below.)
- …
- Counter-Examples:
- See: OLS System.
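All of the example systems above minimize the residual 2-norm of a linear system. A minimal sketch of that shared objective, on assumed toy data and using a plain normal-equations solve (production systems factorize the design matrix via QR or SVD instead, for numerical stability):

import numpy as np

# Assumed toy data: 100 samples, 3 numeric predictors, known weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

# The shared objective: find w minimizing || X w - y ||_2.
# Normal equations (X^T X) w = X^T y; adequate for this well-conditioned
# toy problem, though real systems avoid forming X^T X explicitly.
w = np.linalg.solve(X.T @ X, X.T @ y)
print(w)  # close to [1.5, -2.0, 0.5]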
References
2017a
- (Scikit Learn, 2017) ⇒ http://scikit-learn.org/stable/modules/classes.html#module-sklearn.linear_model Retrieved: 2017-07-30.
- QUOTE: The sklearn.linear_model module implements generalized linear models. It includes Ridge regression, Bayesian Regression, Lasso and Elastic Net estimators computed with Least Angle Regression and coordinate descent. It also implements Stochastic Gradient Descent related algorithms.
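As an illustration of the estimators named in this quote, a minimal sketch (the toy arrays are assumed, not from the source):

from sklearn.linear_model import LinearRegression, Ridge, Lasso

# Assumed toy data: two predictors, three samples.
X = [[0, 0], [1, 1], [2, 2]]
y = [0, 1, 2]

ols = LinearRegression().fit(X, y)   # ordinary least squares
ridge = Ridge(alpha=0.5).fit(X, y)   # L2-regularized least squares
lasso = Lasso(alpha=0.1).fit(X, y)   # L1-regularized least squares
print(ols.coef_, ridge.coef_, lasso.coef_)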
2017b
- (Scikit Learn, 2017) ⇒ http://scikit-learn.org/stable/modules/linear_model.html Retrieved: 2017-07-30.
- QUOTE: The following are a set of methods intended for regression in which the target value is expected to be a linear combination of the input variables. In mathematical notation, if [math]\displaystyle{ \hat{y} }[/math] is the predicted value.
[math]\displaystyle{ \hat{y}(w, x) = w_0 + w_1 x_1 + \cdots + w_p x_p }[/math]
Across the module, we designate the vector [math]\displaystyle{ w = (w_1,\cdots, w_p) }[/math] as coef_ and [math]\displaystyle{ w_0 }[/math] as intercept_.
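A minimal sketch of how the fitted attributes map onto the formula above (toy data assumed; here y = 1 + 2 x_1 + 3 x_2 exactly):

from sklearn.linear_model import LinearRegression

# Assumed toy data generated from y = 1 + 2*x1 + 3*x2.
X = [[0, 0], [1, 0], [0, 1], [1, 1]]
y = [1, 3, 4, 6]

reg = LinearRegression().fit(X, y)
print(reg.intercept_)  # w_0, approximately 1.0
print(reg.coef_)       # (w_1, w_2), approximately (2.0, 3.0)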
2017c
- (Scipy, 2017) ⇒ The Scipy community (2008-2009). “numpy.linalg.lstsq” https://docs.scipy.org/doc/numpy/reference/generated/numpy.linalg.lstsq.html Last updated on Jun 10, 2017.
numpy.linalg.lstsq(a, b, rcond=-1)
Return the least-squares solution to a linear matrix equation.
Solves the equation a x = b by computing a vector x that minimizes the Euclidean 2-norm || b - a x ||^2. The equation may be under-, well-, or over-determined (i.e., the number of linearly independent rows of a can be less than, equal to, or greater than its number of linearly independent columns). If a is square and of full rank, then x (but for round-off error) is the “exact” solution of the equation (...)
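A usage sketch of the quoted routine on an assumed over-determined system, fitting a line y = m x + c to four points (rcond=None replaces the older default shown in the signature above):

import numpy as np

# Assumed data: four points, two unknowns (slope m and intercept c).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([-1.0, 0.2, 0.9, 2.1])
a = np.vstack([x, np.ones_like(x)]).T   # design matrix [x, 1]

# Minimizes || y - a @ [m, c] ||; also returns residuals, rank, singular values.
(m, c), residuals, rank, sv = np.linalg.lstsq(a, y, rcond=None)
print(m, c)  # approximately 1.0 and -0.95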
2017d
- (Scipy, 2017) ⇒ The Scipy community (2008-2009). “scipy.optimize.lsq_linear” https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.lsq_linear.html Last updated on Jun 10, 2017.
scipy.optimize.lsq_linear(A, b, bounds=(-inf, inf), method='trf', tol=1e-10, lsq_solver=None, lsmr_tol=None, max_iter=None, verbose=0)
Solve a linear least-squares problem with bounds on the variables.
Given an m-by-n design matrix A and a target vector b with m elements, lsq_linear solves the following optimization problem:
minimize 0.5 * ||A x - b||**2
subject to lb <= x <= ub
This optimization problem is convex, hence a found minimum (if iterations have converged) is guaranteed to be global (...)
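A minimal usage sketch with assumed data, constraining every coefficient to the unit box:

import numpy as np
from scipy.optimize import lsq_linear

# Assumed toy problem: 3 equations, 2 unknowns.
A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

# Minimize 0.5 * ||A x - b||**2 subject to 0 <= x <= 1.
res = lsq_linear(A, b, bounds=(0.0, 1.0))
print(res.x, res.cost)  # constrained solution and final objective value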
2017e
- (Scipy, 2017) ⇒ The Scipy community (2008-2009). “scipy.optimize.nnls” https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.nnls.html Last updated on Jun 10, 2017.
scipy.optimize.nnls(A, b)
Solve argmin_x || Ax - b ||_2 for x>=0. This is a wrapper for a FORTRAN non-negative least squares solver (...)
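A minimal usage sketch with assumed data in which the unconstrained least-squares solution would go negative:

import numpy as np
from scipy.optimize import nnls

# Assumed toy problem: the unconstrained solution is [1.5, -1.0].
A = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
b = np.array([2.0, 1.0, -1.0])

x, rnorm = nnls(A, b)  # x >= 0 componentwise; rnorm = ||A x - b||_2
print(x, rnorm)        # the second coefficient is clipped to 0.0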
2017f
- (Scipy, 2017) ⇒ The Scipy community (2008-2009). “scipy.sparse.linalg.lsmr” https://docs.scipy.org/doc/scipy/reference/generated/scipy.sparse.linalg.lsmr.html Last updated on Jun 10, 2017.
scipy.sparse.linalg.lsmr(A, b, damp=0.0, atol=1e-06, btol=1e-06, conlim=100000000.0, maxiter=None, show=False)
Iterative solver for least-squares problems. lsmr solves the system of linear equations A x = b. If the system is inconsistent, it solves the least-squares problem min || b - A x ||_2 (...)
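A minimal usage sketch with an assumed small sparse system (lsmr returns a tuple whose first element is the solution):

import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import lsmr

# Assumed toy system with an exact solution x = [1, 1].
A = csr_matrix(np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]]))
b = np.array([1.0, 2.0, 1.0])

x = lsmr(A, b)[0]  # first element of the returned tuple is the solution
print(x)           # approximately [1.0, 1.0]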
2014
- (Perktold et al., 2014) ⇒ Josef Perktold, Skipper Seabold and Jonathan Taylor (statsmodels-developers, 2009-2017). “Linear Regression” http://statsmodels.sourceforge.net/stable/regression.html
- QUOTE: Linear models with independently and identically distributed errors, and for errors with heteroscedasticity or autocorrelation. This module allows estimation by ordinary least squares (OLS), weighted least squares (WLS), generalized least squares (GLS), and feasible generalized least squares with autocorrelated AR(p) errors.
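A minimal OLS sketch with assumed data (statsmodels requires the constant column to be added explicitly):

import numpy as np
import statsmodels.api as sm

# Assumed toy data around the line y = 1 + 2 x, with small noise.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1.0 + 2.0 * x + np.array([0.1, -0.1, 0.05, 0.0, -0.05])
X = sm.add_constant(x)          # design matrix [1, x]

results = sm.OLS(y, X).fit()    # ordinary least squares for i.i.d. errors
print(results.params)           # [intercept, slope], near [1.0, 2.0]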
2012
- http://stat.ethz.ch/R-manual/R-patched/library/stats/html/lm.html
- QUOTE: lm is used to fit linear models.