Two-Stage Least Squares
A Two-Stage Least Squares (2SLS) Algorithm is a multivariate regression algorithm that can be used to estimate simultaneous equations models.
- AKA: 2SLS, TSLS.
- Example(s):
- Counter-Example(s):
- See: Multivariate Regression Algorithm, Estimator, Instrumental Variable.
References
2016
- (Wikipedia, 2016) ⇒ https://www.wikiwand.com/en/Instrumental_variable#/Interpretation_as_two-stage_least_squares
- One computational method which can be used to calculate IV estimates is two-stage least-squares (2SLS or TSLS). In the first stage, each explanatory variable that is an endogenous covariate in the equation of interest is regressed on all of the exogenous variables in the model, including both exogenous covariates in the equation of interest and the excluded instruments. The predicted values from these regressions are obtained.
- Stage 1: Regress each column of X on Z, ([math]\displaystyle{ X = Z \delta + \text{errors} }[/math])
- [math]\displaystyle{ \widehat{\delta}=(Z^\mathrm{T} Z)^{-1}Z^\mathrm{T}X, \, }[/math]
- and save the predicted values:
- [math]\displaystyle{ \widehat{X}= Z\widehat{\delta} = Z(Z^\mathrm{T} Z)^{-1}Z^\mathrm{T}X = P_Z X.\, }[/math]
- In the second stage, the regression of interest is estimated as usual, except that in this stage each endogenous covariate is replaced with the predicted values from the first stage.
- Stage 2: Regress Y on the predicted values from the first stage:
- [math]\displaystyle{ Y = \widehat X \beta + \mathrm{noise}.\, }[/math]
- Which gives:
- [math]\displaystyle{ \beta_{2SLS} = \left(X^\mathrm{T}P_Z X\right)^{-1} X^\mathrm{T}P_ZY }[/math]
- Note that the usual OLS estimator is: [math]\displaystyle{ (\widehat X^\mathrm{T}\widehat X)^{-1}\widehat X^\mathrm{T}Y }[/math].
- Replacing [math]\displaystyle{ \widehat X = P_Z X }[/math] and noting that [math]\displaystyle{ P_Z }[/math] is a symmetric and idempotent matrix, so that [math]\displaystyle{ P_Z^\mathrm{T}P_Z=P_Z P_Z = P_Z }[/math]:
- [math]\displaystyle{ \beta_{2SLS} = (\widehat X^\mathrm{T}\widehat X)^{-1}\widehat X^\mathrm{T} Y = \left(X^\mathrm{T}P_Z^\mathrm{T}P_Z X\right)^{-1} X^\mathrm{T}P_Z^\mathrm{T}Y=\left(X^\mathrm{T}P_Z X\right)^{-1} X^\mathrm{T}P_ZY }[/math]
- The resulting estimator of [math]\displaystyle{ \beta }[/math] is numerically identical to the expression displayed above. A small correction must be made to the sum-of-squared residuals in the second-stage fitted model in order that the covariance matrix of [math]\displaystyle{ \beta }[/math] is calculated correctly.
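- The two-stage procedure and the closed-form expression quoted above can be illustrated numerically. The following is a minimal NumPy sketch on simulated data (the sample size, variable names, and coefficient values are assumptions made here for illustration, not part of the source); it checks that regressing Y on the first-stage fitted values reproduces [math]\displaystyle{ \left(X^\mathrm{T}P_Z X\right)^{-1} X^\mathrm{T}P_ZY }[/math].
```python
# Minimal sketch of two-stage least squares (2SLS), assuming simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Z: exogenous instruments; u: structural error; X: endogenous regressor
# (correlated with u); Y: outcome with true coefficient beta = 2.0.
Z = rng.normal(size=(n, 2))
u = rng.normal(size=n)
X = (Z @ np.array([1.0, -0.5]) + 0.8 * u + rng.normal(size=n)).reshape(-1, 1)
Y = 2.0 * X[:, 0] + u

# Stage 1: regress X on Z and save the fitted values X_hat = P_Z X.
delta_hat = np.linalg.solve(Z.T @ Z, Z.T @ X)
X_hat = Z @ delta_hat

# Stage 2: regress Y on the fitted values X_hat.
beta_two_stage = np.linalg.solve(X_hat.T @ X_hat, X_hat.T @ Y)

# Closed form: beta_2SLS = (X' P_Z X)^{-1} X' P_Z Y with P_Z = Z (Z'Z)^{-1} Z'.
P_Z = Z @ np.linalg.solve(Z.T @ Z, Z.T)
beta_closed_form = np.linalg.solve(X.T @ P_Z @ X, X.T @ P_Z @ Y)

print(beta_two_stage, beta_closed_form)  # numerically identical, near 2.0
```
- As the quoted passage notes, the point estimates from the manual second-stage regression are correct, but the second-stage residuals must be corrected before the covariance matrix of [math]\displaystyle{ \beta }[/math] is computed, which is why dedicated IV routines are preferred in practice.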
- (Wikipedia, 2016) ⇒ https://www.wikiwand.com/en/Simultaneous_equations_model#/Two-stages_least_squares_(2SLS)
- The simplest and most common estimation method for the simultaneous equations model is the so-called two-stage least squares method, developed independently by Theil (1953) and Basmann (1957). It is an equation-by-equation technique, where the endogenous regressors on the right-hand side of each equation are instrumented with the regressors X from all other equations. The method is called “two-stage” because it conducts estimation in two steps:
- Step 1: Regress [math]\displaystyle{ Y_{-i} }[/math] on X and obtain the predicted values [math]\displaystyle{ \scriptstyle\hat{Y}_{\!-i} }[/math];
- Step 2: Estimate [math]\displaystyle{ \gamma_i, \beta_i }[/math] by the ordinary least squares regression of [math]\displaystyle{ y_i }[/math] on [math]\displaystyle{ \scriptstyle\hat{Y}_{\!-i} }[/math] and [math]\displaystyle{ X_i }[/math].
- If the i-th equation in the model is written as
- [math]\displaystyle{
y_i = \begin{pmatrix}Y_{-i} & X_i\end{pmatrix}\begin{pmatrix}\gamma_i\\\beta_i\end{pmatrix} + u_i
\equiv Z_i \delta_i + u_i,
}[/math]
- where [math]\displaystyle{ Z_i }[/math] is a [math]\displaystyle{ T \times (n_i + k_i) }[/math] matrix of both endogenous and exogenous regressors in the i-th equation, and [math]\displaystyle{ \delta_i }[/math] is an [math]\displaystyle{ (n_i + k_i) }[/math]-dimensional vector of regression coefficients, then the 2SLS estimator of [math]\displaystyle{ \delta_i }[/math] will be given by
- [math]\displaystyle{
\hat\delta_i = \big(\hat{Z}'_i\hat{Z}_i\big)^{-1}\hat{Z}'_i y_i
= \big( Z'_iPZ_i \big)^{-1} Z'_iPy_i,
}[/math]
- where [math]\displaystyle{ P = X(X'X)^{-1}X' }[/math] is the projection matrix onto the linear space spanned by the exogenous regressors X.
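- As a companion to the formula above, here is a small sketch of the equation-by-equation estimator [math]\displaystyle{ \hat\delta_i = \big( Z'_iPZ_i \big)^{-1} Z'_iPy_i }[/math]; the function name and array shapes are assumptions made for illustration, not part of the source.
```python
# Sketch of equation-by-equation 2SLS for one equation of a simultaneous
# system; tsls_equation is a hypothetical helper, not a library function.
import numpy as np

def tsls_equation(y_i: np.ndarray, Z_i: np.ndarray, X: np.ndarray) -> np.ndarray:
    """2SLS estimate of delta_i for the i-th equation.

    y_i : (T,)            left-hand-side endogenous variable of equation i
    Z_i : (T, n_i + k_i)  included endogenous and exogenous regressors
    X   : (T, k)          all exogenous regressors in the system (instruments)
    """
    # Projection matrix onto the column space of X: P = X (X'X)^{-1} X'.
    P = X @ np.linalg.solve(X.T @ X, X.T)
    # delta_i_hat = (Z_i' P Z_i)^{-1} Z_i' P y_i
    return np.linalg.solve(Z_i.T @ P @ Z_i, Z_i.T @ P @ y_i)
```
- Materializing the T×T matrix P is fine for small samples; for larger T one would instead obtain the fitted values of [math]\displaystyle{ Z_i }[/math] on X from a least-squares solve, which yields the same estimator without forming P explicitly.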
1995
- (Angrist & Imbens, 1995) ⇒ Angrist, J. D., & Imbens, G. W. (1995). “Two-stage least squares estimation of average causal effects in models with variable treatment intensity". Journal of the American Statistical Association, 90(430), 431-442. [DOI:10.1080/01621459.1995.10476535]
- Two-stage least squares (TSLS) is widely used in econometrics to estimate parameters in systems of linear simultaneous equations and to solve problems of omitted-variables bias in single-equation estimation. We show here that TSLS can also be used to estimate the average causal effect of variable treatments such as drug dosage, hours of exam preparation, cigarette smoking, and years of schooling. The average causal effect in which we are interested is a conditional expectation of the difference between the outcomes of the treated and what these outcomes would have been in the absence of treatment. Given mild regularity assumptions, the probability limit of TSLS is a weighted average of per-unit average causal effects along the length of an appropriately defined causal response function. The weighting function is illustrated in an empirical example based on the relationship between schooling and earnings.
1974
- (Brundy & Jorgenson, 1974) ⇒ Brundy, J. M., & Jorgenson, D. W. (1974). “The relative efficiency of instrumental variables estimators of systems of simultaneous equations". In Annals of Economic and Social Measurement, Volume 3, number 4 (pp. 679-700). NBER.
- The method of instrumental variables for estimation of a single equation in a system of simultaneous equations is the following: We suppose that [math]\displaystyle{ r_j }[/math] jointly dependent and [math]\displaystyle{ s_j }[/math] predetermined variables are included in the j-th equation and that a subset of [math]\displaystyle{ t_j = r_j + s_j - 1 }[/math] instrumental variables [math]\displaystyle{ W_j }[/math] is selected from the set of [math]\displaystyle{ t }[/math] instrumental variables [math]\displaystyle{ W }[/math]. The instrumental variables estimator [math]\displaystyle{ d_j }[/math] of [math]\displaystyle{ \delta_j }[/math] is obtained by solving the equation:
[math]\displaystyle{ W'_jy_j = W'_jZ_jd_j }[/math]
- obtaining [math]\displaystyle{ d_j=(W'_jZ_j)^{-1}W'_jy_j }[/math]. Examples of instrumental variables estimators are:
- 1. The indirect least squares estimator:
- [math]\displaystyle{ d_j = (X'Z_j)^{-1}X'y_j \quad \textrm{where}\quad t = p = r_j + s_j - 1 }[/math].
- 2. The two-stage least squares estimator:
- [math]\displaystyle{ d_j = \left(Z'_jX(X'X)^{-1} X'Z_j\right)^{-1}Z'_jX(X'X)^{-1} X'y_j }[/math]
- where [math]\displaystyle{ W_j = X(X'X)^{-1} X'Z_j }[/math], the fitted values from a regression of the right-hand-side variables in the equation [math]\displaystyle{ Z_j }[/math] on the matrix of predetermined variables X.
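- The equivalence implied above between the instrumental-variables form with [math]\displaystyle{ W_j = X(X'X)^{-1} X'Z_j }[/math] and the two-stage least squares expression can be checked with a short simulation; the data-generating values below are assumptions for illustration only.
```python
# Sketch checking that d_j = (W_j' Z_j)^{-1} W_j' y_j with W_j = X (X'X)^{-1} X' Z_j
# coincides with the 2SLS expression; all inputs are simulated.
import numpy as np

rng = np.random.default_rng(1)
T, k, m = 500, 3, 2                       # sample size, predetermined vars, regressors (assumed)
X = rng.normal(size=(T, k))               # predetermined variables
Z_j = X @ rng.normal(size=(k, m)) + rng.normal(size=(T, m))   # right-hand-side variables
y_j = Z_j @ np.array([1.0, -1.0]) + rng.normal(size=T)

W_j = X @ np.linalg.solve(X.T @ X, X.T @ Z_j)       # fitted values of Z_j on X
d_iv = np.linalg.solve(W_j.T @ Z_j, W_j.T @ y_j)    # (W_j' Z_j)^{-1} W_j' y_j

P = X @ np.linalg.solve(X.T @ X, X.T)               # projection onto the span of X
d_2sls = np.linalg.solve(Z_j.T @ P @ Z_j, Z_j.T @ P @ y_j)

print(np.allclose(d_iv, d_2sls))                    # True: the two forms coincide
```
- The agreement follows because [math]\displaystyle{ W'_jZ_j = Z'_jX(X'X)^{-1}X'Z_j }[/math] and [math]\displaystyle{ W'_jy_j = Z'_jX(X'X)^{-1}X'y_j }[/math], so the instrumental-variables equation reproduces the 2SLS estimator exactly.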
1971
- (Kelejian, 1971) ⇒ Kelejian, H. H. (1971). “Two-stage least squares and econometric systems linear in parameters but nonlinear in the endogenous variables". Journal of the American Statistical Association, 66(334), 373-374. [DOI:10.1080/01621459.1971.1048227]
- It is demonstrated that a variant of the two-stage least squares technique can be used to estimate the parameters of a nonlinear model. To do this, the reduced form equations of such models are derived and discussed; then certain problems particular to the estimation of nonlinear models are considered.