Jackknife Regression Algorithm
A Jackknife Regression Algorithm is a regression algorithm that applies Jackknife resampling, typically to estimate the bias and variance of its regression estimators.
- …
- Counter-Example(s):
- See: Jackknifing Algorithm, Resampling (Statistics), Bias of an Estimator, Bootstrap (Statistics).
References
2016
- (Wikipedia, 2016) ⇒ http://wikipedia.org/wiki/Jackknife_resampling Retrieved:2016-3-2.
- In statistics, the jackknife is a resampling technique especially useful for variance and bias estimation. The jackknife predates other common resampling methods such as the bootstrap. The jackknife estimator of a parameter is found by systematically leaving out each observation from a dataset, calculating the estimate on the remaining data, and then averaging these calculations. Given a sample of size [math]\displaystyle{ N }[/math], the jackknife estimate is found by aggregating the estimates over each of the [math]\displaystyle{ N }[/math] subsamples of size [math]\displaystyle{ N-1 }[/math].
The jackknife technique was developed by Maurice Quenouille (1949, 1956). John Tukey (1958) expanded on the technique and proposed the name "jackknife" since, like a Boy Scout's jackknife, it is a "rough and ready" tool that can solve a variety of problems even though specific problems may be more efficiently solved with a purpose-designed tool. The jackknife is a linear approximation of the bootstrap.
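The leave-one-out procedure described above is straightforward to implement. The following is a minimal sketch in Python, assuming NumPy is available; the choice of statistic (the sample mean), the function names, and the synthetic data are illustrative assumptions, not taken from the sources quoted here.

```python
import numpy as np

def jackknife(data, statistic):
    """Delete-one jackknife estimate, bias, and standard error of `statistic`."""
    n = len(data)
    theta_hat = statistic(data)  # estimate on the full sample of size N
    # Leave-one-out replicates: recompute the statistic N times,
    # each time with a different single observation removed.
    replicates = np.array([statistic(np.delete(data, i)) for i in range(n)])
    theta_bar = replicates.mean()
    bias = (n - 1) * (theta_bar - theta_hat)  # jackknife bias estimate
    se = np.sqrt((n - 1) / n * np.sum((replicates - theta_bar) ** 2))
    return theta_hat - bias, bias, se  # bias-corrected estimate first

# Illustrative usage on synthetic data.
rng = np.random.default_rng(0)
sample = rng.normal(loc=5.0, scale=2.0, size=50)
estimate, bias, se = jackknife(sample, np.mean)
print(f"bias-corrected estimate: {estimate:.4f}  bias: {bias:.4f}  SE: {se:.4f}")
```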
1986
- (Wu, 1986) ⇒ Chien-Fu Jeff Wu. (1986). “Jackknife, Bootstrap and Other Resampling Methods in Regression Analysis.” In: The Annals of Statistics, 14(4).
- ABSTRACT: Motivated by a representation for the least squares estimator, we propose a class of weighted jackknife variance estimators for the least squares estimator by deleting any fixed number of observations at a time. They are unbiased for homoscedastic errors and a special case, the delete-one jackknife, is almost unbiased for heteroscedastic errors. The method is extended to cover nonlinear parameters, regression M-estimators, nonlinear regression and generalized linear models.
Interval estimators can be constructed from the jackknife histogram. Three bootstrap methods are considered. Two are shown to give biased variance estimators and one does not have the bias-robustness property enjoyed by the weighted delete-one jackknife. A general method for resampling residuals is proposed. It gives variance estimators that are bias-robust. Several bias-reducing estimators are proposed. Some simulation results are reported.
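As one concrete reading of the delete-one case discussed in the abstract, the sketch below applies the plain (unweighted) delete-one jackknife to ordinary least squares, refitting the model with each observation removed and forming a covariance estimate for the coefficients. This is an illustrative assumption-based example, not Wu's weighted jackknife estimator; all names and the heteroscedastic test data are invented for the example.

```python
import numpy as np

def jackknife_ols_variance(X, y):
    """Plain delete-one jackknife covariance estimate for OLS coefficients."""
    n, p = X.shape
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # full-sample fit
    # Refit n times, each time with one observation deleted.
    betas = np.empty((n, p))
    for i in range(n):
        keep = np.arange(n) != i
        betas[i], *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    beta_bar = betas.mean(axis=0)
    # Jackknife covariance of the coefficient vector:
    # (n-1)/n times the sum of outer products of the centered replicates.
    cov = (n - 1) / n * (betas - beta_bar).T @ (betas - beta_bar)
    return beta_hat, cov

# Illustrative usage with heteroscedastic errors (noise scale grows with |x|),
# the setting in which the abstract notes delete-one is almost unbiased.
rng = np.random.default_rng(1)
n = 100
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])  # intercept + one regressor
y = 1.0 + 2.0 * x + rng.normal(size=n) * (0.5 + np.abs(x))
beta_hat, cov = jackknife_ols_variance(X, y)
print("coefficients:", beta_hat)
print("jackknife standard errors:", np.sqrt(np.diag(cov)))
```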