Resampling Algorithm
A Resampling Algorithm is an estimation algorithm that repeatedly draws new samples from, or reshuffles, an observed data sample in order to estimate the sampling distribution of a statistic.
- AKA: Resampling Method.
- Context:
- It can be applied by a Resampling System (that solves a resampling task).
- Example(s):
- a Bootstrap Algorithm,
- a Jackknife Algorithm,
- a Permutation Test Algorithm,
- a Cross-Validation Algorithm.
- Counter-Example(s):
- See: Estimation Algorithm, Statistical Inference, Coefficient Of Variation, Poisson Sampling, Sample Distribution Estimation Algorithm.
References
2013
- (Wikipedia, 2013) ⇒ http://en.wikipedia.org/wiki/Resampling_(statistics) Retrieved:2013-12-4.
- In statistics, resampling is any of a variety of methods for doing one of the following:
- Estimating the precision of sample statistics (medians, variances, percentiles) by using subsets of available data (jackknifing) or drawing randomly with replacement from a set of data points (bootstrapping)
- Exchanging labels on data points when performing significance tests (permutation tests, also called exact tests, randomization tests, or re-randomization tests)
- Validating models by using random subsets (bootstrapping, cross validation)
- Common resampling techniques include bootstrapping, jackknifing and permutation tests.
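To make the bootstrap idea described above concrete, here is a minimal Python sketch (assuming NumPy is available; the function name `bootstrap_se` is illustrative, not from any cited source) that estimates the standard error of a sample median by repeatedly drawing with replacement from the observed data:

```python
import numpy as np

def bootstrap_se(x, stat=np.median, n_boot=2000, seed=0):
    """Bootstrap standard error: the standard deviation of a
    statistic recomputed over resamples drawn with replacement."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    reps = [stat(rng.choice(x, size=len(x), replace=True))
            for _ in range(n_boot)]
    return np.std(reps, ddof=1)

# Example: standard error of the median of 30 skewed observations.
x = np.random.default_rng(1).exponential(size=30)
print(bootstrap_se(x))
```

Passing a different `stat` (e.g. `np.var` or a percentile function) covers the other sample statistics mentioned above.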
2008
- (Shasha & Wilson, 2008) ⇒ Dennis Shasha, and Manda Wilson. (2008). “Statistics is Easy!” Synthesis Lectures on Mathematics and Statistics. doi:10.2200/S00142ED1V01Y200807MAS001
2003
- (Howell, 2003) ⇒ David C. Howell. (2003). http://www.uvm.edu/~dhowell/StatPages/Resampling/
- QUOTE: The tests that I am discussing seem to go under a variety of names, which doesn't make things any easier. The general phrase “resampling tests” applies to any situation in which the test is based on resampling scores from some pool of data. Bootstrapping and randomization tests are both examples of resampling tests. A name that has been around for some time is “permutation tests.” It refers to the fact that with randomization tests we permute the data into all sorts of different orders, and then calculate our test statistic on each permutation. The only problem with this name, as I see it, is that we aren't really taking permutations -- we are taking different combinations.
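A minimal sketch of the randomization test Howell describes (assuming NumPy and two independent groups; `permutation_test` is an illustrative name): each iteration re-labels the pooled scores at random and recomputes the test statistic, and the p-value is the fraction of re-labelings at least as extreme as the observed difference:

```python
import numpy as np

def permutation_test(a, b, n_perm=10_000, seed=0):
    """Two-sided randomization test for a difference in group means."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(a, float), np.asarray(b, float)
    pooled = np.concatenate([a, b])
    observed = a.mean() - b.mean()
    hits = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)  # exchange the labels
        stat = perm[:len(a)].mean() - perm[len(a):].mean()
        if abs(stat) >= abs(observed):
            hits += 1
    return (hits + 1) / (n_perm + 1)  # p-value with add-one smoothing

# Example: two small treatment groups.
print(permutation_test([12.6, 11.4, 13.2, 11.2], [8.4, 9.7, 10.1, 9.9]))
```

As Howell notes, only the split of the pooled scores into groups matters, so the procedure effectively samples combinations rather than permutations.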
- (Lahiri, 2003) ⇒ Soumendra N. Lahiri. (2003). “Resampling Methods for Dependent Data.” Springer.
- Book overview: This book gives a detailed account of bootstrap methods and their properties for dependent data, covering a wide range of topics such as block bootstrap methods, bootstrap methods in the frequency domain, resampling methods for long range dependent data, and resampling methods for spatial data. The first five chapters of the book treat the theory and applications of block bootstrap methods at the level of a graduate text. The rest of the book is written as a research monograph, with frequent references to the literature, but mostly at a level accessible to graduate students familiar with basic concepts in statistics. Supplemental background material is added in the discussion of such important issues as second order properties of bootstrap methods, bootstrap under long range dependence, and bootstrap for extremes and heavy tailed dependent data. Further, illustrative numerical examples are given all through the book and issues involving application of the methodology are discussed. The book fills a gap in the literature covering research on resampling methods for dependent data that has witnessed vigorous growth over the last two decades but remains scattered in various statistics and econometrics journals. It can be used as a graduate level text for a special topics course on resampling methods for dependent data and also as a research monograph for statisticians and econometricians who want to learn more about the topic and want to apply the methods in their own research. S.N. Lahiri is a professor of Statistics at the Iowa State University, is a Fellow of the Institute of Mathematical Statistics and a Fellow of the American Statistical Association.
- Keywords: random variables, variogram, M-estimator, sample mean, Smooth Function, random vectors, autoregressive process, block size, long-range dependence, stationary process, spectral density, sampling distribution, periodogram, conditional distribution, converges in distribution, Lebesgue measure, covariance matrix, probability measures, asymptotic variance, Gaussian process
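The block bootstrap methods treated in the book's opening chapters can be illustrated with a moving-block bootstrap sketch (assuming NumPy; the function name is illustrative): overlapping blocks of the series are drawn with replacement and concatenated, so that the dependence structure within each block is preserved:

```python
import numpy as np

def moving_block_bootstrap(x, block_len, seed=0):
    """One moving-block-bootstrap resample of a time series:
    draw overlapping blocks of length block_len with replacement
    and concatenate them, trimming to the original length."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    n = len(x)
    n_blocks = -(-n // block_len)  # ceiling division
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([x[s:s + block_len] for s in starts])[:n]
```

Choosing the block length is the key tuning decision: blocks must be long enough to capture the serial dependence that an ordinary (i.i.d.) bootstrap would destroy.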
1995
- (Shao & Tu, 1995) ⇒ Jun Shao, and Dongsheng Tu. (1995). “The Jackknife and Bootstrap.” Springer-Verlag. ISBN:0387945156
- Book overview: The jackknife and bootstrap are the most popular data-resampling methods used in statistical analysis. The resampling methods replace the theoretical derivations required in applying traditional methods (such as substitution and linearization) in statistical analysis by repeatedly resampling the original data and making inferences from the resamples.
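A minimal jackknife sketch in the same spirit (assuming NumPy; `jackknife_se` is an illustrative name): the statistic is recomputed on each leave-one-out sample, and the spread of those replicates, scaled by (n - 1) / n, estimates the standard error:

```python
import numpy as np

def jackknife_se(x, stat=np.mean):
    """Jackknife standard error: recompute the statistic on each
    leave-one-out sample and scale the spread of the replicates."""
    x = np.asarray(x)
    n = len(x)
    loo = np.array([stat(np.delete(x, i)) for i in range(n)])
    return np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))
```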
1993
- (Efron & Tibshirani, 1993) ⇒ Bradley Efron, and Robert Tibshirani. (1993). “An Introduction to the Bootstrap.” New York: Chapman and Hall.
- Keywords: standard error, confidence intervals, jackknife estimate, cross-validation, delta method, permutation test, Fisher information, histogram, exponential family, random variable, bootstrap computations, empirical distribution function, nonparametric, null hypothesis, bioequivalence, bootstrap samples, estimate bias, LSAT, standard deviation, importance sampling
1982
- (Efron, 1982) ⇒ Bradley Efron. (1982). “The Jackknife, the Bootstrap and Other Resampling Plans.” CBMS-NSF Regional Conference Series in Applied Mathematics.
1979
- (Efron, 1979) ⇒ Bradley Efron. (1979). “Bootstrap Methods: Another Look at the Jackknife.” In: The Annals of Statistics, 7(1). http://www.jstor.org/stable/2958830
- http://books.google.com/books?id=-uJ_auimaYkC&pg=PA569
- ABSTRACT: We discuss the following problem: given a random sample X = (X1, X2, ..., Xn) from an unknown probability distribution F, estimate the sampling distribution of some prespecified random variable R(X, F), on the basis of the observed data x. (Standard jackknife theory gives an approximate mean and variance in the case R(X, F) = θ(F̂) - θ(F), θ some parameter of interest.) A general method, called the "bootstrap," is introduced, and shown to work satisfactorily on a variety of estimation problems. The jackknife is shown to be a linear approximation method for the bootstrap. The exposition proceeds by a series of examples: variance of the sample median, error rates in a linear discriminant analysis, ratio estimation, estimating regression parameters, etc.
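The abstract's claim that the jackknife is a linear approximation to the bootstrap can be illustrated with the sample mean, for which the jackknife standard error coincides exactly with the classical s/√n while the bootstrap estimate approaches it as the number of resamples grows (a sketch assuming NumPy; the data are simulated purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(size=50)  # simulated data, for illustration only
n = len(x)

# Jackknife SE of the mean: algebraically equal to s / sqrt(n).
loo = np.array([np.delete(x, i).mean() for i in range(n)])
se_jack = np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))

# Bootstrap SE of the mean: converges to the same value as B grows.
boots = [rng.choice(x, size=n, replace=True).mean() for _ in range(5000)]
se_boot = np.std(boots, ddof=1)

print(se_jack, se_boot, x.std(ddof=1) / np.sqrt(n))  # all nearly equal
```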