Leave-One-Out Cross-Validation (LOOCV) Algorithm
A Leave-One-Out Cross-Validation (LOOCV) Algorithm is a Leave-K-Out Cross-Validation Algorithm where each cross-validation fold includes only one test item.
- Context:
- It can (typically) be a Time-Consuming Algorithm.
- It can (typically) produce a High Variance error estimate.
- …
- Example(s):
- …
- Counter-Example(s):
- See: Cross-Validation Algorithm, Jackknife Resampling.
References
2021
- (Wikipedia, 2021) ⇒ https://en.wikipedia.org/wiki/Cross-validation_(statistics)#Leave-one-out_cross-validation Retrieved: 2021-4-15.
- Leave-one-out cross-validation (LOOCV) is a particular case of leave-p-out cross-validation with p = 1. The process looks similar to jackknife; however, with cross-validation one computes a statistic on the left-out sample(s), while with jackknifing one computes a statistic from the kept samples only.
LOO cross-validation requires less computation time than LpO cross-validation because there are only [math]\displaystyle{ C^n_1=n }[/math] passes rather than [math]\displaystyle{ C^n_p }[/math]. However, [math]\displaystyle{ n }[/math] passes may still require quite a large computation time, in which case other approaches such as k-fold cross-validation may be more appropriate.
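To make these pass counts concrete, here is a minimal sketch using Python's standard-library math.comb; the values of n and p are arbitrary illustrations, not taken from the source:

```python
from math import comb

n, p = 20, 3
print(comb(n, 1))  # LOOCV: C(n, 1) = n = 20 passes
print(comb(n, p))  # leave-p-out: C(20, 3) = 1140 passes
```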
Pseudo-Code-Algorithm:
Input:
x, {vector of length N with x-values of incoming points}
y, {vector of length N with y-values of the expected result}
interpolate(x_in, y_in, x_out), {returns the estimate for point x_out after the model is trained with the x_in-y_in pairs}
Output:
err, {estimate for the prediction error}
Steps:
err ← 0
for i ← 1, ..., N do
    // define the cross-validation subsets
    x_in ← (x[1], ..., x[i − 1], x[i + 1], ..., x[N])
    y_in ← (y[1], ..., y[i − 1], y[i + 1], ..., y[N])
    x_out ← x[i]
    y_out ← interpolate(x_in, y_in, x_out)
    err ← err + (y[i] − y_out)^2
end for
err ← err/N
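The pseudo-code translates directly into Python. The following is a minimal sketch, with the interpolate callback standing in for whatever fit-and-predict routine is used; the nearest-neighbour predictor and the data in the usage example are illustrative assumptions:

```python
def loocv_error(x, y, interpolate):
    # Mean squared LOOCV error, following the pseudo-code above.
    # interpolate(x_in, y_in, x_out) is any user-supplied routine that
    # fits a model on the (x_in, y_in) pairs and predicts at x_out.
    n = len(x)
    err = 0.0
    for i in range(n):
        # Define the cross-validation subsets: every point except the i-th.
        x_in = x[:i] + x[i + 1:]
        y_in = y[:i] + y[i + 1:]
        y_out = interpolate(x_in, y_in, x[i])
        err += (y[i] - y_out) ** 2
    return err / n  # average squared prediction error over the N folds

# Usage: LOOCV of a nearest-neighbour predictor on made-up data.
def nearest(x_in, y_in, x_out):
    j = min(range(len(x_in)), key=lambda k: abs(x_in[k] - x_out))
    return y_in[j]

x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.1, 0.9, 2.1, 2.9, 4.2]
print(loocv_error(x, y, nearest))
```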
2020
- (Towards Data Science, 2020) ⇒ https://towardsdatascience.com/understanding-8-types-of-cross-validation-80c935a4976d
- QUOTE: ... 2. Leave-one-out cross-validation:
Leave-one-out cross-validation (LOOCV) is an exhaustive cross-validation technique. It is a category of LpOCV with the case of p=1.
LOOCV operations: for a dataset having n rows, the 1st row is selected for validation, and the rest (n−1) rows are used to train the model. For the next iteration, the 2nd row is selected for validation and the rest are used to train the model. The process is repeated in this way until n steps or the desired number of operations is reached.
Both of the above cross-validation techniques are types of exhaustive cross-validation. Exhaustive cross-validation methods are those that learn and test on all possible ways of splitting the data. They share the same pros and cons, discussed below:
- Pros:
- Simple, easy to understand, and implement.
- Cons:
- The model may lead to low bias.
- The computation time required is high.
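In practice, the row-by-row procedure quoted above is available off the shelf in scikit-learn via its LeaveOneOut splitter. The following is a minimal sketch; the synthetic dataset and the choice of LinearRegression are illustrative assumptions, not part of the original post:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Hypothetical data: a noisy line with 10 points.
X = np.arange(10, dtype=float).reshape(-1, 1)
y = 2.0 * X.ravel() + np.random.default_rng(0).normal(scale=0.1, size=10)

# One fit-and-score per left-out row: n fits in total.
scores = cross_val_score(LinearRegression(), X, y,
                         cv=LeaveOneOut(),
                         scoring="neg_mean_squared_error")
print(-scores.mean())  # LOOCV estimate of the mean squared error
```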