Residual Measure
A Residual Measure is a deviation measure that quantifies how far a numerical approximation (or an observed value) is from the exact solution (or a theoretical value).
- AKA: Regression Residual, Fitting Deviation, Numerical Residual.
- Context:
- It can be defined as [math]\displaystyle{ r=b-f(x_0) }[/math] for a numerical approximation [math]\displaystyle{ x_0 }[/math] of [math]\displaystyle{ x }[/math], where [math]\displaystyle{ f(x)=b }[/math].
- It can be defined as [math]\displaystyle{ r=\max_{x\in \mathcal X} |g(x)-T(f_{\rm a})(x)| }[/math] for an approximation [math]\displaystyle{ f_{\rm a} }[/math] of the solution [math]\displaystyle{ f }[/math] of the equation [math]\displaystyle{ T(f)(x)=g(x) }[/math].
- It can be defined as [math]\displaystyle{ r_i=X_i-\overline{X} }[/math] where [math]\displaystyle{ X_1, \dots, X_n }[/math] is a sample of random variables and [math]\displaystyle{ \overline{X} }[/math] is the sample mean (see the sketch after this list).
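A minimal sketch of the first and third definitions (Python/NumPy; the function [math]\displaystyle{ f }[/math], the target [math]\displaystyle{ b }[/math], and the sample data are illustrative assumptions, not taken from the sources below):

```python
import numpy as np

# Residual of a numerical approximation: r = b - f(x0), where f(x) = b.
# f, b, and x0 are hypothetical choices for illustration.
f = lambda x: x**2
b = 2.0        # we want x with f(x) = b, i.e. x = sqrt(2)
x0 = 1.414     # a numerical approximation of sqrt(2)
r = b - f(x0)  # the residual is computable even though x is unknown
print(r)       # ~0.000604

# Statistical residuals: r_i = X_i - mean(X) for a sample X_1, ..., X_n.
X = np.array([2.1, 1.9, 2.4, 2.0, 1.6])
residuals = X - X.mean()
print(residuals, residuals.sum())  # the residuals sum to (numerically) zero
```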
- See: Regression Algorithm, Least Squares Estimation Algorithm, Kernel Function, Studentized Residual, Numerical Approximation.
References
2016
- (Wikipedia, 2016) ⇒ http://en.wikipedia.org/wiki/Residual_(numerical_analysis) Retrieved 2016-08-07
- Loosely speaking, a residual is the error in a result. To be precise, suppose we want to find [math]\displaystyle{ x }[/math] such that [math]\displaystyle{ f(x)=b. }[/math] Given an approximation [math]\displaystyle{ x_0 }[/math] of [math]\displaystyle{ x }[/math], the residual is [math]\displaystyle{ b - f(x_0) }[/math] whereas the error is [math]\displaystyle{ x - x_0 }[/math].
- If we do not know x exactly, we cannot compute the error but we can compute the residual. (...) Similar terminology is used dealing with differential, integral and functional equations. For the approximation [math]\displaystyle{ ~f_{\rm a}~ }[/math] of the solution [math]\displaystyle{ ~f~ }[/math] of the equation [math]\displaystyle{ T(f)(x)=g(x) }[/math], the residual can either be the function [math]\displaystyle{ ~g(x)~ - ~T(f_{\rm a})(x) }[/math] or can be said to be the maximum of the norm of this difference [math]\displaystyle{ \max_{x\in \mathcal X} |g(x)-T(f_{\rm a})(x)| }[/math] over the domain [math]\displaystyle{ \mathcal X }[/math], where the function [math]\displaystyle{ ~f_{\rm a}~ }[/math] is expected to approximate the solution [math]\displaystyle{ ~f~ }[/math], or some integral of a function of the difference, for example [math]\displaystyle{ ~\int_{\mathcal X} |g(x)-T(f_{\rm a})(x)|^2~{\rm d} x. }[/math] In many cases, the smallness of the residual means that the approximation is close to the solution, i.e., [math]\displaystyle{ ~\left|\frac{f_{\rm a}(x) - f(x)}{f(x)}\right| \ll 1.~ }[/math] In these cases, the initial equation is considered as well-posed; and the residual can be considered as a measure of deviation of the approximation from the exact solution.
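To make the operator-equation form concrete, here is a small sketch (Python/NumPy; the operator [math]\displaystyle{ T }[/math], the right-hand side [math]\displaystyle{ g }[/math], and the approximation [math]\displaystyle{ f_{\rm a} }[/math] are illustrative choices, not taken from the source):

```python
import numpy as np

# Residual of an approximate solution of an operator equation T(f)(x) = g(x).
# Illustrative choice: T(f) = f' and g(x) = cos(x); with f(0) = 0 the exact
# solution on [0, 1] is f(x) = sin(x). We test the Taylor approximation
# f_a(x) = x - x**3/6, whose derivative is T(f_a)(x) = 1 - x**2/2.
xs = np.linspace(0.0, 1.0, 1001)        # discretized domain X
g = np.cos(xs)
T_fa = 1.0 - xs**2 / 2.0                # T applied to the approximation f_a

residual_fn = g - T_fa                  # the residual as a function on X
max_residual = np.max(np.abs(residual_fn))                 # max-norm form
l2_sq_residual = np.sum(residual_fn**2) * (xs[1] - xs[0])  # crude integral form
print(max_residual, l2_sq_residual)
```

Both the max-norm form and the integral form shrink as the approximation improves, matching the well-posedness remark in the quote above.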
- (Wikipedia, 2016) ⇒ http://en.wikipedia.org/wiki/Errors_and_residuals_in_statistics Retrieved 2016-08-07
- In statistics and optimization, errors and residuals are two closely related and easily confused measures of the deviation of an observed value of an element of a statistical sample from its "theoretical value". The error (or disturbance) of an observed value is the deviation of the observed value from the (unobservable) true value of a quantity of interest (for example, a population mean), and the residual of an observed value is the difference between the observed value and the estimated value of the quantity of interest (for example, a sample mean). The distinction is most important in regression analysis, where the concepts are sometimes called the regression errors and regression residuals and where they lead to the concept of studentized residuals (...) If we assume a normally distributed population with mean [math]\displaystyle{ \mu }[/math] and standard deviation [math]\displaystyle{ \sigma }[/math], and choose individuals independently, then we have [math]\displaystyle{ X_1, \dots, X_n\sim N(\mu,\sigma^2)\, }[/math] and the sample mean
- [math]\displaystyle{ \overline{X}={X_1 + \cdots + X_n \over n} }[/math]
- is a random variable distributed thus:
- [math]\displaystyle{ \overline{X}\sim N(\mu, \sigma^2/n). }[/math]
- The statistical errors are then [math]\displaystyle{ e_i=X_i-\mu,\, }[/math] whereas the residuals are [math]\displaystyle{ r_i=X_i-\overline{X}. }[/math]
- The sum of squares of the statistical errors, divided by [math]\displaystyle{ \sigma^2 }[/math], has a chi-squared distribution with [math]\displaystyle{ n }[/math] degrees of freedom:
- [math]\displaystyle{ \frac 1 {\sigma^2}\sum_{i=1}^n e_i^2\sim\chi^2_n. }[/math]
- This quantity, however, is not observable. The sum of squares of the residuals, on the other hand, is observable (...) In regression analysis, the distinction between errors and residuals is subtle and important, and leads to the concept of studentized residuals. Given an unobservable function that relates the independent variable to the dependent variable – say, a line – the deviations of the dependent variable observations from this function are the unobservable errors. If one runs a regression on some data, then the deviations of the dependent variable observations from the fitted function are the residuals.
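A small simulation makes the error/residual distinction tangible (Python/NumPy sketch; [math]\displaystyle{ \mu }[/math], [math]\displaystyle{ \sigma }[/math], and the sample size are arbitrary assumed values):

```python
import numpy as np

# Errors vs. residuals for a normal sample; mu, sigma, and n are assumed values.
rng = np.random.default_rng(0)
mu, sigma, n = 10.0, 2.0, 5
X = rng.normal(mu, sigma, size=n)

errors = X - mu            # e_i = X_i - mu: requires the unobservable true mean
residuals = X - X.mean()   # r_i = X_i - mean(X): computable from the data alone

print(errors.sum())        # generally nonzero
print(residuals.sum())     # always (numerically) zero
```

Because the sample mean is estimated from the same data, the residuals satisfy one linear constraint (they sum to zero), which is why the sum of squared residuals divided by [math]\displaystyle{ \sigma^2 }[/math] follows a chi-squared distribution with [math]\displaystyle{ n-1 }[/math] rather than [math]\displaystyle{ n }[/math] degrees of freedom.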
- (Wikipedia, 2016) ⇒ http://en.wikipedia.org/wiki/Residual Retrieved 2016-08-07
- A residual is generally a quantity left over at the end of a process. It may refer to ...
1997
- (Mitchell, 1997) ⇒ Tom M. Mitchell. (1997). “Machine Learning.” McGraw-Hill.
- Much of the literature on nearest-neighbor methods and weighted local regression uses a terminology that has arisen from the field of statistical pattern recognition....
- Regression means approximating a real-valued target function.
- Residual is the error [math]\displaystyle{ \hat{f}(x) - f(x) }[/math] in approximating the target function.
- Kernel function is the function of distance that is used to determine the weight of each training example. In other words, the kernel function is the function [math]\displaystyle{ K }[/math] such that [math]\displaystyle{ w_i = K(d(x_i, x_q)) }[/math].
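A minimal sketch tying the three terms together (Python/NumPy; the Gaussian kernel, bandwidth, and training data are illustrative assumptions, not Mitchell's examples):

```python
import numpy as np

# Kernel-weighted prediction at a query point x_q, illustrating w_i = K(d(x_i, x_q)).
# The Gaussian kernel, bandwidth, and training data are illustrative assumptions.
def gaussian_kernel(d, bandwidth=0.5):
    return np.exp(-(d / bandwidth) ** 2)

def locally_weighted_prediction(x_train, y_train, x_q):
    d = np.abs(x_train - x_q)                # distances d(x_i, x_q)
    w = gaussian_kernel(d)                   # weights w_i = K(d(x_i, x_q))
    return np.sum(w * y_train) / np.sum(w)   # kernel-weighted average

x_train = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y_train = np.sin(x_train)                    # the target function f is assumed
f_hat = locally_weighted_prediction(x_train, y_train, x_q=0.75)
residual = f_hat - np.sin(0.75)              # Mitchell's residual: f_hat(x) - f(x)
print(f_hat, residual)
```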