Sum of Squared Errors (SSE) Measure
A Sum of Squared Errors (SSE) Measure is a sum of squares over the residuals of an estimation model, i.e. the sum of the squared deviations of the model's predicted values from the observed values (formalized in the equation below the list).
- AKA: Sum of Squared Residuals (SSR), Residual Sum of Squares (RSS), Sum of Squared Errors of Prediction.
- Context:
- It can be a component of a Total Sum of Squares (when summed with an explained sum of squares).
- …
- Counter-Example(s):
- See: Square Operation, Errors and Residuals in Statistics, Optimality Criterion, Model Selection.
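In the usual notation (the symbols $n$, $y_i$, and $\hat{y}_i$ are introduced here for illustration and are not taken from the entry above), the measure can be written as

$$\mathrm{SSE} = \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2,$$

where $y_i$ is the $i$-th observed value, $\hat{y}_i$ is the corresponding value predicted by the estimation model, and $n$ is the number of data points.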
References
2015
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/residual_sum_of_squares Retrieved:2015-6-24.
- In statistics, the residual sum of squares (RSS), also known as the sum of squared residuals (SSR) or the sum of squared errors of prediction (SSE), is the sum of the squares of residuals (deviations of predicted from actual empirical values of data). It is a measure of the discrepancy between the data and an estimation model. A small RSS indicates a tight fit of the model to the data. It is used as an optimality criterion in parameter selection and model selection.
In general, total sum of squares = explained sum of squares + residual sum of squares. For a proof of this in the multivariate ordinary least squares (OLS) case, see partitioning in the general OLS model.
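The decomposition quoted above (total sum of squares = explained sum of squares + residual sum of squares) can be checked numerically for an ordinary least squares fit that includes an intercept. The following Python sketch uses NumPy; the data and the variable names (x, y, y_hat, etc.) are illustrative assumptions, not part of the source.

```python
import numpy as np

# Illustrative data: a noisy linear relationship (values are made up for this sketch).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=1.5, size=x.size)

# Ordinary least squares fit with an intercept term.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

# Residual sum of squares (SSE / RSS): squared deviations of predictions from data.
sse = np.sum((y - y_hat) ** 2)

# Explained sum of squares and total sum of squares.
ess = np.sum((y_hat - y.mean()) ** 2)
tss = np.sum((y - y.mean()) ** 2)

print(f"SSE (RSS) = {sse:.4f}")
print(f"ESS       = {ess:.4f}")
print(f"TSS       = {tss:.4f}")
print(f"ESS + SSE = {ess + sse:.4f}")  # equals TSS up to floating-point error
```

A smaller SSE relative to the total sum of squares corresponds to a tighter fit of the model to the data, which is why the quantity is used as an optimality criterion in parameter selection and model selection.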