Total Sum of Squares (TSS) Measure
A Total Sum of Squares (TSS) Measure is a sum of squares function that measures the total variation in a set of observations as the sum of the squared deviations of each observation from the overall mean.
- AKA: SST.
- Context:
- It can (typically) be equal to the sum of the Explained Sum of Squares and the Residual Sum of Squares.
- See: Sum of Squared Errors.
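The TSS = ESS + RSS decomposition noted above can be checked numerically. The following is an illustrative sketch (not from the article) that fits an ordinary least-squares line to toy data and verifies that the total sum of squares equals the explained plus residual sums of squares; all variable names and the toy data are assumptions for the example.

```python
# Toy data for the sketch (hypothetical values).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]
n = len(y)

x_bar = sum(x) / n
y_bar = sum(y) / n

# Ordinary least-squares slope and intercept.
slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
        sum((xi - x_bar) ** 2 for xi in x)
intercept = y_bar - slope * x_bar
y_hat = [intercept + slope * xi for xi in x]

# Total, explained, and residual sums of squares.
tss = sum((yi - y_bar) ** 2 for yi in y)
ess = sum((fi - y_bar) ** 2 for fi in y_hat)
rss = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))

print(tss, ess + rss)  # the two values agree up to floating-point error
```

For this data the fit is y = 2.2 + 0.6 x, giving TSS = 6.0, ESS = 3.6, and RSS = 2.4, so the decomposition holds exactly.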
References
2017
- (Wikipedia, 2017) ⇒ https://en.wikipedia.org/wiki/total_sum_of_squares Retrieved:2017-10-2.
- In statistical data analysis the total sum of squares (TSS or SST) is a quantity that appears as part of a standard way of presenting results of such analyses. It is defined as the sum, over all observations, of the squared differences of each observation from the overall mean. [1]

In statistical linear models (particularly in standard regression models), the TSS is the sum of the squared differences between the dependent variable and its mean: [math]\displaystyle{ \mathrm{TSS}=\sum_{i=1}^{n}\left(y_{i}-\bar{y}\right)^2 }[/math] where [math]\displaystyle{ \bar{y} }[/math] is the mean.

For wide classes of linear models, the total sum of squares equals the explained sum of squares plus the residual sum of squares. For a proof of this in the multivariate OLS case, see partitioning in the general OLS model.

In analysis of variance (ANOVA) the total sum of squares is the sum of the so-called "within-samples" sum of squares and the "between-samples" sum of squares, i.e., the partitioning of the sum of squares. In multivariate analysis of variance (MANOVA) the following equation applies: [2] [math]\displaystyle{ \mathbf{T} = \mathbf{W} + \mathbf{B}, }[/math] where T is the total sum of squares and products (SSP) matrix, W is the within-samples SSP matrix, and B is the between-samples SSP matrix.

Similar terminology may also be used in linear discriminant analysis, where W and B are respectively referred to as the within-groups and between-groups SSP matrices.
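The ANOVA partition described above (total = within-samples + between-samples) can also be checked directly. This is a minimal sketch, assuming hypothetical group data; the group labels and values are not from the article.

```python
# Hypothetical groups for a one-way ANOVA partition of the sum of squares.
groups = {
    "a": [3.0, 5.0, 4.0],
    "b": [8.0, 9.0, 7.0],
}

all_values = [v for vals in groups.values() for v in vals]
grand_mean = sum(all_values) / len(all_values)

# Total sum of squares: every observation against the grand mean.
ss_total = sum((v - grand_mean) ** 2 for v in all_values)

# Between-samples: group size times squared deviation of each group mean.
ss_between = sum(
    len(vals) * (sum(vals) / len(vals) - grand_mean) ** 2
    for vals in groups.values()
)

# Within-samples: each observation against its own group mean.
ss_within = sum(
    (v - sum(vals) / len(vals)) ** 2
    for vals in groups.values()
    for v in vals
)

print(ss_total, ss_between + ss_within)  # equal up to floating-point error
```

Here the group means are 4 and 8 and the grand mean is 6, so ss_total = 28 splits into ss_between = 24 and ss_within = 4.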