Statistics - Residual Sum of Squares (RSS) = Squared Loss?
1 - About
The Residual Sum of Squares (RSS) is defined below and is used in the least squares method to estimate the regression coefficients.
Minimizing the residual sum of squares is equivalent to maximizing R-squared (<math>R^2</math>).
The deviance is a generalization of the residual sum of squares.
Squared loss = <math>(y-\hat{y})^2</math>. The RSS is this squared loss summed over all <math>N</math> observations.
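As a quick illustration, here is a minimal sketch in Python (NumPy assumed; the data values are made up) that sums the per-observation squared losses into the RSS and checks the relation to <math>R^2</math>:
```python
import numpy as np

# Made-up observed values and model predictions
y = np.array([3.0, 5.0, 7.0, 9.0])
y_hat = np.array([2.8, 5.1, 7.3, 8.9])

# Squared loss per observation: (y - y_hat)^2
squared_loss = (y - y_hat) ** 2

# RSS is the squared loss summed over all observations
rss = squared_loss.sum()

# Total sum of squares and R-squared
tss = ((y - y.mean()) ** 2).sum()
r_squared = 1 - rss / tss

print(rss)        # 0.15
print(r_squared)  # 0.9925
```
Since the total sum of squares is fixed for a given dataset, minimizing the RSS maximizes <math>R^2 = 1 - RSS/TSS</math>.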
2 - Equation
<MATH> \begin{array}{rrl} \text{Residual Sum of Squares (RSS)} & = & \sum_{i=1}^{\href{sample_size}{N}}(\href{residual}{\text{residual}_i})^2 \\ RSS & = & \sum_{i=1}^{\href{sample_size}{N}}(\href{residual}{e_i})^2 \\ RSS & = & \sum_{i=1}^{\href{sample_size}{N}}(Y_i-\hat{Y}_i)^2 \\ \end{array} </MATH>
where:
- <math>Y</math> is the observed target score
- <math>\hat{Y}</math> is the predicted target score (i.e. the output of the regression equation, the model)
- <math>e</math> is the residual: <math>e_i = Y_i - \hat{Y}_i</math>
3 - Examples
3.1 - Simple Regression
<MATH> \begin{array}{rrl} RSS & = & \sum_{i=1}^{\href{sample_size}{N}}(Y_i-\hat{B}_0-\hat{B}_1 X_i)^2 \\ \end{array} </MATH>
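As a sketch of how the least squares estimates make this RSS as small as possible (Python with NumPy assumed; the data are made up, and the `rss` helper is just for illustration), the closed-form slope and intercept come from the sample means, and any perturbation of the coefficients increases the RSS:
```python
import numpy as np

# Made-up data for a simple regression Y = B0 + B1 * X + error
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form least squares estimates
b1 = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
b0 = y.mean() - b1 * x.mean()

def rss(b0, b1):
    """RSS = sum_i (Y_i - B0 - B1 * X_i)^2"""
    return ((y - b0 - b1 * x) ** 2).sum()

print(b0, b1)             # estimated coefficients: 0.14, 1.96
print(rss(b0, b1))        # minimal RSS
print(rss(b0, b1 + 0.1))  # perturbing a coefficient increases the RSS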
3.2 - Multiple Regression
<MATH> \begin{array}{rrl} RSS & = & \sum_{i=1}^{\href{sample_size}{N}}(Y_i-\hat{B}_0-\hat{B}_1 X_{i1}-\dots-\hat{B}_P X_{iP})^2 \\ & = & \sum_{i=1}^{\href{sample_size}{N}}(Y_i-\hat{B}_0-\sum_{j=1}^{\href{dimension}{P}}\hat{B}_j X_{ij})^2 \\ \end{array} </MATH>
where <math>P</math> is the number of predictors.
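For the multiple regression case, a minimal sketch (again Python with NumPy assumed, made-up data) uses `np.linalg.lstsq`, which finds the coefficient vector minimizing exactly this RSS:
```python
import numpy as np

# Made-up data: N = 6 observations, P = 2 predictors
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 6.0],
              [6.0, 5.0]])
y = np.array([5.1, 4.2, 10.3, 9.0, 15.2, 14.1])

# Prepend a column of ones so B0 acts as the intercept
A = np.column_stack([np.ones(len(X)), X])

# lstsq minimizes ||y - A @ b||^2, i.e. the RSS
b, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)

rss = ((y - A @ b) ** 2).sum()
print(b)    # [B0_hat, B1_hat, B2_hat]
print(rss)  # equals residuals[0] when the design matrix has full rank
```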