Regression is a statistical analysis used to predict the score of an outcome variable based on the scores of:
- one predictor variable: simple regression
- or multiple predictor variables: multiple regression
Regression analysis is a statistical process (a supervised function) for estimating the relationships among variables.
Regression analysis helps one understand how the typical value of the (outcome|dependent) variable changes when any one of the (predictor|independent) variables is varied while the other (predictor|independent) variables are held fixed (i.e. which of the independent variables are related to the dependent variable).
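The idea of "varying one predictor while holding the others fixed" can be sketched with a multiple regression on synthetic data (the data and true coefficients below are assumptions for illustration): each fitted coefficient estimates the change in the outcome per unit change in that predictor, with the other predictor held fixed.

```python
import numpy as np

# Minimal sketch on synthetic data: y depends on two predictors,
# and each regression coefficient recovers the per-predictor effect
# with the other predictor held fixed.
rng = np.random.default_rng(42)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 3.0 * x2 + rng.normal(scale=0.1, size=n)

X = np.column_stack([np.ones(n), x1, x2])   # intercept + two predictors
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# beta[1] ~ 2 (effect of x1 with x2 fixed), beta[2] ~ -3
print(np.round(beta, 1))
```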
The term “regression” was coined by Francis Galton in the nineteenth century to describe a biological phenomenon. The phenomenon was that the heights of descendants of tall ancestors tend to regress down towards a normal average (a phenomenon also known as regression toward the mean). For Galton, regression had only this biological meaning, but his work was later extended by Udny Yule and Karl Pearson to a more general statistical context.
“Regression” comes historically from this idea of regression towards the mean, discussed in the early 1900s. We have to live with the term because it has become time-honored, but you can mentally replace it with “model”.
Example applications:
- Given demographic and purchasing data about a set of customers, predict the customers' age.
- Predict customer lifetime value, house value, or process yield rates.
Many (techniques|methods) for carrying out regression analysis have been developed.
|Technique|Linear|Notes|
|---|---|---|
|Data Mining - (Global) Polynomial Regression (Degree)|Yes| |
|Statistics - Standard Least Squares Fit (Gaussian linear model)| | |
|Ordinary least squares regression|Yes|The earliest form of regression, published by Legendre in 1805 and by Gauss in 1809.|
|Multiple Regression (GLM)| | |
|Support Vector Machine (SVM)| | |
|LeastMedSq| |LeastMedSq gives an accurate regression line even when there are outliers. However, it is computationally very expensive. In practice it is common to delete outliers manually and then use LinearRegression.|
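The note on LeastMedSq above points at a real weakness of ordinary least squares: a single outlier can drag the fitted line away from the bulk of the data, and deleting the outlier by hand recovers it. A small sketch on synthetic data (the line, noise level, and outlier are assumptions for illustration):

```python
import numpy as np

# Minimal sketch on synthetic data: ordinary least squares is sensitive
# to outliers; removing the outlier manually recovers the true slope (2).
rng = np.random.default_rng(7)
x = np.linspace(0, 10, 30)
y = 2.0 * x + 1.0 + rng.normal(scale=0.2, size=x.size)
y[-1] = 100.0                                # a single gross outlier

def ols_slope(x, y):
    # Slope of the least-squares line through (x, y)
    X = np.column_stack([np.ones(x.size), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

slope_all = ols_slope(x, y)               # pulled away from 2 by the outlier
slope_clean = ols_slope(x[:-1], y[:-1])   # outlier deleted: close to 2

print(round(slope_all, 2), round(slope_clean, 2))
```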
How to improve the model?
The goal is to produce better models so we can generate more accurate predictions.
We can improve a model by:
- Adding more predictor variables (but this increases variance and the risk of overfitting).
- Selecting features. When we want to predict better, we shrink (regularize) coefficients or select a subset of features in order to improve the prediction.
When we're doing regression, we're also engaging in inferential statistics, and we look at statistics such as:
- the p-value (in order to make probability judgements)
in order to know whether the results from this sample will generalize to other samples, i.e. whether it is possible to make an inference from this sample data to a more general population.
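The p-values above can be computed by hand from the least-squares fit: each coefficient's t-statistic is the estimate divided by its standard error, compared against a t distribution with the residual degrees of freedom. A sketch on synthetic data (the data-generating model is an assumption for illustration):

```python
import numpy as np
from scipy import stats

# Minimal sketch on synthetic data: t-statistics and two-sided p-values
# for the intercept and slope of a simple regression.
rng = np.random.default_rng(1)
n = 100
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)          # true slope 2, intercept 0

X = np.column_stack([np.ones(n), x])      # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
df = n - X.shape[1]                       # residual degrees of freedom
sigma2 = resid @ resid / df               # residual variance estimate
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
t = beta / se
p = 2 * stats.t.sf(np.abs(t), df)         # two-sided p-values

print(p[1] < 0.05)                        # slope is clearly significant here
```

In R, the same numbers appear in the coefficient table printed by `summary()` on a fitted `lm` model.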
The lm function