About
Regularization refers to a process of introducing additional information in order to:
- solve an ill-posed problem
- or to prevent overfitting.
This information usually takes the form of a penalty for complexity, such as restrictions on smoothness or bounds on the norm of the parameter vector.
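In symbols, this amounts to minimizing a data-fit loss plus a weighted complexity penalty (a generic sketch in conventional notation, not taken from this article):

```latex
\hat{\theta} \;=\; \arg\min_{\theta} \; L(\theta;\, \text{data}) \;+\; \lambda\, \Omega(\theta)
```

where L is the data-fit loss, Ω is the complexity penalty (for example a norm of θ), and λ ≥ 0 trades the two off.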
Techniques
Least Squares
The least-squares method can be viewed as a very simple form of regularization: for an underdetermined system, the standard solution (computed via the pseudoinverse) is the minimum-norm vector among all exact fits, an implicit preference for small parameters.
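A minimal illustration of that viewpoint (hypothetical data; NumPy's lstsq is documented to return the minimum-norm solution for underdetermined systems):

```python
import numpy as np

# Hypothetical underdetermined system: 3 equations, 5 unknowns,
# so infinitely many parameter vectors fit the data exactly.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 5))
b = rng.normal(size=3)

# Among all exact solutions, lstsq returns the one with the
# smallest Euclidean norm -- an implicit preference for small
# parameters, which is the regularization viewpoint.
x, *_ = np.linalg.lstsq(A, b, rcond=None)

print("residual:", np.linalg.norm(A @ x - b))   # ~0: exact fit
print("solution norm:", np.linalg.norm(x))      # minimal among exact fits
```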
Linear Regression
In statistics and machine learning, regularization methods are used for model selection, in particular to prevent overfitting by penalizing models with extreme parameter values. The most common variants in machine learning are L₁ and L₂ regularization.
When applied in linear regression, the resulting models are termed ridge regression (L₂ penalty) or lasso (L₁ penalty).
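A minimal sketch with scikit-learn (synthetic data and arbitrary penalty strengths, chosen only for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

# Synthetic data: only 2 of the 10 features actually matter.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
true_w = np.zeros(10)
true_w[:2] = [3.0, -2.0]
y = X @ true_w + 0.1 * rng.normal(size=50)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)   # L2 penalty: shrinks all coefficients
lasso = Lasso(alpha=0.1).fit(X, y)   # L1 penalty: sets some coefficients to 0

print("OLS  :", np.round(ols.coef_, 2))
print("Ridge:", np.round(ridge.coef_, 2))
print("Lasso:", np.round(lasso.coef_, 2))  # sparse: irrelevant features zeroed
```

The qualitative difference shows up in the coefficients: ridge shrinks everything a little, while lasso drives the coefficients of irrelevant features exactly to zero.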
See also: Statistics - Shrinkage (Regularization) of Regression Coefficients
Regularization is also employed in:
- (binary and multiclass) logistic regression (see the sketch below).
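A minimal multiclass sketch with scikit-learn (the Iris dataset serves only as a convenient example; C is the inverse of the regularization strength, and the values here are arbitrary):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

# Smaller C means a stronger L2 penalty on the coefficients.
weak = LogisticRegression(C=100.0, max_iter=1000).fit(X, y)
strong = LogisticRegression(C=0.01, max_iter=1000).fit(X, y)

# Stronger regularization shrinks the coefficients toward zero.
print("mean |coef|, weak penalty  :", abs(weak.coef_).mean())
print("mean |coef|, strong penalty:", abs(strong.coef_).mean())
```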
Bayes
From a Bayesian point of view, many regularization techniques correspond to imposing certain prior distributions on model parameters.
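For instance, with a Gaussian noise model and a zero-mean Gaussian prior on the weights, the maximum a posteriori (MAP) estimate reduces to ridge regression (a sketch of the standard derivation, in conventional notation):

```latex
% Model: y \mid X, w \sim \mathcal{N}(Xw, \sigma^2 I), \qquad prior: w \sim \mathcal{N}(0, \tau^2 I)
\hat{w}_{\mathrm{MAP}}
  = \arg\max_w \; p(y \mid X, w)\, p(w)
  = \arg\min_w \; \lVert y - Xw \rVert_2^2 + \frac{\sigma^2}{\tau^2}\, \lVert w \rVert_2^2
```

i.e. ridge regression with λ = σ²/τ². Replacing the Gaussian prior with a Laplace prior yields the L₁ (lasso) penalty in the same way.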