Bootstrap aggregating (bagging) is a machine learning ensemble meta-algorithm designed to improve the stability and accuracy of machine learning algorithms used in statistical classification and regression.
It also reduces variance and helps avoid over-fitting. Although it is usually applied to decision tree methods, it can be used with any type of model.
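As an illustration of the idea, here is a minimal pure-Python sketch of bagging: each "model" is a 1-nearest-neighbour predictor trained on a bootstrap sample, and predictions are combined by majority vote. The names `nn_predict` and `bagged_predict` are illustrative, not from any particular library.

```python
import random
from collections import Counter

def nn_predict(train, x):
    """1-nearest-neighbour prediction: label of the closest training point."""
    return min(train, key=lambda point: abs(point[0] - x))[1]

def bagged_predict(train, x, n_models=25, rng=None):
    """Fit each base model on a bootstrap sample; combine by majority vote."""
    rng = rng or random.Random()
    votes = []
    for _ in range(n_models):
        # Bootstrap sample: len(train) draws *with replacement*.
        sample = [rng.choice(train) for _ in range(len(train))]
        votes.append(nn_predict(sample, x))
    return Counter(votes).most_common(1)[0][0]

train = [(0.0, "a"), (1.0, "a"), (2.0, "b"), (3.0, "b")]
print(bagged_predict(train, 0.4, rng=random.Random(1)))
```

Because every base model sees a different bootstrap sample, their individual errors tend to differ, and averaging (or voting) over them reduces the variance of the combined prediction.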
Bagging:
Bagging can be parallelized.
It is especially effective for "unstable" learning schemes, meaning schemes where a small change in the training data can cause a large change in the resulting model.
Example: decision trees are a very unstable scheme, whereas Naïve Bayes and instance-based learning are not, because in those methods each attribute contributes independently and a few changed instances barely alter the model.
In bagging, you sample the training set "with replacement", which means the same instance may appear more than once in a sample (and some instances may not appear at all).
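Sampling with replacement can be sketched in a few lines of Python; the helper name `bootstrap_sample` is illustrative, not from any library.

```python
import random

def bootstrap_sample(data, rng=None):
    """Draw a bootstrap sample: len(data) draws with replacement."""
    rng = rng or random.Random()
    return [rng.choice(data) for _ in range(len(data))]

data = list(range(10))
sample = bootstrap_sample(data, random.Random(0))
# Duplicates are expected, and some original items are usually missing.
print(sorted(sample))
```

On average a bootstrap sample contains about 63% of the distinct original instances; the rest are "out-of-bag" and can be used to estimate the model's error.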