2. Overview
Adaptive boosting (AdaBoost) is based on three ideas:
1. Aggregate multiple weak learners
2. Give each weak learner its own weight
3. Resample (reweight) the training data each round
Advantages
1. Reduces bias
2. Suited to models with low variance and high bias
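The three ideas above can be sketched in a few lines of Python. This is a minimal illustrative implementation, assuming one-dimensional decision stumps as the weak learners and the standard exponential reweighting rule; it is not the only way to realize the algorithm.

```python
import numpy as np

def stump_predict(x, threshold, polarity):
    """A decision stump: predicts +1 on one side of the threshold, -1 on the other."""
    return polarity * np.where(x < threshold, 1, -1)

def fit_adaboost(x, y, n_rounds=3):
    """Train AdaBoost with stumps on 1-D data x, labels y in {-1, +1}."""
    n = len(x)
    w = np.full(n, 1.0 / n)          # idea 3: per-sample weights, updated each round
    stumps, alphas = [], []
    thresholds = np.unique(x) + 0.5  # candidate splits between the sample points
    for _ in range(n_rounds):
        # Pick the stump with the smallest weighted training error
        # (ties broken by taking the first stump found).
        best = None
        for thr in thresholds:
            for pol in (1, -1):
                err = np.sum(w * (stump_predict(x, thr, pol) != y))
                if best is None or err < best[0]:
                    best = (err, thr, pol)
        err, thr, pol = best
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))  # idea 2: learner weight
        pred = stump_predict(x, thr, pol)
        w = w * np.exp(-alpha * y * pred)  # upweight the misclassified points
        w = w / w.sum()
        stumps.append((thr, pol))
        alphas.append(alpha)
    return stumps, alphas

def predict(x, stumps, alphas):
    # idea 1: aggregate the weak learners by a weighted majority vote
    f = sum(a * stump_predict(x, thr, pol) for a, (thr, pol) in zip(alphas, stumps))
    return np.sign(f)
```

On the ten-point example used later in these slides, with the assumed labeling +,+,+,-,-,-,+,+,+,- for x = 0..9, three rounds of this sketch produce learner weights of about 0.4236, 0.6496, and 0.752 and classify every point correctly.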
4. Example of AdaBoost
[Figure: the training data, ten points at x = 0, 1, ..., 9 on a number line]
For example, there are ten one-dimensional points; some are labeled positive and others negative (the labels were shown as colored marks in the original figure).
We want to use AdaBoost to classify these 10 points.
5. Running the algorithm (1)
[Figure: the training data with the decision boundary chosen at iteration t = 1; t denotes the iteration number]
The first weak learner is the stump: if x < 2.5 then h1(x) = 1, else h1(x) = -1.
If f(x) > 0, the point is classified as positive;
if f(x) < 0, it is classified as negative.
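The first round's weighted error and learner weight can be computed as follows. The class labels below are an assumed reconstruction (the slide showed them only as colored marks), and the weight formula alpha = 0.5 * ln((1 - eps) / eps) is the standard AdaBoost one:

```python
import numpy as np

# Toy data from the slides: ten points at x = 0..9. The labels are an
# assumption, chosen so the first stump's error matches the slides.
x = np.arange(10)
y = np.array([1, 1, 1, -1, -1, -1, 1, 1, 1, -1])

# Round t = 1: the stump "if x < 2.5 then +1 else -1".
h1 = np.where(x < 2.5, 1, -1)

w = np.full(10, 0.1)                      # initial weights are uniform
eps1 = np.sum(w * (h1 != y))              # weighted error, here ~0.3
alpha1 = 0.5 * np.log((1 - eps1) / eps1)  # learner weight, here ~0.4236
```

With three misclassified points out of ten, the weighted error is 0.3 and the learner weight comes out to about 0.4236, matching the coefficient used later in the slides.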
8. Running the algorithm (4)
[Figure: the training data with the three decision boundaries from iterations t = 1, t = 2, and t = 3]
For example, to decide whether point 4 is positive or negative, we evaluate the weighted vote of the three weak learners:
f3(4) = 0.423649 * (-1) + 0.6496 * (-1) + 0.752 * 1
      = -0.321249
Since f3(4) < 0, point 4 is classified as negative.
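The same weighted vote as a few lines of Python; the learner weights and the per-learner votes on point 4 are taken directly from the example above:

```python
# Weighted vote for point x = 4. The alphas are the learner weights from the
# three rounds; votes holds each weak learner's prediction h_t(4).
alphas = [0.423649, 0.6496, 0.752]
votes = [-1, -1, 1]
f3 = sum(a * h for a, h in zip(alphas, votes))  # -0.321249
label = 1 if f3 > 0 else -1                     # negative class
```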
9. Benefits
1. AdaBoost is a highly accurate classifier.
2. Many kinds of models can serve as the weak sub-classifiers.
3. When simple weak classifiers are used, the resulting model is easy to interpret.
4. No feature selection is needed.
5. In practice it is often resistant to overfitting.