Question:

AdaBoost (Adaptive Boosting) is another approach in the ensemble method field.
It always uses the entire data and all the features (unlike before) and aims to create T weighted
classifiers (unlike before, where each classifier had the same influence). The new
classification is decided by a linear combination of all the classifiers:
g(x) = sign( Σ_{t=1}^{T} α_t · f_t(x) ),   α_t ≥ 0
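The weighted vote above can be sketched in a few lines of Python. The stumps f1, f2 and the weights below are illustrative placeholders, not the ones from this exercise:

```python
def adaboost_predict(x, stumps, alphas):
    """Weighted majority vote: g(x) = sign(sum_t alpha_t * f_t(x))."""
    votes = sum(a * f(x) for f, a in zip(stumps, alphas))
    return 1 if votes >= 0 else -1

# Hypothetical example: two axis-parallel stumps in R^2.
f1 = lambda x: 1 if x[0] > 0.5 else -1   # split on the first coordinate
f2 = lambda x: 1 if x[1] > 0.3 else -1   # split on the second coordinate

# votes = 1.2*(+1) + 0.8*(-1) = 0.4 >= 0, so the ensemble predicts +1
print(adaboost_predict([0.7, 0.1], [f1, f2], [1.2, 0.8]))  # → 1
```

Note that a stump with a larger α_t can be outvoted only by a coalition of other stumps whose weights sum higher, which is why the α values matter in the questions below.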
Consider the following dataset in ℝ²:
The first decision stump is already drawn; the arrow
points in the positive direction. Calculate the
classifier error (ε₁) and weight (α₁).
Calculate the new weights of the samples (and
normalize them to get a valid distribution).
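The standard AdaBoost update that this step asks for can be sketched as follows, using made-up sample weights rather than the exercise's values: the classifier weight is α_t = ½·ln((1−ε_t)/ε_t), each sample weight is multiplied by exp(∓α_t) depending on whether it was classified correctly, and the weights are then renormalized:

```python
import math

def adaboost_round(weights, correct):
    """One AdaBoost round: weighted error, classifier weight alpha,
    and the renormalized sample weights.
    `correct[i]` is True when the stump classifies sample i correctly."""
    eps = sum(w for w, c in zip(weights, correct) if not c)  # weighted error
    alpha = 0.5 * math.log((1 - eps) / eps)
    # Misclassified samples are up-weighted, correct ones down-weighted.
    new_w = [w * math.exp(-alpha if c else alpha)
             for w, c in zip(weights, correct)]
    z = sum(new_w)                       # normalization constant
    return eps, alpha, [w / z for w in new_w]

# Hypothetical round: 4 equally weighted samples, one misclassified.
eps, alpha, w = adaboost_round([0.25] * 4, [True, True, True, False])
# eps = 0.25, alpha = 0.5*ln(3) ≈ 0.55, and the misclassified
# sample ends up carrying weight 0.5 after normalization.
```

A useful sanity check: after normalization, the misclassified samples together always carry exactly half the total weight, so the stump just found would have weighted error 0.5 on the new distribution.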
Draw the second decision stump. Reminder: the decision stumps (our classifiers)
are parallel to the x/y axes.
Without calculations, which classifier's weight is larger, α₁ or α₂? Explain why.
In the right image, the dataset and the weight of each point are shown after finding
the third decision stump and calculating the new weights. Which of the two candidates
(green or blue) is the correct third decision stump?
Given α₂ = 1.1, α₃ = 0.62, draw the full
classifier, like in slide 13.
What is the train accuracy?
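The train accuracy of the final weighted ensemble can be checked mechanically: run the weighted vote over every training point and count the matches. The stump and data below are an illustrative toy set, not the exercise's points:

```python
def ensemble_accuracy(X, y, stumps, alphas):
    """Fraction of samples where the weighted vote matches the label."""
    def g(x):
        s = sum(a * f(x) for f, a in zip(stumps, alphas))
        return 1 if s >= 0 else -1
    return sum(g(x) == yi for x, yi in zip(X, y)) / len(y)

# Hypothetical toy set that a single axis-parallel stump separates perfectly.
f1 = lambda x: 1 if x[0] > 0 else -1
X = [(1, 0), (-1, 0), (2, 1), (-3, 2)]
y = [1, -1, 1, -1]
print(ensemble_accuracy(X, y, [f1], [1.0]))  # → 1.0
```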