Question: Problem 4 (15 points). Consider the AdaBoost algorithm we discussed in class. AdaBoost is an example of ensemble classifiers, where the weights for the next round are decided based on the training error of the weak classifier learned on the current weighted training set. We wish to run AdaBoost on the dataset provided in the table below.
Instance  Color   Size   Shape      Edible?
D1        Yellow  Small  Round      Yes
D2        Yellow  Small  Round      No
D3        Green   Small  Irregular  Yes
D4        Green   Large  Irregular  No
D5        Yellow  Large  Round      Yes
D6        Yellow  Small  Round      Yes
D7        Yellow  Small  Round      Yes
D8        Yellow  Small  Round      Yes
D9        Green   Small  Round      No
D10       Yellow  Large  Round      No
D11       Yellow  Large  Round      Yes
D12       Yellow  Large  Round      No
D13       Yellow  Large  Round      No
D14       Yellow  Large  Round      No
D15       Yellow  Small  Irregular  Yes
D16       Yellow  Large  Irregular  Yes
Table: Mushroom data with 16 instances, three categorical features, and binary labels.
(a) Assume we choose the following decision stump (a shallow tree with a single decision node) as the first predictor f_1, i.e., the stump learned when the training instances are weighted uniformly. What would be the weight of f_1 in the final ensemble classifier, i.e., \alpha_1 in f(x) = \sum_{i=1}^{K} \alpha_i f_i(x)?

(b) After computing f_1, we proceed to the next round of AdaBoost. We begin by recomputing the data weights, depending on the error of f_1 and on whether each point was misclassified by f_1. What is the weight of each instance in the second boosting iteration, i.e., after the points have been reweighted? Please note that the weights across the training set are to be initialized uniformly.

(c) In AdaBoost, would you stop the iteration if the error rate of the current weak classifier on the weighted training data is
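Since the specific stump referred to in part (a) is not reproduced in the source, the sketch below does not answer the problem; it only illustrates the standard AdaBoost quantities the question asks about, using an assumed placeholder error. The classifier weight is \alpha = (1/2) ln((1-\epsilon)/\epsilon), and misclassified points are upweighted by e^{\alpha} while correctly classified points are downweighted by e^{-\alpha}, followed by renormalization.

```python
import math

def alpha_from_error(eps):
    """AdaBoost weight of a weak classifier with weighted error eps."""
    return 0.5 * math.log((1 - eps) / eps)

def reweight(weights, correct, alpha):
    """Update per-instance weights after one round and renormalize.

    weights: current per-instance weights (summing to 1)
    correct: booleans, True where the stump classified the point correctly
    """
    new_w = [w * math.exp(-alpha if ok else alpha)
             for w, ok in zip(weights, correct)]
    z = sum(new_w)  # normalization constant
    return [w / z for w in new_w]

# 16 uniformly initialized weights, as the problem specifies.
n = 16
w = [1 / n] * n

# Placeholder assumption (NOT the stump from the problem): suppose the
# stump misclassifies 4 of the 16 points.
correct = [True] * 12 + [False] * 4
eps = sum(wi for wi, ok in zip(w, correct) if not ok)  # weighted error 0.25
alpha = alpha_from_error(eps)                          # 0.5 * ln(3)

w2 = reweight(w, correct, alpha)
# With these numbers, each correct point drops to 1/24, each misclassified
# point rises to 1/8, and the new weights again sum to 1.
```

A useful sanity check on any such reweighting: under the new weights, the stump just learned has weighted error exactly 1/2 (here, 4 * 1/8 = 0.5), which is why the same stump cannot be chosen again in the next round.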
