Working with weighted data. Boosting requires the weak learning algorithm to work with training data
in which each point (x(i), y(i)) has a positive weight w_i > 0. Intuitively, a weight of two would be
equivalent to having two copies of that data point.
How would you incorporate weights into the following learning algorithms without explicitly making
copies of data points?
(a) Decision trees.
Hint: Previously, we measured the uncertainty at any node by looking at the fraction of points
with each label (call these p_1, ..., p_k if there are k labels). Now we need to compute these fractions
differently, taking the weights w_i into account.
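Not part of the original question, but the hint for part (a) can be sketched in code. The idea is to replace raw label counts with sums of weights; the function names below are illustrative, not from any particular library:

```python
import numpy as np

def weighted_label_fractions(labels, weights):
    """Fraction of the total weight at this node carried by each label.

    With all weights equal to 1 this reduces to the usual label fractions
    p_1, ..., p_k; a point of weight 2 counts like two identical copies.
    """
    total = weights.sum()
    classes = np.unique(labels)
    return {c: weights[labels == c].sum() / total for c in classes}

def weighted_entropy(labels, weights):
    """Node uncertainty (entropy) computed from weighted fractions."""
    fracs = np.array(list(weighted_label_fractions(labels, weights).values()))
    fracs = fracs[fracs > 0]  # 0 * log 0 is treated as 0
    return -np.sum(fracs * np.log2(fracs))
```

For example, labels [0, 0, 1] with weights [1, 1, 2] give fractions {0: 0.5, 1: 0.5}, exactly as if the third point appeared twice.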
(b) Gaussian generative models.
Hint: For each class j, we need the weight of that class π_j, the mean μ_j, and the covariance
matrix Σ_j. Now we need to compute them differently, taking the weights w_i into account.
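Not part of the original question, but the weighted estimates the hint asks for can be sketched as follows (a sketch under the assumption that π_j, μ_j, Σ_j are estimated by weighted averages, which is what maximum likelihood on duplicated points gives):

```python
import numpy as np

def weighted_gaussian_params(X, y, weights):
    """Per-class prior pi_j, mean mu_j, covariance Sigma_j with point weights.

    Each sum over points in class j is replaced by a weighted sum, so a
    point of weight 2 contributes like two identical copies.
    """
    params = {}
    W = weights.sum()  # total weight of all points
    for j in np.unique(y):
        w = weights[y == j]
        Xj = X[y == j]
        pi_j = w.sum() / W                                  # weighted class fraction
        mu_j = (w[:, None] * Xj).sum(axis=0) / w.sum()      # weighted mean
        diff = Xj - mu_j
        # weighted average of outer products (x - mu)(x - mu)^T
        Sigma_j = (w[:, None, None] *
                   np.einsum('ni,nj->nij', diff, diff)).sum(axis=0) / w.sum()
        params[j] = (pi_j, mu_j, Sigma_j)
    return params
```

Setting all weights to 1 recovers the ordinary empirical prior, mean, and covariance for each class.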
(c) Support vector machines.
Hint: Can you incorporate the weights w_i into the objective function for soft-margin SVM?
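Not part of the original question, but one natural reading of the hint is to scale each point's slack (hinge-loss) term by its weight, giving the objective (1/2)||w||^2 + C Σ_i w_i max(0, 1 - y_i(w·x_i + b)). A minimal sketch evaluating that weighted objective (the per-point weighting is the assumption here, not something the question states):

```python
import numpy as np

def weighted_svm_objective(w, b, X, y, weights, C=1.0):
    """Soft-margin SVM objective with per-point weights on the slack terms:

        (1/2) ||w||^2 + C * sum_i weights_i * max(0, 1 - y_i (w . x_i + b))

    A point of weight 2 is penalized for margin violations exactly as if
    it appeared twice in the training set.
    """
    margins = y * (X @ w + b)
    hinge = np.maximum(0.0, 1.0 - margins)
    return 0.5 * (w @ w) + C * (weights * hinge).sum()
```

With all weights equal to 1 this is the standard soft-margin objective; minimizing it with weighted slacks is equivalent to duplicating each point w_i times when the weights are integers.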
Give a brief explanation in each case.

Question: AdaBoost on decision stumps.
[Tables not reproduced here: (a) the training records chosen during each boosting round (Original Data; Rounds 1, 2, and 3); (b) the weights of the training records.]
We also know that the split points for the decision stump are 0.75, 0.05, and 0.3 for Rounds 1, 2, and 3, respectively. Calculate the AdaBoost training error rate and the importance of base classifiers C_1, C_2, and C_3. (Hint: See Slide 13.) (3 points)
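The tables needed to compute the actual numbers are not reproduced above, but the importance calculation the question asks for uses the standard AdaBoost formula, which can be sketched as (assuming the weighted error ε of each base classifier has already been computed from the tables):

```python
import numpy as np

def classifier_importance(epsilon):
    """AdaBoost importance of a base classifier with weighted error epsilon:

        alpha = (1/2) * ln((1 - epsilon) / epsilon)

    alpha is positive when epsilon < 0.5, zero at epsilon = 0.5, and grows
    as the classifier's weighted error shrinks.
    """
    return 0.5 * np.log((1.0 - epsilon) / epsilon)
```

For example, a base classifier with weighted error ε = 0.1 has importance (1/2) ln(9) ≈ 1.0986, while one with ε = 0.5 (no better than chance) has importance 0.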