Question: 7.4 Weighted instances. Let the training sample be $S = ((x_1, y_1), \ldots, (x_m, y_m))$. Suppose we wish to penalize differently errors made on $x_i$ versus $x_j$. To do that, we associate a non-negative importance weight $w_i$ to each point $x_i$ and define the objective function $F(\boldsymbol{\alpha}) = \sum_{i=1}^{m} w_i e^{-y_i f(x_i)}$, where $f = \sum_{t=1}^{T} \alpha_t h_t$. Show that this function is convex and differentiable, and use it to derive a boosting-type algorithm.
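A rough sketch of one way the argument can go (not necessarily the textbook's intended solution): for each $i$, the map $\boldsymbol{\alpha} \mapsto -y_i \sum_{t=1}^{T} \alpha_t h_t(x_i)$ is affine, and $u \mapsto e^u$ is convex and differentiable, so each term $e^{-y_i f(x_i)}$ is convex and differentiable in $\boldsymbol{\alpha}$; a sum of such terms scaled by non-negative weights $w_i$ is therefore convex and differentiable as well. Applying coordinate descent to $F$ then yields the same updates as AdaBoost, with the only change being that the initial distribution is $D_1(i) = w_i / \sum_{j=1}^{m} w_j$ instead of uniform. The Python sketch below illustrates that weighted variant with threshold stumps as base learners; the function names (`weighted_adaboost`, `stump`) and the choice of stump learner are illustrative assumptions, not part of the exercise.

```python
# A minimal sketch (not the book's official solution) of the boosting-type
# algorithm obtained by coordinate descent on
#     F(alpha) = sum_i w_i * exp(-y_i * f(x_i)),   f = sum_t alpha_t * h_t.
# The only change relative to standard AdaBoost is the initial distribution,
# which is proportional to the importance weights w_i rather than uniform.

import numpy as np


def stump(X, y, D):
    """Return the threshold stump (feature, threshold, sign) with the
    smallest weighted error under the distribution D, plus that error."""
    m, d = X.shape
    best = (0, 0.0, 1, np.inf)  # (feature, threshold, sign, error)
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = D[pred != y].sum()
                if err < best[3]:
                    best = (j, thr, sign, err)
    return best


def weighted_adaboost(X, y, w, T=50):
    """Coordinate-descent minimization of F; labels y in {-1, +1}, w_i >= 0."""
    w = np.asarray(w, dtype=float)
    D = w / w.sum()                      # initial distribution D_1(i) ∝ w_i
    hs, alphas = [], []
    for _ in range(T):
        j, thr, sign, eps = stump(X, y, D)
        eps = np.clip(eps, 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1 - eps) / eps)   # same step size as AdaBoost
        pred = sign * np.where(X[:, j] <= thr, 1, -1)
        D = D * np.exp(-alpha * y * pred)       # multiplicative reweighting
        D /= D.sum()
        hs.append((j, thr, sign))
        alphas.append(alpha)

    def f(Xnew):
        """Sign of the final combined score sum_t alpha_t h_t(x)."""
        score = np.zeros(Xnew.shape[0])
        for (j, thr, sign), a in zip(hs, alphas):
            score += a * sign * np.where(Xnew[:, j] <= thr, 1, -1)
        return np.sign(score)

    return f


if __name__ == "__main__":
    # Hypothetical usage check on synthetic data with random importance weights.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
    w = rng.uniform(0.1, 2.0, size=200)
    clf = weighted_adaboost(X, y, w, T=20)
    print("weighted training accuracy:", np.average(clf(X) == y, weights=w))
```

As in standard AdaBoost, the step size $\alpha_t = \frac{1}{2}\log\frac{1-\epsilon_t}{\epsilon_t}$ is the exact coordinate-descent minimizer in direction $h_t$ when the base hypotheses take values in $\{-1, +1\}$; the importance weights enter the algorithm only through the initial distribution.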
