Question:
1. This problem is to be done by hand. In the parts below, you will set up the primal Lagrangian and derive the dual Lagrangian for the support vector machine classifier, for the case of data that is linearly separable in the expanded feature space. For SVM learning, we stated the optimization problem as:

Minimize $J(\mathbf{w}) = \tfrac{1}{2}\lVert\mathbf{w}\rVert^2$
subject to $y_i(\mathbf{w}^T\mathbf{x}_i + w_0) \ge 1, \quad i = 1, 2, \ldots, N$.

Assume our data is linearly separable in the expanded feature space. Also, nonaugmented notation is used throughout this problem.

(a) If the above set of constraints (the second line of equations above) is satisfied, will all the training data be correctly classified?

(b) Write the Lagrangian function $L(\mathbf{w}, w_0, \boldsymbol{\lambda})$ for the minimization problem stated above. Use $\lambda_i$, $i = 1, 2, \ldots, N$, for the Lagrange multipliers. Also state the KKT conditions. (Hint: there are 3 KKT conditions.)

(c) Derive the dual Lagrangian $L_D$ by proceeding as follows:

(i) Minimize $L$ with respect to the weights. Hint: solve $\nabla_{\mathbf{w}} L = 0$ for the optimal weight $\mathbf{w}^*$ (in terms of $\lambda_i$ and other variables); also set $\partial L / \partial w_0 = 0$ and simplify.

(ii) Substitute your expressions from part (i) into $L$, and use your expression from $\partial L / \partial w_0 = 0$ as a new constraint, to derive $L_D$ as:

$$L_D(\boldsymbol{\lambda}) = -\tfrac{1}{2}\sum_{i=1}^{N}\sum_{j=1}^{N} \lambda_i \lambda_j\, y_i y_j\, \mathbf{x}_i^T \mathbf{x}_j + \sum_{i=1}^{N} \lambda_i$$

subject to the (new) constraint $\sum_{i=1}^{N} \lambda_i y_i = 0$, which becomes a new KKT condition. Also give the other two KKT conditions on $\lambda_i$, which carry over from the primal form.
Step by Step Solution
(a) If the set of constraints is satisfied, i.e. $y_i(\mathbf{w}^T\mathbf{x}_i + w_0) \ge 1$ for all training examples $(\mathbf{x}_i, y_i)$, where $y_i \in \{+1, -1\}$ is the class label of $\mathbf{x}_i$, then every training point satisfies $y_i\, g(\mathbf{x}_i) \ge 1 > 0$ for the discriminant $g(\mathbf{x}_i) = \mathbf{w}^T\mathbf{x}_i + w_0$, so the sign of $g(\mathbf{x}_i)$ agrees with $y_i$ for every $i$. Therefore yes: all the training data will be correctly classified, and in fact each point lies at a distance of at least $1/\lVert\mathbf{w}\rVert$ from the decision boundary.
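Parts (b) and (c) are not shown in the excerpt above. What follows is a brief sketch of the standard hard-margin SVM derivation that those parts ask for, in the notation of the question ($\lambda_i$ for the multipliers, $\mathbf{x}_i$ for the feature vectors in the expanded space). It is offered as a reference for checking work done by hand, not as the verified expert solution.

% Sketch of parts (b)-(c): hard-margin SVM Lagrangian, KKT conditions, and dual.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% (b) Primal Lagrangian, one multiplier per constraint:
\[
  L(\mathbf{w}, w_0, \boldsymbol{\lambda})
    = \tfrac{1}{2}\lVert\mathbf{w}\rVert^{2}
      - \sum_{i=1}^{N} \lambda_i
        \bigl[\, y_i(\mathbf{w}^{T}\mathbf{x}_i + w_0) - 1 \,\bigr]
\]
% KKT conditions: (1) dual feasibility       lambda_i >= 0;
%                 (2) primal feasibility     y_i (w^T x_i + w_0) - 1 >= 0;
%                 (3) complementary slackness lambda_i [ y_i (w^T x_i + w_0) - 1 ] = 0.

% (c)(i) Stationarity of L in w and in w_0:
\[
  \nabla_{\mathbf{w}} L = 0
    \;\Rightarrow\;
  \mathbf{w}^{*} = \sum_{i=1}^{N} \lambda_i y_i \mathbf{x}_i ,
  \qquad
  \frac{\partial L}{\partial w_0} = 0
    \;\Rightarrow\;
  \sum_{i=1}^{N} \lambda_i y_i = 0 .
\]

% (c)(ii) Substituting w* into L and using the w_0 condition eliminates w and w_0:
\[
  L_D(\boldsymbol{\lambda})
    = \sum_{i=1}^{N} \lambda_i
      - \tfrac{1}{2} \sum_{i=1}^{N}\sum_{j=1}^{N}
          \lambda_i \lambda_j\, y_i y_j\, \mathbf{x}_i^{T}\mathbf{x}_j ,
  \qquad
  \text{subject to } \lambda_i \ge 0 \text{ and } \sum_{i=1}^{N} \lambda_i y_i = 0 .
\]
\end{document}

The new constraint $\sum_i \lambda_i y_i = 0$ comes from the $w_0$ stationarity condition; the conditions $\lambda_i \ge 0$ and complementary slackness carry over from the primal form, as the question notes.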

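As a quick numerical sanity check of the relationships above, here is a minimal sketch in Python. It uses scikit-learn's soft-margin SVC with a very large C as a stand-in for the hard-margin problem stated in the question; the toy dataset, the variable names, and the choice of library are assumptions made for illustration and are not part of the original problem (which is to be done by hand).

# Minimal sketch: check y_i (w^T x_i + w_0) >= 1 and sum_i lambda_i y_i = 0
# on a made-up linearly separable dataset (assumed for illustration only).
import numpy as np
from sklearn.svm import SVC

# Tiny linearly separable 2-D dataset (hypothetical example data).
X = np.array([[2.0, 2.0], [3.0, 3.0], [2.5, 3.5],
              [-1.0, -1.0], [-2.0, -1.5], [-1.5, -2.5]])
y = np.array([1, 1, 1, -1, -1, -1])

# A very large C approximates the hard-margin problem for separable data.
clf = SVC(kernel="linear", C=1e6).fit(X, y)

w = clf.coef_.ravel()        # w* = sum_i lambda_i y_i x_i
w0 = clf.intercept_[0]

# Part (a): every training point should satisfy y_i (w^T x_i + w_0) >= 1.
margins = y * (X @ w + w0)
print("min margin:", margins.min())              # expected >= 1 (up to tolerance)

# New KKT condition from part (c): sum_i lambda_i y_i = 0.
# SVC stores lambda_i * y_i for the support vectors in dual_coef_.
print("sum lambda_i y_i:", clf.dual_coef_.sum())  # expected ~ 0

# The weight vector recovered from the dual solution should match coef_.
print("w from dual:", (clf.dual_coef_ @ clf.support_vectors_).ravel(), "vs", w)

A very large C is used because the problem assumes separable data and a hard margin; a hand-written quadratic program over the dual variables would match the derivation more literally, but needs considerably more code.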