Q1 (60 pts): In class, we saw that if our data is not linearly separable, then we need to modify our support vector machine algorithm by introducing an error margin that must be minimized. Specifically, the formulation we have looked at is known as the $\ell_1$ norm soft margin SVM.

In this problem we will consider an alternative method, known as the $\ell_2$ norm soft margin SVM. This new algorithm is given by the following optimization problem (notice that the slack penalties are now squared):

$$\min_{w,\,b,\,\xi} \quad \frac{1}{2}\|w\|^2 + \frac{C}{2}\sum_{i=1}^{m} \xi_i^2$$

subject to

$$y^{(i)}\left(w^T x^{(i)} + b\right) \ge 1 - \xi_i, \qquad i = 1, \dots, m.$$
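For concreteness, here is a minimal numerical sketch of this primal problem, assuming the cvxpy solver is available. The toy data, variable names, and choice of C are illustrative only and are not part of the question:

```python
import numpy as np
import cvxpy as cp

# Illustrative, linearly non-separable toy data (not from the question).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.5, size=(20, 2)),
               rng.normal(+1.0, 1.5, size=(20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])
m, n = X.shape
C = 1.0  # arbitrary illustrative value

# Primal l2 soft margin SVM: (1/2)||w||^2 + (C/2) * sum_i xi_i^2,
# with no xi >= 0 constraint, exactly as stated above.
w = cp.Variable(n)
b = cp.Variable()
xi = cp.Variable(m)

objective = cp.Minimize(0.5 * cp.sum_squares(w) + (C / 2) * cp.sum_squares(xi))
constraints = [cp.multiply(y, X @ w + b) >= 1 - xi]
cp.Problem(objective, constraints).solve()

print("w* =", w.value, " b* =", b.value)
# The slacks come out non-negative (up to solver tolerance), cf. part (a).
print("min xi_i at optimum:", xi.value.min())
```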
Questions:
(a) Notice that we have dropped the $\xi_i \ge 0$ constraint in the $\ell_2$ problem. Show that these non-negativity constraints can be removed. That is, show that the optimal value of the objective will be the same whether or not these constraints are present. [10 points]

(b) What is the Lagrangian of the $\ell_2$ soft margin SVM optimization problem? [10 points]
(c) Minimize the Lagrangian with respect to $w$, $b$, and $\xi$ by taking the following gradients: $\nabla_w \mathcal{L}$, $\frac{\partial \mathcal{L}}{\partial b}$, and $\nabla_\xi \mathcal{L}$, and then setting them equal to 0. Here, $\xi = [\xi_1, \xi_2, \dots, \xi_m]^T$. [15 points]

(d) What is the dual of the $\ell_2$ soft margin SVM optimization problem? [25 points]
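As a reference sketch of one standard route through parts (a) through (c), not necessarily the intended solution: for part (a), if a feasible point has some $\xi_i < 0$, then replacing that $\xi_i$ with $0$ keeps the constraint satisfied (since $1 - 0 < 1 - \xi_i$) while strictly decreasing $\xi_i^2$, so an optimal solution never has negative slacks and the dropped constraints are redundant. With only the margin constraints remaining, introducing multipliers $\alpha_i \ge 0$ gives the Lagrangian and stationarity conditions of parts (b) and (c):

```latex
% Sketch of parts (b) and (c). Multipliers alpha_i >= 0 are attached to the
% margin constraints only, since the xi_i >= 0 constraints were dropped.
\begin{align*}
\mathcal{L}(w, b, \xi, \alpha)
  &= \frac{1}{2}\|w\|^2 + \frac{C}{2}\sum_{i=1}^{m} \xi_i^2
     - \sum_{i=1}^{m} \alpha_i \left[ y^{(i)}\big(w^T x^{(i)} + b\big) - 1 + \xi_i \right] \\
\nabla_w \mathcal{L} = 0
  &\;\Longrightarrow\; w = \sum_{i=1}^{m} \alpha_i y^{(i)} x^{(i)} \\
\frac{\partial \mathcal{L}}{\partial b} = 0
  &\;\Longrightarrow\; \sum_{i=1}^{m} \alpha_i y^{(i)} = 0 \\
\nabla_{\xi} \mathcal{L} = 0
  &\;\Longrightarrow\; C\,\xi_i - \alpha_i = 0
   \;\Longrightarrow\; \xi_i = \frac{\alpha_i}{C}, \quad i = 1, \dots, m
\end{align*}
```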

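Substituting these conditions back into $\mathcal{L}$ eliminates $w$, $b$, and $\xi$ and yields the dual asked for in part (d). As a sketch, it is the hard margin dual with a $\frac{1}{C}$ ridge added to the diagonal of the Gram matrix:

```latex
% Sketch of the part (d) dual, obtained by back-substituting
% w = sum_i alpha_i y_i x_i and xi_i = alpha_i / C.
\begin{align*}
\max_{\alpha} \quad & \sum_{i=1}^{m} \alpha_i
  - \frac{1}{2} \sum_{i=1}^{m} \sum_{j=1}^{m}
    \alpha_i \alpha_j \, y^{(i)} y^{(j)}
    \left( {x^{(i)}}^{T} x^{(j)} + \frac{1}{C}\,\delta_{ij} \right) \\
\text{subject to} \quad & \alpha_i \ge 0, \; i = 1, \dots, m,
  \qquad \sum_{i=1}^{m} \alpha_i y^{(i)} = 0
\end{align*}
% delta_ij denotes the Kronecker delta.
```

In other words, the squared slack penalty appears in the dual only as a $\frac{1}{C} I$ term added to the Gram matrix $X X^T$.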