Question: Q1 (60 pts): In class, we saw that if our data is not linearly separable, then we need to modify our support vector machine algorithm by introducing an error margin that must be minimized. Specifically, the formulation we have looked at is known as the $\ell_1$-norm soft margin SVM.

In this problem we will consider an alternative method, known as the $\ell_2$-norm soft margin SVM. This new algorithm is given by the following optimization problem (notice that the slack penalties are now squared):
$$\min_{w,\,b,\,\xi}\;\; \frac{1}{2}\|w\|^2 + \frac{C}{2}\sum_{i=1}^{m}\xi_i^2$$
subject to
$$y_i\left(w^T x_i + b\right) \ge 1 - \xi_i,\qquad i = 1,\dots,m.$$
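As a reminder of the convention that part (b) below relies on: for a generic inequality-constrained problem, the Lagrangian attaches one nonnegative multiplier per constraint. A minimal sketch in standard notation (the symbols $f$, $g_i$, $\alpha_i$ here are generic, not taken from the problem):

$$\min_x f(x)\ \ \text{s.t.}\ \ g_i(x) \le 0,\; i=1,\dots,m
\qquad\Longrightarrow\qquad
\mathcal{L}(x,\alpha) = f(x) + \sum_{i=1}^{m} \alpha_i\, g_i(x),\quad \alpha_i \ge 0.$$

Note that the SVM constraints above fit this template once rewritten as $1 - \xi_i - y_i(w^T x_i + b) \le 0$.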
Questions:
(a) Notice that we have dropped the $\xi_i \ge 0$ constraints in the $\ell_2$ problem. Show that these non-negativity constraints can be removed. That is, show that the optimal value of the objective will be the same whether or not these constraints are present. (points)
(b) What is the Lagrangian of the $\ell_2$ soft margin SVM optimization problem? (points)
(c) Minimize the Lagrangian with respect to $w$, $b$, and $\xi$ by taking the gradients $\nabla_w \mathcal{L}$, $\partial \mathcal{L}/\partial b$, and $\nabla_\xi \mathcal{L}$, and then setting them equal to zero. Here, $\xi = [\xi_1, \xi_2, \dots, \xi_m]^T$. (points)
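A consequence of part (a) is that the optimal slack is simply $\xi_i = \max(0,\, 1 - y_i(w^T x_i + b))$, so the $\ell_2$ primal collapses to an unconstrained "squared hinge" objective in $w$ and $b$. The sketch below (an illustration, not the expected derivation; the toy data and step size are assumptions) minimizes that objective with plain gradient descent in numpy:

```python
import numpy as np

def l2_soft_margin_svm(X, y, C=1.0, lr=0.01, iters=2000):
    """Gradient descent on the unconstrained l2 soft-margin objective.

    Because the xi_i >= 0 constraints can be dropped (part a), the optimal
    slack is xi_i = max(0, 1 - y_i(w^T x_i + b)), so the primal reduces to
    minimizing 0.5*||w||^2 + (C/2) * sum_i max(0, 1 - y_i(w^T x_i + b))^2.
    """
    m, n = X.shape
    w, b = np.zeros(n), 0.0
    for _ in range(iters):
        margins = 1.0 - y * (X @ w + b)   # functional-margin deficit per point
        xi = np.maximum(0.0, margins)     # optimal slack; never negative
        # Gradient of the squared-hinge objective w.r.t. w and b.
        grad_w = w - C * (xi * y) @ X
        grad_b = -C * np.sum(xi * y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy 2-D data (hypothetical), linearly separable.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0], [0.5, -0.5]])
y = np.array([1.0, 1.0, -1.0, -1.0, -1.0])
w, b = l2_soft_margin_svm(X, y, C=1.0)
preds = np.sign(X @ w + b)
```

Note that even on separable data the $\ell_2$ optimum typically keeps some $\xi_i > 0$: the squared penalty trades a small slack for a shorter $w$.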
