Support Vector Machines
We have seen that in p = 2 dimensions, a linear decision boundary takes the form beta_0 + beta_1 X_1 + beta_2 X_2 = 0. We now investigate a nonlinear decision boundary.
(a) Sketch the curve (1 + X_1)^2 + (2 - X_2)^2 = 4.
(b) On your sketch, indicate the set of points for which (1 + X_1)^2 + (2 - X_2)^2 > 4, and the set of points for which (1 + X_1)^2 + (2 - X_2)^2 <= 4.
(c) Suppose that a classifier assigns an observation to the blue class if (1 + X_1)^2 + (2 - X_2)^2 > 4, and to the red class otherwise. To what class are the observations (0, 0), (-1, 1), (2, 2), and (3, 8) classified?
(d) Argue that while the decision boundary in (c) is not linear in terms of X_1 and X_2, it is linear in terms of X_1, X_1^2, X_2, and X_2^2.
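Assuming the boundary above is the circle (1 + X_1)^2 + (2 - X_2)^2 = 4 (the form used in the corresponding ISLP exercise), the classifications in part (c) can be checked numerically. This is an illustrative sketch, not the worked answer:

```python
# Classify points for part (c), assuming the decision boundary
# (1 + x1)^2 + (2 - x2)^2 = 4 from the ISLP exercise.
def f(x1, x2):
    return (1 + x1) ** 2 + (2 - x2) ** 2

def classify(x1, x2):
    # Blue if the expression exceeds 4, red otherwise.
    return "blue" if f(x1, x2) > 4 else "red"

for point in [(0, 0), (-1, 1), (2, 2), (3, 8)]:
    print(point, classify(*point))
```

Note that f is a linear function of the expanded features X_1, X_1^2, X_2, and X_2^2, which is the observation part (d) asks you to make precise.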
Provide and explain one case where you would prefer the one-versus-one (OvO) approach over one-versus-all (OvA) for multiclass classification. Then provide and explain one example where you would prefer one-versus-all.
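One concrete trade-off to keep in mind when answering: for K classes, OvO fits one binary classifier per pair of classes, while OvA fits one per class. A minimal sketch of the classifier counts:

```python
# Number of binary classifiers each multiclass scheme trains for K classes.
def n_ovo(k):
    return k * (k - 1) // 2  # one classifier per pair of classes

def n_ova(k):
    return k  # one classifier per class vs. the rest

for k in (3, 10, 100):
    print(k, n_ovo(k), n_ova(k))
```

OvO trains many more models as K grows, but each one sees only the data from two classes, so the individual fits are smaller and less prone to class imbalance; OvA trains fewer models, each on the full dataset.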
Consider the following optimization problem for the support vector classifier, given in the ISLP book. This classifier can be used to learn a linear decision boundary between two classes.

maximize over beta_0, beta_1, ..., beta_p, eps_1, ..., eps_n, M:  M
subject to  sum_{j=1}^p beta_j^2 = 1,
            y_i (beta_0 + beta_1 x_{i1} + ... + beta_p x_{ip}) >= M (1 - eps_i)  for all i = 1, ..., n,
            eps_i >= 0,  sum_{i=1}^n eps_i <= C.
(a) Explain how the slack variables eps_i and the budget C are related.
(b) Explain why the following sentence is true or false: "As the value of C increases, the bias of the classifier decreases and the variance increases."
(c) Write down the optimization problem if we want to change the linear decision boundary to a quadratic decision boundary.
(d) How would you change the support vector classifier so that it can fit an arbitrary nonlinear decision boundary? Hint: read the corresponding section of ISLP.
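The hint in part (d) points toward the kernel trick: the classifier can be rewritten so that observations enter only through inner products, which can then be replaced by a kernel function K(x, z). A minimal sketch of one common choice, the radial (RBF) kernel, with gamma as an illustrative tuning parameter:

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    # K(x, z) = exp(-gamma * ||x - z||^2): close to 1 when x and z are
    # near each other, near 0 when they are far apart, so nearby
    # support vectors dominate the decision function.
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

print(rbf_kernel((1, 2), (1, 2)))  # identical points -> 1.0
print(rbf_kernel((0, 0), (3, 4), gamma=0.1))
```

Because the implied feature space of this kernel is infinite-dimensional, the resulting decision boundary in the original feature space can be arbitrarily nonlinear.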
