Question: Consider building an SVM for the following two-class training data.

Positive class: [-1 3]^T, [0 2]^T, [0 1]^T, [0 0]^T
Negative class: [1 5]^T, [1 6]^T, [3 3]^T

(a) Plot the training points and, by inspection, draw a linear classifier that separates the data with maximum margin.

(b) This linear SVM is parameterized by h(x) = w^T x + b. Write the parameters w and b.

(c) Suppose you observe an additional set of points, all from the positive class:

More positive points: [-2 0]^T, [-2 1]^T, [-1 0]^T, [-1 1]^T, [0 0]^T

What is the linear SVM (in terms of w and b) now?
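One way to sanity-check a by-inspection answer: the closest opposing points lie on the parallel lines x1 + x2 = 2 (positives (-1,3) and (0,2)) and x1 + x2 = 6 (negatives (1,5) and (3,3)), which suggests the boundary x1 + x2 = 4 and, after scaling so the support vectors satisfy y*h(x) = 1 exactly, the candidate parameters w = [-1/2, -1/2]^T and b = 2. This is a sketch under that assumption, not necessarily the exam's intended scaling; a minimal numpy check of the margin constraints:

```python
import numpy as np

# Training data from the question
X_pos = np.array([[-1, 3], [0, 2], [0, 1], [0, 0]], dtype=float)
X_neg = np.array([[1, 5], [1, 6], [3, 3]], dtype=float)

# Candidate parameters read off the plot (assumed, see lead-in):
# boundary x1 + x2 = 4, scaled so support vectors give y*h(x) = 1.
w = np.array([-0.5, -0.5])
b = 2.0

h_pos = X_pos @ w + b   # every positive point should give >= +1
h_neg = X_neg @ w + b   # every negative point should give <= -1
print(h_pos)            # values: 1, 1, 1.5, 2
print(h_neg)            # values: -1, -1.5, -1

# Part (c): the extra positive points
X_more = np.array([[-2, 0], [-2, 1], [-1, 0], [-1, 1], [0, 0]], dtype=float)
print(X_more @ w + b)   # all >= 1, so no constraint is violated
```

With this scaling ||w|| = 1/sqrt(2), so the geometric margin is 1/||w|| = sqrt(2) on each side. Since the additional positive points all satisfy w^T x + b >= 1, the support vectors and hence the SVM (w and b) are unchanged in part (c).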
