Question: 1. Support Vector Regression

Support vector regression (SVR) is a method for regression analogous to the support vector classifier. Let $(x_i, y_i) \in \mathbb{R}^d \times \mathbb{R}$, $i = 1, \ldots, n$ be training data for a regression problem. In the case of linear regression, SVR solves

$$\min_{w, b, \xi, \xi'} \quad \frac{1}{2}\|w\|_2^2 + C \sum_{i=1}^n (\xi_i + \xi_i')$$
$$\text{s.t.} \quad w^\top x_i + b - y_i \le \epsilon + \xi_i, \quad y_i - w^\top x_i - b \le \epsilon + \xi_i', \quad \xi_i \ge 0, \quad \xi_i' \ge 0, \quad i = 1, \ldots, n,$$

where $C > 0$, $\epsilon > 0$ are fixed, and $\|\cdot\|_2$ is the Euclidean norm.

a. (5 pts) Show that for an appropriate choice of $\lambda$, SVR solves
$$\min_{w,b} \quad \sum_{i=1}^n \ell_\epsilon(y_i, w^\top x_i + b) + \lambda \|w\|_2^2,$$
where $\ell_\epsilon(y, t) = \max\{0, |y - t| - \epsilon\}$ is the so-called $\epsilon$-insensitive loss, which does not penalize prediction errors below a level of $\epsilon$. Note: This part does not play a role in the subsequent parts.

b. (5 pts) The optimization problem is convex with affine constraints, and therefore strong duality holds. Use the KKT conditions to derive the dual optimization problem in a manner analogous to the support vector classifier. As in the SVC, you should eliminate the dual variables corresponding to the constraints $\xi_i \ge 0$, $\xi_i' \ge 0$.

c. (3 pts) Explain how to kernelize SVR. Be sure to explain how to determine $b^*$ and evaluate the final prediction function.

d. (2 pts) Argue that the final predictor will only depend on a subset of training examples, and characterize those training examples. Your characterization should be analogous to the characterization of support vectors being "on the wrong side of the margin" in support vector classifiers.
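To build intuition for part a, the $\epsilon$-insensitive loss can be computed directly; the following is a minimal sketch (the function name `eps_insensitive_loss` and the example values are illustrative, not part of the problem statement):

```python
import numpy as np

def eps_insensitive_loss(y, t, eps):
    """epsilon-insensitive loss l_eps(y, t) = max{0, |y - t| - eps}.

    Errors with |y - t| <= eps incur zero loss; larger errors are
    penalized linearly, shifted down by eps.
    """
    return np.maximum(0.0, np.abs(y - t) - eps)

# Illustrative values: the first prediction is within the eps-tube,
# so it contributes zero loss; the other two are penalized.
y = np.array([1.0, 2.0, 3.0])   # targets
t = np.array([1.05, 2.5, 1.0])  # predictions
print(eps_insensitive_loss(y, t, eps=0.1))  # -> [0.   0.4  1.9]
```

Note the connection to the slack variables: at the optimum of the constrained problem, $\xi_i + \xi_i'$ equals exactly $\ell_\epsilon(y_i, w^\top x_i + b)$, which is the key step in part a.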
