Question:

For Questions 2.3 and 2.4 below, let's focus on a particular convex optimization problem, widely known as the soft-margin SVM. (We will discuss more on its geometric intuition later, but for now, let us just focus on the optimization part. Indeed, many ML algorithms can be formulated as an optimization problem, where you can simply apply standard tools in optimization to solve it!)

$$
\begin{aligned}
\min_{w \in \mathbb{R}^d,\ \xi \in \mathbb{R}^n} \quad & \|w\|^2 + \sum_{i=1}^n \xi_i \\
\text{s.t.} \quad & y_i w^\top x_i \ge 1 - \xi_i, \quad i = 1, \dots, n, \\
& \xi_i \ge 0, \quad i = 1, \dots, n,
\end{aligned} \tag{3}
$$

where $y_i \in \{-1, 1\}$ and $x_i \in \mathbb{R}^d$ for $i = 1, \dots, n$ are given to us (i.e., we do not need to ask optimization algorithms to search for these parameters), and $\|w\| = \sqrt{w^\top w}$ is the Euclidean norm of a vector.

Question 2.3 (5 points): In (3), the objective function $\|w\|^2 + \sum_{i=1}^n \xi_i$ is a function of $w$ and $\xi$. Show that it is a convex function with respect to $w$ and $\xi$. Hint: first show that $\|w\|^2$ and $\sum_{i=1}^n \xi_i$ are each convex, then use the property that the sum of convex functions is convex. Note that there are $n$ constraints of the form $y_i w^\top x_i \ge 1 - \xi_i$ (one for each $i = 1, \dots, n$, corresponding to a data point), and another set of $n$ constraints of the form $\xi_i \ge 0$ (which simply means that the elements of $\xi$ are nonnegative). The optimization (3) is convex since the objective function is convex and all of the constraints are affine.

Question 2.4 (5 points): Write out the Lagrangian function for the optimization problem (3). Pay attention to the fact that the constraints in (3) involve the variables $w$ and $\xi$ on both sides, so you need to first reformulate them into the form $g_i(\cdot) \le 0$ in order to match the standard optimization (1) and use the standard formula of (2). Also note that the optimization variables are $w \in \mathbb{R}^d$ and $\xi \in \mathbb{R}^n$, while $x_i$ and $y_i$ should be treated as known constants.
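As an informal sanity check for Question 2.3 (illustrative only, not a substitute for the requested proof), the sketch below tests Jensen's inequality $f(t a + (1-t) b) \le t f(a) + (1-t) f(b)$ for the objective $f(w, \xi) = \|w\|^2 + \sum_i \xi_i$ on random pairs of points. The dimensions $d$ and $n$ and the random seed are arbitrary assumptions, not taken from the question.

```python
# Numerical spot-check (not a proof) that the objective of problem (3)
# satisfies the convexity inequality on random pairs of points.
import numpy as np

rng = np.random.default_rng(0)
d, n = 3, 5  # assumed dimensions: w in R^d, xi in R^n

def f(w, xi):
    """Objective of problem (3): ||w||^2 + sum_i xi_i."""
    return w @ w + xi.sum()

ok = True
for _ in range(1000):
    w1, w2 = rng.normal(size=d), rng.normal(size=d)
    xi1, xi2 = rng.normal(size=n), rng.normal(size=n)
    t = rng.uniform()
    # Convexity: f at the convex combination must not exceed the
    # convex combination of the function values (up to float error).
    lhs = f(t * w1 + (1 - t) * w2, t * xi1 + (1 - t) * xi2)
    rhs = t * f(w1, xi1) + (1 - t) * f(w2, xi2)
    ok = ok and (lhs <= rhs + 1e-9)

print("Jensen inequality held on all random pairs:", ok)  # prints True
```

Passing this check for many random pairs is consistent with convexity, but the actual argument should follow the hint: $\|w\|^2$ is convex (its Hessian $2I$ is positive semidefinite), $\sum_i \xi_i$ is linear and hence convex, and a sum of convex functions is convex.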
