Question 1 (Subgradient, 25 points total). Consider the following regularized least-squares problem:
\[
\min_{x} \; f(x) = \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \|x\|_2 ,
\]
where $A \in \mathbb{R}^{m \times n}$ and $\lambda > 0$. Note that this is not ridge regression, as the regularizer is not squared!

(a) (10 points) Suppose that $A$ is a column-wise orthogonal matrix (i.e., $A^\top A = I$). Use the first-order optimality condition for non-smooth functions to compute an analytical solution.

(b) (15 points) For a general $A$, design a subgradient algorithm to solve the optimization problem with initialization $x_0 = 0$. That is, choose a proper subgradient direction and a suitable step size such that the proposed method converges to the optimal solution at a sublinear rate $O(1/\sqrt{k})$. (Hint: Use the convergence result for the subgradient method presented in class, and use similar ideas from the previous HW to bound $\|x^\star\|_2$ by $R$, where $x^\star$ denotes the optimal solution. You can assume that we are working on a bounded domain within radius $R$ from the origin to compute the Lipschitz constant.)
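A worked sketch of part (a), using the standard subdifferential of the Euclidean norm (this derivation is illustrative; check it against the convention used in class):

\[
\partial \|x\|_2 =
\begin{cases}
\{\, x/\|x\|_2 \,\}, & x \neq 0,\\
\{\, u : \|u\|_2 \le 1 \,\}, & x = 0.
\end{cases}
\]
The first-order optimality condition $0 \in A^\top(Ax - b) + \lambda\,\partial\|x\|_2$ with $A^\top A = I$ becomes $0 \in x - A^\top b + \lambda\,\partial\|x\|_2$. If $\|A^\top b\|_2 \le \lambda$, then $x = 0$ is optimal (take $u = A^\top b/\lambda$, which has norm at most $1$). Otherwise the minimizer must be a positive multiple of $A^\top b$, and matching norms gives the block soft-thresholding formula
\[
x^\star = \left(1 - \frac{\lambda}{\|A^\top b\|_2}\right)_{\!+} A^\top b .
\]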

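For part (b), a minimal numerical sketch of the kind of scheme the problem asks for, assuming the classical diminishing step size $\alpha_k = R/(G\sqrt{k})$ from the subgradient-method convergence result; the function name and the crude bound used for the Lipschitz constant $G$ are illustrative choices, not part of the problem statement:

```python
import numpy as np

def subgradient_method(A, b, lam, num_iters=2000, R=10.0):
    """Subgradient method for f(x) = 0.5*||Ax - b||_2^2 + lam*||x||_2.

    Uses the step size alpha_k = R / (G * sqrt(k)), for which the classical
    analysis gives an O(1/sqrt(k)) suboptimality rate for the best iterate.
    R bounds ||x*||_2 and G is a Lipschitz constant of f on the ball of
    radius R (both assumptions taken from the problem's hint).
    """
    n = A.shape[1]
    x = np.zeros(n)  # initialization x0 = 0, as the problem specifies
    best_x = x.copy()
    best_f = 0.5 * np.linalg.norm(b) ** 2  # f(0)
    # Crude Lipschitz bound on {||x|| <= R}:
    # ||A^T(Ax - b)|| <= ||A||(||A|| R + ||b||), plus lam from the norm term.
    opA = np.linalg.norm(A, 2)
    G = opA * (opA * R + np.linalg.norm(b)) + lam
    for k in range(1, num_iters + 1):
        # A subgradient of f at x: A^T(Ax - b) + lam * x/||x||_2 for x != 0;
        # at x = 0 any vector in lam*{u : ||u||_2 <= 1} is valid, so pick 0.
        g = A.T @ (A @ x - b)
        nx = np.linalg.norm(x)
        if nx > 0:
            g = g + lam * x / nx
        x = x - (R / (G * np.sqrt(k))) * g
        f = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.linalg.norm(x)
        if f < best_f:  # track the best iterate, which the theory bounds
            best_f, best_x = f, x.copy()
    return best_x, best_f
```

As a sanity check against part (a): with $A = I$, $\lambda = 1$, and $b = (3, 4, 0)$, the closed form gives $x^\star = (2.4, 3.2, 0)$ and $f(x^\star) = 4.5$, which the iterates should approach.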