Question: Consider the following convex optimization problem over matrices:
$$\min_{X \in C} \; F(X) = f(X) + \lambda \|X\|_*,$$
where $C = \{X \in \mathbb{R}^{n \times d} \mid \|X\|_F \le M\}$, $\lambda > 0$ is a regularization parameter, and $\|X\|_*$ denotes the nuclear (trace) norm of the matrix $X$, which is the sum, or equivalently the $\ell_1$ norm, of the singular values of $X$.

(a) Show that the projection onto the set $C$ is
$$\Pi_C(X) = \min\left\{1, \frac{M}{\|X\|_F}\right\} X.$$

(b) What is the subdifferential set $\partial F(X)$ of the objective function? You might need to read [AW].

(c) Consider the projected subgradient descent algorithm for solving the above optimization problem, which iteratively updates the initial solution $X_1 = 0$ by $X_{t+1} = \Pi_C(X_t - \eta_t G_t)$, where $G_t \in \partial F(X_t)$ (gradient of $f$ plus a subgradient of the trace norm from part (b)). Show that the convergence rate after $T$ iterations can be bounded by
$$\mathbb{E}\big[F(\bar{X}_T)\big] - F(X_*) \le \frac{\|X_*\|_F^2 + G^2 \sum_{t=1}^{T} \eta_t^2}{2 \sum_{t=1}^{T} \eta_t},$$
where $\bar{X}_T$ denotes the averaged iterate, $X_*$ is an optimal solution, and $G$ is an upper bound on the gradient of $f(X)$, i.e., $\|\nabla f(X)\|_2 \le G$.

(d) Decide an optimal value for the learning rate and simplify the convergence rate.
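As an illustration of the quantities in parts (a)–(c), here is a minimal NumPy sketch (not part of the original question): it implements the Frobenius-ball projection from part (a), uses $UV^\top$ from a reduced SVD as one valid nuclear-norm subgradient for part (b), and runs the projected subgradient update from part (c) on a hypothetical least-squares loss. The matrices `A`, `B` and all parameter values in the usage example are made-up placeholders.

```python
import numpy as np

def project_frobenius_ball(X, M):
    """Projection onto C = {X : ||X||_F <= M} (part (a)): rescale X if it lies outside the ball."""
    norm = np.linalg.norm(X, 'fro')
    return X if norm <= M else (M / norm) * X

def nuclear_norm_subgradient(X):
    """One valid subgradient of ||X||_* at X: U V^T from the reduced SVD (part (b))."""
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ Vt

def projected_subgradient(grad_f, lam, M, shape, T, etas):
    """Projected subgradient descent (part (c)):
    X_{t+1} = Pi_C(X_t - eta_t * G_t), with G_t = grad f(X_t) + lam * (nuclear-norm subgradient),
    starting from X_1 = 0 and returning the averaged iterate used in the bound."""
    X = np.zeros(shape)
    avg = np.zeros(shape)
    for t in range(T):
        G = grad_f(X) + lam * nuclear_norm_subgradient(X)
        X = project_frobenius_ball(X - etas[t] * G, M)
        avg += X
    return avg / T

# Hypothetical example: f(X) = 0.5 * ||A X - B||_F^2, so grad f(X) = A^T (A X - B).
rng = np.random.default_rng(0)
A, B = rng.standard_normal((20, 10)), rng.standard_normal((20, 5))
grad_f = lambda X: A.T @ (A @ X - B)
T = 500
etas = 0.01 / np.sqrt(np.arange(1, T + 1))   # decaying step sizes (one common choice, not the optimum from (d))
X_bar = projected_subgradient(grad_f, lam=0.1, M=5.0, shape=(10, 5), T=T, etas=etas)
print("nuclear norm of averaged iterate:", np.linalg.svd(X_bar, compute_uv=False).sum())
```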
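For part (d), a standard observation (not part of the original question text): with a constant step size $\eta_t = \eta$, the bound of part (c) becomes
$$\mathbb{E}\big[F(\bar{X}_T)\big] - F(X_*) \le \frac{\|X_*\|_F^2 + G^2 T \eta^2}{2 T \eta},$$
which is minimized at $\eta = \|X_*\|_F / (G\sqrt{T})$ and then simplifies to the familiar $O(1/\sqrt{T})$ rate $\|X_*\|_F\, G / \sqrt{T}$. Since $X_* \in C$ implies $\|X_*\|_F \le M$, the conservative choice $\eta = M/(G\sqrt{T})$ can be used when $\|X_*\|_F$ is unknown.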
