Question: Consider the more general version of the minimum-norm problem, also referred to as the linearly constrained least squares problem (or just constrained least squares problem):

minimize ‖Ax − b‖²
subject to Cx = d

Here x, the variable to be found, is an n-vector. The problem data (which are given) are the m × n matrix A, the m-vector b, the p × n matrix C, and the p-vector d; Cx = d represents p linear equality constraints.

(a) Derive the optimality conditions using Lagrange multipliers and write them in matrix-vector form.

(b) Show that the optimal point is a unique minimum. (See the relevant section in Boyd's book.)

(c) Generate a random m × n matrix A and a random p × n matrix C. Then generate random vectors b and d of appropriate dimensions for the constrained least squares problem, and compute the solution x by forming and solving the KKT equations using QR factorization, in MATLAB or Python.

(d) Use MATLAB's fmincon or Python's scipy.optimize.minimize to verify your solution. See the documentation on how to use these functions.

(e) Approximate solutions using the penalty method: form a new augmented objective function ‖Ax − b‖² + ρ‖Cx − d‖² and solve the problem as an unconstrained optimization problem for each value of ρ, every time using the solution from the previous ρ as the initial guess x for the next run, with (i) the gradient descent algorithm and (ii) Newton's method. You can use a fixed step size or backtracking line search, and compare with the results obtained in the earlier parts. You will need to use some seeding method so that the same random numbers are used for the matrices when comparing results, or just pick some fixed numbers for the matrices. The gradients and Hessians have to be calculated for the augmented cost function.
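For parts (a) and (b), a sketch of the Lagrange-multiplier setup, following the standard treatment of equality-constrained least squares (e.g., the constrained least squares chapter of Boyd and Vandenberghe's Introduction to Applied Linear Algebra); the multiplier vector λ ∈ Rᵖ is introduced here for illustration:

```latex
% Lagrangian (sketch)
L(x,\lambda) = \|Ax - b\|^2 + \lambda^T (Cx - d)

% Stationarity and primal feasibility:
\nabla_x L = 2A^T(Ax - b) + C^T\lambda = 0, \qquad Cx = d

% Stacked into one linear system (the KKT equations):
\begin{bmatrix} 2A^T A & C^T \\ C & 0 \end{bmatrix}
\begin{bmatrix} x \\ \lambda \end{bmatrix}
=
\begin{bmatrix} 2A^T b \\ d \end{bmatrix}
```

For part (b), the standard argument is that this KKT matrix is nonsingular exactly when C has linearly independent rows and the stacked matrix [A; C] has linearly independent columns; under those assumptions the linear system has exactly one solution, so the optimal x is unique.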
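For part (c), a minimal NumPy sketch. The dimensions m, n, p and the random seed below are placeholders chosen only for illustration, since the actual sizes were lost from the problem statement:

```python
import numpy as np

# Illustrative dimensions and a fixed seed so the same data can be reused later.
rng = np.random.default_rng(0)
m, n, p = 20, 10, 4
A = rng.standard_normal((m, n))
C = rng.standard_normal((p, n))
b = rng.standard_normal(m)
d = rng.standard_normal(p)

# KKT system:  [2A^T A  C^T] [x]      [2A^T b]
#              [  C      0 ] [lam]  = [  d   ]
K = np.block([[2 * A.T @ A, C.T],
              [C, np.zeros((p, p))]])
rhs = np.concatenate([2 * A.T @ b, d])

# Solve K y = rhs via QR factorization: K = QR, then R y = Q^T rhs.
Q, R = np.linalg.qr(K)
y = np.linalg.solve(R, Q.T @ rhs)   # R is upper triangular; a triangular solve also works
x_kkt, lam = y[:n], y[n:]           # lam is the Lagrange multiplier vector

print("constraint residual ||Cx - d|| =", np.linalg.norm(C @ x_kkt - d))
```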
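For part (d), a sketch of the check with scipy.optimize.minimize (the SLSQP method handles linear equality constraints), continuing with the A, b, C, d, n, and x_kkt defined in the previous sketch:

```python
from scipy.optimize import minimize

def obj(x):
    r = A @ x - b
    return r @ r                      # ||Ax - b||^2

def obj_grad(x):
    return 2 * A.T @ (A @ x - b)

res = minimize(obj, x0=np.zeros(n), jac=obj_grad, method="SLSQP",
               constraints=[{"type": "eq",
                             "fun": lambda x: C @ x - d,
                             "jac": lambda x: C}])

print("max |x_slsqp - x_kkt| =", np.max(np.abs(res.x - x_kkt)))
```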
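For part (e), a sketch of the penalty method with warm starting, again reusing the data from the part (c) sketch. The penalty schedule, iteration counts, and the 1/L fixed step size are illustrative choices, not part of the original exercise; since the augmented objective is quadratic, Newton's method converges essentially in one step per ρ, while gradient descent slows down as ρ grows and the Hessian becomes ill conditioned:

```python
# Augmented objective  f_rho(x) = ||Ax - b||^2 + rho * ||Cx - d||^2
# gradient:  2 A^T (Ax - b) + 2 rho C^T (Cx - d)
# Hessian:   2 A^T A + 2 rho C^T C        (constant in x, since f_rho is quadratic)

def grad(x, rho):
    return 2 * A.T @ (A @ x - b) + 2 * rho * C.T @ (C @ x - d)

def hess(rho):
    return 2 * A.T @ A + 2 * rho * C.T @ C

def gradient_descent(x, rho, iters=20000):
    step = 1.0 / np.linalg.norm(hess(rho), 2)   # fixed 1/L step; backtracking is an alternative
    for _ in range(iters):
        x = x - step * grad(x, rho)
    return x

def newton(x, rho, iters=5):
    H = hess(rho)                               # Hessian is constant for a given rho
    for _ in range(iters):
        x = x - np.linalg.solve(H, grad(x, rho))
    return x

x_gd = np.zeros(n)
x_nt = np.zeros(n)
for rho in [1, 10, 100, 1_000, 10_000]:         # illustrative penalty schedule
    x_gd = gradient_descent(x_gd, rho)          # warm start from the previous rho's solution
    x_nt = newton(x_nt, rho)

print("||x_gd - x_kkt|| =", np.linalg.norm(x_gd - x_kkt))
print("||x_nt - x_kkt|| =", np.linalg.norm(x_nt - x_kkt))
```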
