
Nonlinear optimization in MATLAB: gradient method with exact line search problem

For the functions f_1(x) = (x_1 - 7)^2 + (x_2 - 2)^2 and f_2(x) = 4x_1^2 + x_2^2 - 2x_1x_2

a) Explain why the gradient method with exact line search converges in one iteration for f_1(x) but in about 50 iterations for f_2(x).
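A sketch of the reasoning, based on the standard rate bound for the gradient method with exact line search applied to a strictly convex quadratic f(x) = x^T A x + 2 b^T x + c (kappa denotes the condition number of A):

```latex
f(x_{k+1}) - f^{*} \;\le\; \left(\frac{\kappa(A) - 1}{\kappa(A) + 1}\right)^{\!2} \left(f(x_k) - f^{*}\right),
\qquad \kappa(A) = \frac{\lambda_{\max}(A)}{\lambda_{\min}(A)} .
```

For f_1 the matrix is the identity, so kappa = 1 and the bound vanishes: a single exact line search reaches the minimizer. For f_2 the quadratic-form matrix [4, -1; -1, 1] has eigenvalues (5 ± sqrt(13))/2, so kappa ≈ 6.2 and each iteration shrinks the optimality gap by a factor of roughly 0.52 at worst, which is why reaching a tight tolerance takes on the order of 50 iterations.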

In this case (for f_1), A=[1,0; 0,1], b=[-7;-2] (a column vector, as the code expects), c=53, x0=[9;4].
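The post only spells out the quadratic-form data for f_1. For f_2, writing f_2(x) = x^T A x + 2 b^T x + c and matching coefficients gives the data below; the tolerance 1e-5 in the commented call is an assumption, since the exercise does not state an epsilon.

```matlab
% Quadratic-form data for f_2(x) = 4*x1^2 + x2^2 - 2*x1*x2 = x'*A2*x
A2 = [4 -1; -1 1];   % symmetric: the two off-diagonal -1 entries share the -2*x1*x2 term
b2 = [0; 0];         % f_2 has no linear part
c2 = 0;              % and no constant part
% With these data the method would be called as (epsilon = 1e-5 assumed):
% [X, x, fun_val] = gradient_method_quadratic(A2, b2, [9; 4], 1e-5, c2);
```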

This is the MATLAB code for the gradient method with exact line search:

function [X,x,fun_val] = gradient_method_quadratic(A,b,x0,epsilon,c)
% INPUT
% ======================
% A ....... the positive definite matrix associated with the objective function
% b ....... a column vector associated with the linear part of the objective function
% x0 ...... starting point of the method
% epsilon . tolerance parameter
% c ....... constant term of the objective function
% OUTPUT
% =======================
% X ....... matrix whose columns are the iterates
% x ....... an optimal solution (up to a tolerance) of min(x'*A*x + 2*b'*x + c)
% fun_val . the optimal function value up to a tolerance

x = x0;
iter = 1;
grad = 2*(A*x + b);
fun_val(iter) = x'*A*x + 2*b'*x + c;
X(:,iter) = x;                               % record the starting point
while (norm(grad) > epsilon)
    iter = iter + 1;
    t = norm(grad)^2/(2*grad'*A*grad);       % exact line search step size
    x = x - t*grad;
    X(:,iter) = x;
    grad = 2*(A*x + b);
    fun_val(iter) = x'*A*x + 2*b'*x + c;
    fprintf('iter_number = %3d norm_grad = %2.6f fun_val = %2.6f\n', ...
            iter, norm(grad), fun_val(iter));
end
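A minimal self-contained sketch (not part of the exercise code) that runs the same exact-line-search iteration on both quadratics and counts iterations; it assumes x0 = [9; 4] and epsilon = 1e-5 for both functions, since the exercise only states them once.

```matlab
As = {eye(2), [4 -1; -1 1]};    % quadratic-form matrices for f_1, f_2
bs = {[-7; -2], [0; 0]};        % corresponding linear parts
iters = zeros(1, 2);
for k = 1:2
    A = As{k}; b = bs{k};
    x = [9; 4];                 % starting point (assumed shared by both)
    grad = 2*(A*x + b);
    while norm(grad) > 1e-5     % assumed tolerance
        t = norm(grad)^2 / (2*grad'*A*grad);  % exact line search step
        x = x - t*grad;
        grad = 2*(A*x + b);
        iters(k) = iters(k) + 1;
    end
end
fprintf('f_1: %d iteration(s), f_2: %d iterations\n', iters(1), iters(2));
```

With these data f_1 finishes in a single step: A = I makes the exact step size t = 1/2, which jumps straight from [9; 4] to the minimizer [7; 2]. For f_2 the count comes out on the order of 50, in line with part (a).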
