Optimization -- Iterative Methods for Unconstrained Optimization -- Newton's Method (before steepest descent).

Let $g:\mathbb{R}^n \to \mathbb{R}^n$. One way of solving the system $g(x) = 0$ is to minimize $G(x) = \frac{1}{2}\|g(x)\|_2^2 = \frac{1}{2}\,g(x)^T g(x)$. It can easily be shown that the gradient of $G(x)$ is $(\nabla g(x))^T g(x)$ and that, when the second partials of $g_i$, $1 \le i \le n$, are neglected, the Hessian of $G(x)$ is approximately $(\nabla g(x))^T \nabla g(x)$. Thus we have the following Newton-like iteration:
$$(\nabla g(x^{(k)}))^T \nabla g(x^{(k)})\,\bigl(x^{(k+1)} - x^{(k)}\bigr) = -(\nabla g(x^{(k)}))^T g(x^{(k)}), \qquad k = 0, 1, 2, \dots$$
Assume that at some $k$, $\nabla g(x^{(k)})$ is nonsingular and $g(x^{(k)}) \ne 0$. Show that the direction from $x^{(k)}$ to $x^{(k+1)}$, i.e. $p^{(k)} = -[\nabla g(x^{(k)})]^{-1} g(x^{(k)})$, is a descent direction for $G(x)$.
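A sketch of the standard argument (using only the definitions above): since $\nabla g(x^{(k)})$ is nonsingular, the normal-equations system can be solved by canceling the factor $(\nabla g(x^{(k)}))^T$ on both sides, which gives $x^{(k+1)} - x^{(k)} = p^{(k)} = -[\nabla g(x^{(k)})]^{-1} g(x^{(k)})$. Using $\nabla G(x) = (\nabla g(x))^T g(x)$, the directional derivative of $G$ along $p^{(k)}$ is

$$\nabla G(x^{(k)})^T p^{(k)} = -\,g(x^{(k)})^T \,\nabla g(x^{(k)})\,[\nabla g(x^{(k)})]^{-1}\, g(x^{(k)}) = -\,\|g(x^{(k)})\|_2^2 \;<\; 0,$$

where the inequality is strict because $g(x^{(k)}) \ne 0$. A direction with negative directional derivative is by definition a descent direction, so $p^{(k)}$ is a descent direction for $G(x)$.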
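The identity $\nabla G(x)^T p = -\|g(x)\|_2^2$ can be sanity-checked numerically. The sketch below uses an arbitrary smooth $g:\mathbb{R}^2 \to \mathbb{R}^2$ chosen for illustration (it is not from the original problem); any $g$ with a nonsingular Jacobian at the test point would do.

```python
import numpy as np

def g(x):
    # Arbitrary example system g: R^2 -> R^2 (an illustrative assumption).
    return np.array([x[0]**2 + x[1] - 3.0,
                     x[0] + x[1]**3 - 1.0])

def jacobian(x):
    # Analytic Jacobian (the "grad g(x)" of the problem statement).
    return np.array([[2.0 * x[0], 1.0],
                     [1.0,        3.0 * x[1]**2]])

x = np.array([1.0, 2.0])          # test point where g(x) != 0
J = jacobian(x)

# Newton direction p = -J(x)^{-1} g(x)
p = -np.linalg.solve(J, g(x))

# grad G(x) = J(x)^T g(x), so the directional derivative along p
# should equal -||g(x)||_2^2, up to rounding.
grad_G = J.T @ g(x)
directional = grad_G @ p
print(directional, -np.dot(g(x), g(x)))
```

The printed values agree, confirming that the directional derivative is negative whenever $g(x) \ne 0$, i.e. that $p$ is a descent direction at this point.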
