Question: Hi, please give MATLAB code that addresses ALL aspects of Question 3 using only Newton's second-order method. I also need to determine good estimates of the starting points in order for Newton's method to work. My idea is to use gradient descent to find a good estimate of the starting points and then pass them to Newton's method to converge. I keep getting wrong figures and distances from my code. Please help.
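For context, the quantity the code minimises is the distance between one point on each superellipse,

    D(t1, t2) = sqrt( (x1(t1) - x2(t2))^2 + (y1(t1) - y2(t2))^2 ),

and the second-order update is the Newton step [t1; t2] <- [t1; t2] - H \ grad, where grad is the 2x1 gradient of D and H is its 2x2 Hessian. The gradient-descent phase only has to land near the correct local minimum; Newton's method should then converge quickly provided H is positive definite there. The full code follows.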
function [mindistance, t1min, t2min] = Distance(r1, e1, f1, a1, b1, n1, r2, e2, f2, a2, b2, n2)
% r1, r2 are the scale factors used in superellipsepoints; e1, f1, e2, f2 are
% only passed through to the drawing routine drawsuperellipses.
% Define tolerance and maximum iterations for the optimisation (reasonable defaults)
tol = 1e-8;
maxiter = 1000;
% Set initial values for t1 and t2 (rough guesses)
t1init = pi/4;   % initial angle on superellipse 1
t2init = pi/4;   % initial angle on superellipse 2
% Initial phase: gradient descent to find a good starting estimate
[t1gd, t2gd] = gradientdescent(a1, b1, n1, a2, b2, n2, tol, maxiter, t1init, t2init);
% Refinement phase: Newton's method for precise convergence
[t1min, t2min, mindistance] = findmindistance(a1, b1, n1, a2, b2, n2, tol, maxiter, t1gd, t2gd);
% Draw the superellipses and the closest points
drawsuperellipses(r1, e1, f1, a1, b1, n1, r2, e2, f2, a2, b2, n2, t1min, t2min);
end
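For reference, this is roughly how I call it while testing (all the shape numbers here are made up, not the ones from Question 3):

% made-up test shapes: unit scale, semi-axes (2, 1) and (1.5, 1), exponents 4 and 2.5
r1 = 1;  e1 = 0;  f1 = 0;  a1 = 2;    b1 = 1;  n1 = 4;
r2 = 1;  e2 = 5;  f2 = 0;  a2 = 1.5;  b2 = 1;  n2 = 2.5;
[dmin, t1min, t2min] = Distance(r1, e1, f1, a1, b1, n1, r2, e2, f2, a2, b2, n2);
fprintf('minimum distance = %.6f at t1 = %.4f, t2 = %.4f\n', dmin, t1min, t2min);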
function [t1gd, t2gd] = gradientdescent(a1, b1, n1, a2, b2, n2, tol, maxiter, t1init, t2init)
% Initialize parameters for gradient descent
t1 = t1init;
t2 = t2init;
alpha = 0.01;   % learning rate for gradient descent
for iter = 1:maxiter
    [~, grad] = distanceandgradient(t1, t2, a1, b1, n1, a2, b2, n2);
    % Update parameters using gradient descent
    t1 = t1 - alpha * grad(1);
    t2 = t2 - alpha * grad(2);
    % Check convergence
    if norm(grad) < tol
        break;
    end
end
% Return the final parameters from gradient descent
t1gd = t1;
t2gd = t2;
end
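Because I am not fully sure my analytic derivatives are right, I also wrote a small throwaway check (checkgradient is just a helper I made up, not part of the assignment) that compares the analytic gradient against central finite differences:

function checkgradient(t1, t2, a1, b1, n1, a2, b2, n2)
% Compare the analytic gradient with a central finite-difference estimate
h = 1e-6;
[~, grad] = distanceandgradient(t1, t2, a1, b1, n1, a2, b2, n2);
d1p = distanceandgradient(t1 + h, t2, a1, b1, n1, a2, b2, n2);
d1m = distanceandgradient(t1 - h, t2, a1, b1, n1, a2, b2, n2);
d2p = distanceandgradient(t1, t2 + h, a1, b1, n1, a2, b2, n2);
d2m = distanceandgradient(t1, t2 - h, a1, b1, n1, a2, b2, n2);
fdgrad = [(d1p - d1m); (d2p - d2m)] / (2 * h);
fprintf('analytic: [%g %g]   finite-diff: [%g %g]\n', grad(1), grad(2), fdgrad(1), fdgrad(2));
end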
function [t1min, t2min, mindistance] = findmindistance(a1, b1, n1, a2, b2, n2, tol, maxiter, t1init, t2init)
% Initialize parameters for Newton's method
t1 = t1init;
t2 = t2init;
iter = 0;
regularizationfactor = 1e-8;   % regularisation to prevent a singular Hessian
while iter < maxiter
    [d, grad, hessian] = distanceandgradient(t1, t2, a1, b1, n1, a2, b2, n2);
    % Add regularisation to the Hessian
    hessian = hessian + regularizationfactor * eye(2);
    % Check that the Hessian is usable
    if all(isfinite(hessian(:))) && rank(hessian) == 2 && cond(hessian) < 1e12
        % Newton step: solve the linear system hessian * delta = grad
        delta = hessian \ grad;
        % Update the parameters with the Newton step
        t1 = t1 - delta(1);
        t2 = t2 - delta(2);
    else
        warning('Hessian is singular or contains non-finite values, stopping optimization.');
        mindistance = d;   % return the last computed distance
        t1min = t1;
        t2min = t2;
        return;            % exit the function
    end
    % Check convergence
    if norm(grad) < tol
        break;
    end
    iter = iter + 1;   % increment iteration counter
end
% Final minimum distance
mindistance = d;
t1min = t1;
t2min = t2;
end

function [x, y] = superellipsepoint(t, a, b, n)
% Point on the superellipse at parameter t, using
% x = a*sign(cos t)*|cos t|^n, y = b*sign(sin t)*|sin t|^n
% (i.e. the curve |x/a|^(2/n) + |y/b|^(2/n) = 1).
x = a * sign(cos(t)) * abs(cos(t))^n;
y = b * sign(sin(t)) * abs(sin(t))^n;
end
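A quick way I check the parametrisation: every point from superellipsepoint should satisfy |x/a|^(2/n) + |y/b|^(2/n) = 1 with the convention above. If Question 3 defines its superellipses the other way round (|x/a|^n + |y/b|^n = 1, which would need |cos t|^(2/n) in the parametrisation), mixing the two conventions is one thing that could produce wrong figures and distances.

% Sanity check with made-up a = 2, b = 1, n = 4: the residual should be ~0
for t = linspace(0, 2*pi, 7)
    [x, y] = superellipsepoint(t, 2, 1, 4);
    fprintf('t = %.3f  residual = %.2e\n', t, abs(x/2)^(2/4) + abs(y/1)^(2/4) - 1);
end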
function [x, y] = superellipsepoints(r, a, b, n, t)
% Compute points on the superellipse for a vector of parameter values t,
% scaled by the factor r
[x, y] = arrayfun(@(ti) superellipsepoint(ti, a, b, n), t);
x = x * r;
y = y * r;
end
function [d, grad, hessian] = distanceandgradient(t1, t2, a1, b1, n1, a2, b2, n2)
% Calculate the distance, gradient and Hessian used by Newton's method
[x1, y1] = superellipsepoint(t1, a1, b1, n1);
[x2, y2] = superellipsepoint(t2, a2, b2, n2);

% Calculate distance
deltax = x1 - x2;
deltay = y1 - y2;
d = sqrt(deltax^2 + deltay^2);

% Regularisation to avoid division by zero
if d < 1e-10
    d = 1e-10;   % set a small distance to avoid NaN
end

% Derivatives of the curve points with respect to their parameters
dx1dt1 = -a1 * n1 * sin(t1) * abs(cos(t1))^(n1 - 1);
dy1dt1 =  b1 * n1 * cos(t1) * abs(sin(t1))^(n1 - 1);
dx2dt2 = -a2 * n2 * sin(t2) * abs(cos(t2))^(n2 - 1);
dy2dt2 =  b2 * n2 * cos(t2) * abs(sin(t2))^(n2 - 1);

% Gradient of d with respect to [t1; t2]
grad = [ (deltax * dx1dt1 + deltay * dy1dt1) / d;
        -(deltax * dx2dt2 + deltay * dy2dt2) / d ];

% Hessian (second derivatives). My analytic second-derivative expressions got
% cut off, so as a stand-in the Hessian is approximated here by central finite
% differences of the gradient (only when the caller actually asks for it).
if nargout > 2
    h = 1e-6;
    hessian = zeros(2, 2);
    [~, gp] = distanceandgradient(t1 + h, t2, a1, b1, n1, a2, b2, n2);
    [~, gm] = distanceandgradient(t1 - h, t2, a1, b1, n1, a2, b2, n2);
    hessian(:, 1) = (gp - gm) / (2 * h);
    [~, gp] = distanceandgradient(t1, t2 + h, a1, b1, n1, a2, b2, n2);
    [~, gm] = distanceandgradient(t1, t2 - h, a1, b1, n1, a2, b2, n2);
    hessian(:, 2) = (gp - gm) / (2 * h);
    hessian = (hessian + hessian') / 2;   % symmetrise
end
end
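The drawing routine drawsuperellipses got cut off above. A minimal stand-in with the same call signature is below, assuming (e1, f1) and (e2, f2) are centre offsets used only when plotting; it draws both curves and the segment between the two closest points. (I suspect part of my problem is that r, e and f never enter the distance computation in distanceandgradient, only the drawing.)

function drawsuperellipses(r1, e1, f1, a1, b1, n1, r2, e2, f2, a2, b2, n2, t1min, t2min)
% Minimal stand-in plot: both superellipses plus the two closest points.
% Assumes (e1, f1) and (e2, f2) are centre offsets applied only for plotting.
t = linspace(0, 2*pi, 400);
[x1, y1] = superellipsepoints(r1, a1, b1, n1, t);
[x2, y2] = superellipsepoints(r2, a2, b2, n2, t);
[p1x, p1y] = superellipsepoint(t1min, a1, b1, n1);
[p2x, p2y] = superellipsepoint(t2min, a2, b2, n2);
figure; hold on; axis equal; grid on;
plot(x1 + e1, y1 + f1, 'b-');
plot(x2 + e2, y2 + f2, 'r-');
plot([r1*p1x + e1, r2*p2x + e2], [r1*p1y + f1, r2*p2y + f2], 'ko-', 'MarkerFaceColor', 'k');
legend('superellipse 1', 'superellipse 2', 'closest points');
hold off;
end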