Question: function [optimal_x, optimal_y] = gradient_opti_line_search(h, int, tol, w)
function [optimal_x, optimal_y] = gradient_opti_line_search(h, int, tol, w)
% Gradient descent with a backtracking line search.
% h   - maximum step size, also used for the backward difference
% int - search interval; the descent starts at int(1)
% tol - convergence tolerance (also used as the sufficient-decrease constant)
% w   - weight vector [w(1) w(2)]; both positive, summing to 1

% Function definitions (the two original functions were lost from the
% source text; these quadratics are illustrative placeholders)
f1 = @(x) (x - 1).^2;
f2 = @(x) (x + 2).^2;

% Cost function: weighted combination of f1 and f2
costFunc = @(x, w) w(1)*f1(x) + w(2)*f2(x);

% Backward-difference approximation of the gradient wrt x
gradFunc = @(x, w) (costFunc(x, w) - costFunc(x - h, w)) / h;

% Initialise variables
x       = int(1);  % start at the beginning of the interval
alpha   = h;       % maximum step size for the line search
maxIter = 1000;    % maximum number of iterations
iter    = 0;       % iteration counter

% Gradient descent with line search
while true
    % Calculate the gradient
    grad = gradFunc(x, w);

    % Stop when the gradient is small enough or iterations run out
    if abs(grad) < tol || iter >= maxIter
        break;
    end

    % Line search: start with the maximum step size and halve it
    % until the cost decreases sufficiently (Armijo-style test)
    t = alpha;
    while costFunc(x - t*grad, w) > costFunc(x, w) - t*tol*grad^2 && t > eps
        t = t / 2;  % reduce the step size
    end

    x = x - t*grad;     % take the step
    iter = iter + 1;
end

optimal_x = x;
optimal_y = costFunc(x, w);
end
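A minimal sketch of how the function might be called; all input values below are illustrative and not from the original question:

h   = 0.1;           % maximum step size
int = [-5 5];        % search interval; descent starts at int(1)
tol = 1e-6;          % convergence tolerance
w   = [0.3 0.7];     % weights: both positive, summing to 1
[optimal_x, optimal_y] = gradient_opti_line_search(h, int, tol, w)

Halving t in the inner loop keeps every trial step a fraction of the maximum step size h, which is exactly what the question's line-search requirement asks for.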
CW: Gradient-Based Optimisation
Write a function that finds the optimal value of the given cost function. The cost function is a weighted combination of two functions, f1 and f2. Use a line search to improve the efficiency of the algorithm. Write the program so that the weightings w(1) and w(2) can be changed.
Use backward differences for the gradients. Take h as the maximum step size and use fractions of it for the line search. w is a vector of the two weightings w(1) and w(2); they should sum to 1 and both be positive.
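The backward difference estimates the derivative from the current point and one point behind it: g'(x) ~ (g(x) - g(x - h)) / h. A quick check of that formula, assuming cos(x) as a stand-in test function (the test function and values are not from the question):

g  = @(x) cos(x);                 % assumed test function
x0 = 1.0;  h = 1e-4;              % sample point and step size
bd = (g(x0) - g(x0 - h)) / h;     % backward-difference estimate
fprintf('backward diff %.6f vs analytic %.6f\n', bd, -sin(x0));

The estimate carries an O(h) bias, so using h both as the difference step and as the maximum line-search step is a deliberate simplification of the question, not a numerical requirement.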
