Question: function [optimal_x, optimal_y] = gradient_opti_line_search(h, int, tol, w)

function [optimal_x, optimal_y] = gradient_opti_line_search(h, int, tol, w)
% Component functions
f1 = @(x) x^2 + x + 1;
f2 = @(x) x + 1.5;
% Weighted cost function
costFunc = @(x, w) w(1)*f1(x) + w(2)*f2(x);
% Backward-difference approximation of the gradient, as the question requires
gradient = @(x, w) (costFunc(x, w) - costFunc(x - h, w)) / h;

% Initialise variables
x = int(1);       % Start at the beginning of the interval
alpha = h;        % Maximum step size for the line search
max_iter = 1000;  % Maximum number of iterations
iter = 0;         % Iteration counter

% Gradient descent with backtracking line search
while true
    % Approximate the gradient at the current point
    grad = gradient(x, w);

    % Line search: start at the maximum step size and halve it until the
    % step produces a sufficient decrease in the cost
    t = alpha;
    while costFunc(x - t*grad, w) > costFunc(x, w) - t*tol*grad^2
        t = t/2;      % Use a fraction of the maximum step size
        if t < eps    % Guard against the step shrinking to nothing
            break;
        end
    end

    % Take the step and count the iteration
    x = x - t*grad;
    iter = iter + 1;

    % Stop when the gradient is within tolerance or the budget is exhausted
    if abs(grad) < tol || iter >= max_iter
        break;
    end
end

optimal_x = x;
optimal_y = costFunc(x, w);
end
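For reference, a call might look like the following; the interval, tolerance, and weights here are placeholder values chosen for illustration, not part of the assignment:

% Example call with hypothetical arguments
h   = 0.1;          % maximum step size
int = [-5 5];       % search interval (the function starts at int(1))
tol = 1e-6;         % convergence tolerance
w   = [0.7 0.3];    % weightings w1 and w2, positive and summing to 1
[optimal_x, optimal_y] = gradient_opti_line_search(h, int, tol, w)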
CW1_1 Gradient Based Optimisation
Write a function that finds the optimal value of the given cost function below. The cost function is a weighted combination of the two functions f1 and f2. Use a line search to improve the efficiency of the algorithm. Write the program so that the weightings w1 and w2 can be changed.
Use backward difference for the gradients. Take h as the maximum step size and use fractions of it for the line search. w is a vector of w1 and w2; w1 and w2 should sum to 1 and both be positive.
y1 = f1(x) = x^2 + x + 1
y2 = f2(x) = x + 1.5
cost = w1*f1 + w2*f2
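As a hand check (not part of the question), the weighted cost expands to cost(x) = w1*(x^2 + x + 1) + w2*(x + 1.5), so its derivative is 2*w1*x + w1 + w2. Setting this to zero gives x* = -(w1 + w2)/(2*w1), which simplifies to x* = -1/(2*w1) when w1 + w2 = 1; the function's output should land near this value.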
function [optimal_x, optimal_y] = gradient_opti_line_search(h, int, tol, w)
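Since the question asks for backward differences, the approximation is worth checking in isolation; the x0, h, and w values below are arbitrary test inputs, and the analytic derivative is included only for comparison:

% Backward difference vs. the analytic derivative (hypothetical test values)
w  = [0.7 0.3]; x0 = 2; h = 1e-4;
cost = @(x) w(1)*(x^2 + x + 1) + w(2)*(x + 1.5);
bd   = (cost(x0) - cost(x0 - h)) / h;    % backward difference, O(h) error
anal = 2*w(1)*x0 + w(1) + w(2);          % analytic derivative for comparison
fprintf('backward = %.6f, analytic = %.6f\n', bd, anal)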
