Question:

The function f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2 is known as Himmelblau's function, a function designed for testing the performance of optimization algorithms (you can see more examples here). Writeup Problem 1 is about visualizing this function, so if you are having trouble seeing what's happening you could work on visualizing it. Himmelblau's function has multiple local minima. In this problem we will find one of those local minima.

(a) Create an anonymous function that computes f using only one input, an array with two elements: [x, y]. To make sure that your function is defined correctly, compute f(3, 4) (using the array [3, 4]) and save the answer to the variable A5.

(b) Use scipy.optimize.fmin with an initial guess of [-3, -2] to find the argmin of f (the array that minimizes f) and save it to the variable A6.

(c) Create an anonymous function (with only one input) that calculates the gradient ∇f(x, y). The gradient is given by the following formula:

∇f(x, y) = [4x^3 - 42x + 4xy + 2y^2 - 14,
            4y^3 - 26y + 4xy + 2x^2 - 22]    (2)

Recall that the gradient of f should be zero at a local minimum. Calculate ∇f(x, y) at the x and y values found in part (b) and save the result to the variable A7. Then save the 2-norm of A7 to the variable A8. In other words, if you found (x1, y1) above, save ||∇f(x1, y1)||_2 to the variable A8. The 2-norm is calculated in python via np.linalg.norm.

(d) Recall from class that we wrote the following code to perform one step of gradient descent:

p = np.array([6, 4])                             # Choose an initial guess
grad = gradf(p)                                  # Find which direction to go
phi = lambda t: p - t * grad                     # Define the path
f_of_phi = lambda t: f(phi(t))                   # Create a function of "heights along path"
tmin = scipy.optimize.fminbound(f_of_phi, 0, 1)  # Find time it takes to reach min height
p = phi(tmin)                                    # Find the point on the path and update your guess

Adapt the code so that it performs up to 2000 iterations of gradient descent and stops if the 2-norm of the vector grad is less than a tolerance of 10^-7.

(e) Now use your gradient descent code to find the argmin of the function f(x, y). Use an initial guess of (x, y) = (-3, -2). Save the final result to the variable A9 and the number of iterations to the variable A10. The initial guess does not count as an iteration.
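The one-step code from part (d), adapted as parts (d) and (e) ask, might look like the following sketch. The function and gradient definitions come from the problem statement; where exactly the stopping check sits inside the loop is an assumption, not the graded solution.

```python
import numpy as np
import scipy.optimize

# Himmelblau's function and its gradient, each taking one two-element array
f = lambda p: (p[0]**2 + p[1] - 11)**2 + (p[0] + p[1]**2 - 7)**2
gradf = lambda p: np.array([
    4*p[0]**3 - 42*p[0] + 4*p[0]*p[1] + 2*p[1]**2 - 14,
    4*p[1]**3 - 26*p[1] + 4*p[0]*p[1] + 2*p[0]**2 - 22,
])

p = np.array([-3.0, -2.0])        # initial guess from part (e)
tol = 1e-7                        # stopping tolerance on ||grad||_2
iterations = 0
for _ in range(2000):             # at most 2000 iterations
    grad = gradf(p)               # find which direction to go
    if np.linalg.norm(grad) < tol:
        break                     # gradient is numerically zero: stop
    phi = lambda t: p - t * grad  # path in the descent direction
    f_of_phi = lambda t: f(phi(t))                   # heights along the path
    tmin = scipy.optimize.fminbound(f_of_phi, 0, 1)  # best step length in [0, 1]
    p = phi(tmin)                 # move to the lowest point on the path
    iterations += 1               # the initial guess does not count

A9 = p            # approximate argmin
A10 = iterations  # number of iterations taken
```

With this initial guess the iteration settles into the local minimum near (-3.78, -3.28), one of the four minima of Himmelblau's function, all of which have f = 0.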
Step by Step Solution
(a) Create the function that computes f(x, y):

def f(x, y):
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# Compute f(3, 4)
A5 = f(3, 4)
print(A5)

(b) Create the function to find the gradient of f:

def gradient_f(x, y):
    dfdx = 2*(2*x)*(x**2 + y - 11) + ...
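The answer above cuts off partway through part (b). A minimal sketch of how parts (a)–(c) could be completed, using the variable names A5–A8 from the problem statement (the function and gradient formulas are taken from the question; this is not the site's full answer):

```python
import numpy as np
import scipy.optimize

# Part (a): anonymous function taking a single two-element array [x, y]
f = lambda p: (p[0]**2 + p[1] - 11)**2 + (p[0] + p[1]**2 - 7)**2
A5 = f([3, 4])   # (9 + 4 - 11)^2 + (3 + 16 - 7)^2 = 4 + 144 = 148

# Part (b): argmin of f starting from the initial guess [-3, -2]
A6 = scipy.optimize.fmin(f, [-3, -2])

# Part (c): gradient as an anonymous function, evaluated at the argmin
gradf = lambda p: np.array([
    4*p[0]**3 - 42*p[0] + 4*p[0]*p[1] + 2*p[1]**2 - 14,
    4*p[1]**3 - 26*p[1] + 4*p[0]*p[1] + 2*p[0]**2 - 22,
])
A7 = gradf(A6)
A8 = np.linalg.norm(A7)   # should be close to zero at a local minimum
```

Because fmin only drives the simplex down to its default tolerance, A8 comes out small but not exactly zero; that is expected.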
