Question: Please show all work. Code should be done in R.

Use gradient descent with backtracking line search to minimize

    f(x_1, x_2) = 2x_1^4 + 3x_1^3 + 2x_1^2 + x_2^2 - 4x_1 x_2.

Choose the initial value x^(1) = (1, 1), and use alpha = 0.4 and beta = 0.8 in the backtracking line search. Submit your code, and plot x_i^(t) against the number of iterations t for i = 1, 2. What is the Hessian at the minimizer?
Step by Step Solution
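A minimal sketch of the requested R solution follows. The gradient is grad f = (8x_1^3 + 9x_1^2 + 4x_1 - 4x_2, 2x_2 - 4x_1), and the backtracking rule shrinks the step size s by the factor beta until the sufficient-decrease (Armijo) condition f(x - s*g) <= f(x) - alpha*s*||g||^2 holds. The function names (`f`, `grad_f`, `hess_f`), the tolerance, and the iteration cap are choices of this sketch, not given in the problem:

```r
# Gradient descent with backtracking line search for
# f(x1, x2) = 2*x1^4 + 3*x1^3 + 2*x1^2 + x2^2 - 4*x1*x2

f <- function(x) 2*x[1]^4 + 3*x[1]^3 + 2*x[1]^2 + x[2]^2 - 4*x[1]*x[2]

grad_f <- function(x) c(8*x[1]^3 + 9*x[1]^2 + 4*x[1] - 4*x[2],
                        2*x[2] - 4*x[1])

alpha    <- 0.4       # sufficient-decrease parameter (given)
beta     <- 0.8       # step-shrink factor (given)
x        <- c(1, 1)   # initial value x^(1) (given)
tol      <- 1e-8      # stopping tolerance on the gradient norm (a choice)
max_iter <- 1000      # iteration cap (a choice)

path <- matrix(x, nrow = 1)   # store iterates for plotting

for (t in 1:max_iter) {
  g <- grad_f(x)
  if (sqrt(sum(g^2)) < tol) break
  s <- 1
  # Backtracking: shrink s until the Armijo condition holds
  while (f(x - s*g) > f(x) - alpha * s * sum(g^2)) {
    s <- beta * s
  }
  x    <- x - s*g
  path <- rbind(path, x)
}

print(x)   # final iterate (approximate minimizer)

# Plot x_i^(t) against the iteration number t for i = 1, 2
matplot(seq_len(nrow(path)), path, type = "l", lty = 1,
        xlab = "iteration t", ylab = "x_i^(t)")
legend("topright", legend = c("x1", "x2"), col = 1:2, lty = 1)

# Hessian of f, evaluated at the final iterate
hess_f <- function(x) matrix(c(24*x[1]^2 + 18*x[1] + 4, -4,
                               -4,                       2),
                             nrow = 2, byrow = TRUE)
print(hess_f(x))
```

Differentiating the gradient gives the Hessian H(x) = [[24x_1^2 + 18x_1 + 4, -4], [-4, 2]]; `hess_f` evaluates it at whichever stationary point the iterates reach from x^(1) = (1, 1). Note that f is a nonconvex quartic with more than one stationary point (setting the gradient to zero gives x_2 = 2x_1 and 8x_1^3 + 9x_1^2 - 4x_1 = 0), so the reported minimizer and Hessian are local to the path taken from the given starting point.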
