Question: Gradient Descent

Gradient Descent. Please refer to the code below, provided along with this assignment. (2 marks)
Repeat the exercise in (Q3) for the loss function given below:
L(x, y) = 3x^2 + 2y^2 + 20 cos(x) cos(y)
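For deliverable (C), differentiating this loss term by term gives the gradient (stated here for reference):

\[
\frac{\partial L}{\partial x} = 6x - 20\sin(x)\cos(y), \qquad
\frac{\partial L}{\partial y} = 4y - 20\cos(x)\sin(y)
\]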
Modify the file assign2_q3.py only in between the lines marked with comments, as described in Q3. Do not modify any other line of the script.
Deliverables:
A) A screenshot of the 3D plot obtained when you execute the Python script after adding your lines of code.
B) A screenshot of the console output obtained when you execute the Python script after adding your lines of code.
C) The equations of the loss function and of the gradient of the loss function on which gradient descent was performed.
D) A snapshot of the lines of code that were added by you in the designated places.
E) A statement of whether the algorithm converged to the global minimum, a local minimum, or failed to converge in your case.
import numpy as np
import matplotlib.pyplot as plt
import random

##### To be Updated #####
# e.g., if your BITS email id is 023ab12345@wilp.bits-pilani.com
# update the below line as student_id = "023xx12345"
student_id =
#########################
student_id = ''.join([i for i in student_id if i.isdigit()])
random.seed(student_id)

# set the number of iterations and learning rate
iters = random.randint(100, 300)
learning_rate = 0.01
# Evaluate the function at x
def C(x):
    ##### To be Updated #####
    # NOTE: return value of this function will
    # **ALSO** change for Q4 of the assignment
    #########################
    # Q3 loss: the quadratic form x^T A x - b^T x with A = [[2, 1], [1, 20]], b = [5, 3]
    return (x.T @ np.array([[2, 1], [1, 20]]) @ x) - (np.array([5, 3]).reshape(2, 1).T @ x)

# Evaluate the gradient of function at x
def dC(x):
    ##### To be Updated #####
    # 1. Compute and return the gradient
    return
    #########################
def plot_grad_change(X, Y, Z, c, grad_xs0, grad_xs1, grad_ys):
    fig = plt.figure()
    title_str = "Gradient Descent:" + "lr=" + str(learning_rate)
    plt.title(title_str)
    ax = fig.add_subplot(projection='3d')
    ax.plot_surface(X, Y, Z, cmap=plt.cm.YlGnBu_r, alpha=0.7)
    # mark every point visited by gradient descent on the loss surface
    for i in range(len(grad_xs0)):
        ax.plot([grad_xs0[i]], [grad_xs1[i]], grad_ys[i][0],
                markerfacecolor='r', markeredgecolor='r', marker='o', markersize=7)
    # label the final point with its coordinates and loss value
    ax.text(grad_xs0[-1], grad_xs1[-1], grad_ys[-1][0][0],
            "(" + str(round(grad_xs0[-1], 2)) + "," +
            str(round(grad_xs1[-1], 2)) + ")," +
            str(round(grad_ys[-1][0][0], 2)))
    plt.show()
def GD(start, x, y, z, c, dc, iters, eta):
    px = start.astype(float)
    py = c(px).astype(float)
    print("GD Start Point:", px, py)
    print("Num steps:", iters)
    grad_xs0, grad_xs1, grad_ys = [px[0][0]], [px[1][0]], [py]
    for iter in range(iters):
        ##### To be Updated #####
        # 2. Update px using gradient descent
        px =
        # 3. Update py
        py =
        #########################
        grad_xs0.append(px[0][0])
        grad_xs1.append(px[1][0])
        grad_ys.append(py)
    print("Converged Point:", px, py)
    plot_grad_change(x, y, z, c, grad_xs0, grad_xs1, grad_ys)
lo = -10
hi = 10
x1 = round(random.uniform(lo, 0), 4)
x2 = round(random.uniform(lo, 0), 4)
x = np.linspace(lo, hi, 100)  # sample grid covering the plotting domain
y = np.linspace(lo, hi, 100)
X, Y = np.meshgrid(x, y)
Z = np.zeros_like(X)
for i in range(X.shape[0]):
    for j in range(X.shape[1]):
        Z[i][j] = C(np.array([X[i][j], Y[i][j]]).reshape(2, 1))

# start Gradient Descent
GD(np.array([x1, x2]).reshape(2, 1), X, Y, Z, C, dC, iters, learning_rate)
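For illustration, one possible completion of the "To be Updated" sections is sketched below for this loss; it is a sketch following the template's conventions, not the official solution. The student_id value is the placeholder from the template comment, and C returns a (1, 1) array so the plotting and printing code keeps working unchanged.

student_id = "023xx12345"  # placeholder from the template comment, not a real ID

def C(x):
    # Q4 loss L = 3x^2 + 2y^2 + 20 cos(x) cos(y); x is a (2, 1) column
    # vector, so the result is reshaped to (1, 1) as the script expects
    return (3 * x[0]**2 + 2 * x[1]**2
            + 20 * np.cos(x[0]) * np.cos(x[1])).reshape(1, 1)

def dC(x):
    # gradient [dL/dx, dL/dy] returned as a (2, 1) column vector
    return np.array([
        [6 * x[0, 0] - 20 * np.sin(x[0, 0]) * np.cos(x[1, 0])],
        [4 * x[1, 0] - 20 * np.cos(x[0, 0]) * np.sin(x[1, 0])],
    ])

# Inside the loop of GD:
#     px = px - eta * dc(px)   # step opposite the gradient
#     py = c(px)               # re-evaluate the loss at the new point

Because the 20 cos(x) cos(y) term superimposes periodic ripples on the convex bowl 3x^2 + 2y^2, the surface has many stationary points; whether a run ends at the global minimum, a local minimum, or fails to converge depends on the seeded random start point and the 0.01 learning rate, which is what deliverable (E) asks you to report.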
