Question: Gradient Descent (2 marks). Please refer to the code below, provided along with this assignment. Repeat the exercise in (Q3) for the loss function given below:

L(x, y) = cos(x) · cos(y)

Modify the file assign_q.py only in between the lines marked with comments, as described in (Q3). Do not modify any other line of the script.
Deliverables:
(A) A screenshot of the 3D plot obtained when you execute the python script after adding your lines of code.
(B) A screenshot of the console output obtained when you execute the python script after adding your lines of code.
(C) The equations of the loss function and of the gradient of the loss function on which the gradient descent was performed (a short derivation follows this list).
(D) A snapshot of the lines of code that were added by you in the designated places.
(E) State whether the algorithm converged to the global minimum, a local minimum, or failed to converge in your case.
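
For deliverable (C), reading the loss as L(x, y) = cos(x) · cos(y), the gradient that the descent step would use works out as follows (this is a sketch based on that reading, not text from the original assignment):

∂L/∂x = -sin(x) · cos(y)
∂L/∂y = -cos(x) · sin(y)

i.e. ∇L(x, y) = ( -sin(x)·cos(y), -cos(x)·sin(y) ).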
import numpy as np
import matplotlib.pyplot as plt
import random
##### To be Updated #####
# e.g. if your BITS email id is ab@wilp.bitspilani.com
# update the below line as studentid = "xx"
studentid = "xx"
#########################
studentid = "".join([i for i in studentid if i.isdigit()])
random.seed(studentid)
# set the number of iterations and learning rate
iters = random.randint(10, 20)  # the original bounds were lost in extraction; placeholder range
learningrate = 0.1              # the original value was lost in extraction; placeholder
# Evaluate the function at x
def C(x):
    ##### To be Updated #####
    # NOTE: return value of this function will
    # ALSO change for this question of the assignment
    #########################
    # Q3 quadratic loss x.T @ A @ x - b.T @ x; the entries of A and b (and the
    # sign of the second term) were lost in extraction, so placeholders are used here
    return x.T @ np.array([[1.0, 0.0], [0.0, 1.0]]) @ x - np.array([0.0, 0.0]).reshape(2, 1).T @ x
# Evaluate the gradient of function at x
def dC(x):
    ##### To be Updated #####
    # Compute and return the gradient
    return
    #########################
def plot_grad_change(X, Y, Z, c, gradx1s, gradx2s, gradys):
    fig = plt.figure()
    ax = fig.add_subplot(projection='3d')
    titlestr = "Gradient Descent: lr = " + str(learningrate)
    plt.title(titlestr)
    # surface of the loss; the alpha/markersize values were lost in extraction, placeholders used
    ax.plot_surface(X, Y, Z, cmap=plt.cm.YlGnBu_r, alpha=0.5)
    for i in range(len(gradx1s)):
        ax.plot(float(gradx1s[i]), float(gradx2s[i]), float(gradys[i]),
                markerfacecolor='r', markeredgecolor='r', marker='o', markersize=5)
    # label the start point with its rounded coordinates
    ax.text(float(gradx1s[0]), float(gradx2s[0]), float(gradys[0]),
            str(round(float(gradx1s[0]), 2)) + ", " +
            str(round(float(gradx2s[0]), 2)) + ", " +
            str(round(float(gradys[0]), 2)))
    plt.show()
def GD(start, x, y, z, c, dc, iters, eta):
    px = start.astype(float)
    py = c(px).astype(float)
    print("GD Start Point:", px, py)
    print("Num steps:", iters)
    gradx1s, gradx2s, gradys = [px[0]], [px[1]], [py]
    for iter in range(iters):
        ##### To be Updated #####
        # Update px using gradient descent
        px = px  # placeholder; to be replaced with the descent update
        # Update py
        py = py  # placeholder; to be replaced with the loss at the new px
        #########################
        gradx1s.append(px[0])
        gradx2s.append(px[1])
        gradys.append(py)
    print("Converged Point:", px, py)
    plot_grad_change(x, y, z, c, gradx1s, gradx2s, gradys)
# the lo/hi bounds, rounding precision and grid resolution below were
# lost in extraction; placeholder values are used
lo = -5
hi = 5
x1 = round(random.uniform(lo, hi), 2)
x2 = round(random.uniform(lo, hi), 2)
x = np.linspace(lo, hi, 100)
y = np.linspace(lo, hi, 100)
X, Y = np.meshgrid(x, y)
Z = np.zeros_like(X)
for i in range(X.shape[0]):
    for j in range(X.shape[1]):
        Z[i][j] = C(np.array([X[i][j], Y[i][j]]).reshape(2, 1))
# start Gradient Descent
GD(np.array([x1, x2]).reshape(2, 1), X, Y, Z, C, dC, iters, learningrate)
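
One hedged way to complete the three "To be Updated" blocks for this question is sketched below, assuming the loss L(x, y) = cos(x) · cos(y) and keeping the script's 2x1 column-vector convention for x; the update lines for the GD loop are shown as comments because they belong inside the GD function above. This is an illustrative sketch, not the official solution.

import numpy as np  # already imported at the top of the script; repeated so the sketch is self-contained

# C(x): the loss evaluated at the 2x1 column vector x (sketch for L(x, y) = cos(x)*cos(y))
def C(x):
    return np.cos(x[0]) * np.cos(x[1])

# dC(x): the gradient of the loss at x, returned as a 2x1 column vector
def dC(x):
    return np.array([-np.sin(x[0]) * np.cos(x[1]),
                     -np.cos(x[0]) * np.sin(x[1])]).reshape(2, 1)

# Inside the GD loop, the standard gradient-descent update would read:
#     px = px - eta * dc(px)   # step against the gradient
#     py = c(px)               # re-evaluate the loss at the new point

With these updates the trajectory slides down the cosine surface; whether it reaches the global minimum of cos(x)·cos(y) (value -1) or settles near a different stationary point depends on the seed-derived start point, the step size, and the number of iterations, which is exactly what deliverable (E) asks you to report for your own run.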