Python - Machine Learning Question: please help me with questions (a) and (b), thank you!!!
Problem 5: Gradient Descent and the learning rate
a) By modifying the learning rate in the code below, show how convergence becomes slower or fails entirely. Explain in words or math why this happens. (Modify the following code and give your answers; one possible approach is sketched after the second code chunk below.)
=====================Code Chunk =======================
from numpy import *
import matplotlib.pyplot as plt  # needed for the scatter plots in run()

# y = mx + b  (m is the slope, b is the y-intercept)

def compute_error_for_line_given_points(b, m, points):
    # Mean squared error of the line y = m*x + b over all points.
    totalError = 0
    for i in range(0, len(points)):
        x = points[i, 0]
        y = points[i, 1]
        totalError += (y - (m * x + b)) ** 2
    return totalError / float(len(points))

def step_gradient(b_current, m_current, points, learningRate):
    # One gradient-descent step: compute the gradients of the error and move
    # b and m in the negative-gradient direction, scaled by learningRate.
    b_gradient = 0
    m_gradient = 0
    N = float(len(points))
    for i in range(0, len(points)):
        x = points[i, 0]
        y = points[i, 1]
        b_gradient += -(2 / N) * (y - ((m_current * x) + b_current))
        m_gradient += -(2 / N) * x * (y - ((m_current * x) + b_current))
    new_b = b_current - (learningRate * b_gradient)
    new_m = m_current - (learningRate * m_gradient)
    return [new_b, new_m]

def gradient_descent_runner(points, starting_b, starting_m, learning_rate, num_iterations):
    b = starting_b
    m = starting_m
    for i in range(num_iterations):
        b, m = step_gradient(b, m, array(points), learning_rate)
    return [b, m]

def run(num_iterations):
    points = genfromtxt("../data/data.csv", delimiter=",")
    learning_rate = 0.0001
    initial_b = 0  # initial y-intercept guess
    initial_m = 0  # initial slope guess
    print("Starting gradient descent at b = {0}, m = {1}, error = {2}".format(
        initial_b, initial_m,
        compute_error_for_line_given_points(initial_b, initial_m, points)))
    print("Running...")
    [b, m] = gradient_descent_runner(points, initial_b, initial_m, learning_rate, num_iterations)
    print("After {0} iterations b = {1}, m = {2}, error = {3}".format(
        num_iterations, b, m,
        compute_error_for_line_given_points(b, m, points)))
    # Plot the data points and, in red, the fitted line's predictions.
    for i in range(0, len(points)):
        plt.scatter(points[i, 0], points[i, 1])
        plt.scatter(points[i, 0], m * points[i, 0] + b, color='r')
run(10)
=====================================================================
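For reference, this is what the code above computes, written out explicitly: the loss is the mean squared error of the line, and each step_gradient call applies one step of gradient descent with learning rate \(\eta\):

\[
E(b, m) = \frac{1}{N}\sum_{i=1}^{N}\bigl(y_i - (m x_i + b)\bigr)^2
\]
\[
\frac{\partial E}{\partial b} = -\frac{2}{N}\sum_{i=1}^{N}\bigl(y_i - (m x_i + b)\bigr),
\qquad
\frac{\partial E}{\partial m} = -\frac{2}{N}\sum_{i=1}^{N} x_i\,\bigl(y_i - (m x_i + b)\bigr)
\]
\[
b \leftarrow b - \eta\,\frac{\partial E}{\partial b},
\qquad
m \leftarrow m - \eta\,\frac{\partial E}{\partial m}
\]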
======================Code Chunk =====================
# Use this to visually discuss convergence rate based on learning rate
# for num in range(0, 10):
#     run(num)
#     plt.show()
=================================================
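One possible way to attack (a) is sketched below. run_with_rate is a hypothetical helper added here (it is not part of the starter code); it reuses step_gradient and compute_error_for_line_given_points from above and assumes the same ../data/data.csv file. In words: with a very small rate each step barely moves b and m, so far more iterations are needed to converge; with a rate that is too large each step overshoots the minimum, the iterates oscillate with growing amplitude, and the error blows up. In math: the loss is a convex quadratic in (b, m), and gradient descent on a quadratic converges only when the learning rate satisfies \(\eta < 2/\lambda_{\max}\), where \(\lambda_{\max}\) is the largest eigenvalue of the Hessian, here \(\frac{2}{N}\begin{pmatrix}\sum x_i^2 & \sum x_i\\ \sum x_i & N\end{pmatrix}\); above that threshold the iterates diverge.

=====================Code Chunk =======================
import numpy as np

def run_with_rate(learning_rate, num_iterations=100):
    # Hypothetical helper: same idea as run(), but the learning rate is a
    # parameter and the final error is returned rather than plotted.
    points = np.genfromtxt("../data/data.csv", delimiter=",")
    b, m = 0, 0
    for _ in range(num_iterations):
        b, m = step_gradient(b, m, points, learning_rate)
    error = compute_error_for_line_given_points(b, m, points)
    print("rate = {0}: b = {1}, m = {2}, error = {3}".format(learning_rate, b, m, error))
    return error

# Illustrative rates only: very small (slow), the original 0.0001, and larger
# values that may overshoot and diverge for this data set.
for rate in [1e-6, 1e-4, 1e-3, 1e-2]:
    run_with_rate(rate)
=====================================================================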

b) [10 points] Plot the error as a function of the number of iterations for various learning rates. Choose the rates so that the plot tells a story. # Code here
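A possible starting point for (b), again only a sketch that assumes the same ../data/data.csv file and reuses step_gradient and compute_error_for_line_given_points from part (a): record the error after every iteration for several learning rates and plot all the curves on one figure. A logarithmic y-axis keeps slowly converging and diverging curves readable on the same plot. The specific rates below are illustrative choices, not values prescribed by the assignment.

=====================Code Chunk =======================
import numpy as np
import matplotlib.pyplot as plt

points = np.genfromtxt("../data/data.csv", delimiter=",")
num_iterations = 100

# Illustrative rates: too small (slow), reasonable, borderline, likely too large.
for rate in [1e-5, 1e-4, 5e-4, 1e-3]:
    b, m = 0, 0
    errors = []
    for _ in range(num_iterations):
        b, m = step_gradient(b, m, points, rate)
        errors.append(compute_error_for_line_given_points(b, m, points))
    plt.plot(range(1, num_iterations + 1), errors, label="learning rate = {0}".format(rate))

plt.xlabel("iteration")
plt.ylabel("mean squared error")
plt.yscale("log")  # keeps diverging curves on the same axes as converging ones
plt.legend()
plt.show()
=====================================================================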
