Question: You run gradient descent for 15 iterations with α = 0.3 and compute J(θ) after each iteration. You find that the value of J(θ) decreases quickly and then levels off. Based on this, which of the following conclusions seems most plausible?
Answer choices:
a) All of the given choices
b) Use a larger value of α
c) Use a smaller value of α
d) α = 0.3 is an effective choice of learning rate
Step-by-Step Solution

There are 3 steps involved:

Step 1: A cost J(θ) that decreases quickly and then levels off is the signature of gradient descent converging to a minimum.

Step 2: If α were too large, J(θ) would oscillate or increase rather than decrease steadily; if α were too small, J(θ) would decrease only slowly instead of quickly. Neither symptom is observed, so there is no reason to change α.

Step 3: Therefore the most plausible conclusion is (d): α = 0.3 is an effective choice of learning rate.
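The behavior described in the question can be reproduced with a minimal sketch (not part of the original problem): running gradient descent on the toy cost J(θ) = θ² with α = 0.3. The starting point θ = 5.0 and the cost function are illustrative assumptions; the point is that the recorded costs drop sharply in the first few iterations and then flatten out near the minimum.

```python
# Toy example: gradient descent on J(theta) = theta^2.
# The gradient of theta^2 is 2*theta, so each update is
# theta <- theta - alpha * 2 * theta.
alpha = 0.3
theta = 5.0          # arbitrary starting point (assumption)
costs = []

for _ in range(15):  # 15 iterations, as in the question
    theta -= alpha * 2 * theta
    costs.append(theta ** 2)

# Early costs fall fast; late costs are almost flat near zero,
# i.e. the curve "decreases quickly and then levels off".
print(costs[:3])
print(costs[-3:])
```

With these settings each step shrinks θ by a constant factor, so the cost decays geometrically: large drops at first, then a plateau, which is exactly the convergence pattern that indicates the learning rate is effective.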
