Question: Consider a convex function with a unique global minimum. If we initialize gradient descent at a point far from the minimum and use a fixed learning rate, which of the following is most likely to happen as the number of iterations increases?
Answer Choices
A. The algorithm will diverge.
B. The algorithm will converge to the global minimum at a constant rate.
C. The algorithm will initially converge quickly, then slow down as it approaches the minimum.
D. The algorithm will oscillate around the minimum without converging.
Step by Step Solution
Step 1: Gradient descent updates the iterate as x ← x − η∇f(x) with a fixed learning rate η. Far from the minimum of a convex function, the gradient is typically large, so each update takes a large step and the early iterations make rapid progress.
Step 2: As the iterates approach the unique global minimum, the gradient shrinks toward zero. With η fixed, the step size η‖∇f(x)‖ therefore shrinks as well, and progress per iteration slows down.
Step 3: Provided the fixed learning rate is small enough for the function, the iterates neither diverge (ruling out A) nor oscillate indefinitely (ruling out D), and the rate is not constant (ruling out B): convergence is fast at first and slows near the minimum.
Answer: C
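The slowdown can be seen in a minimal sketch: gradient descent on f(x) = x², whose gradient is 2x, with an illustrative fixed learning rate and starting point (both are arbitrary choices for this demo, not part of the question).

```python
# Gradient descent on the convex function f(x) = x^2, minimum at x = 0.
# Fixed learning rate; starting point chosen far from the minimum.
def gradient_descent(x0, lr, n_iters):
    """Run n_iters updates x <- x - lr * f'(x) with f'(x) = 2x."""
    xs = [x0]
    x = x0
    for _ in range(n_iters):
        x = x - lr * 2 * x  # gradient of x^2 is 2x
        xs.append(x)
    return xs

xs = gradient_descent(x0=10.0, lr=0.1, n_iters=20)
steps = [abs(b - a) for a, b in zip(xs, xs[1:])]

# Each step moves 20% of the current distance to the minimum, so the
# per-iteration progress shrinks geometrically: large steps at first,
# tiny steps near the minimum.
print("first step:", steps[0])   # large initial progress
print("last step:", steps[-1])   # much smaller progress near x = 0
```

On this function the distance to the minimum contracts by a constant factor per iteration, so the absolute progress per step decays toward zero even though the method converges, matching option C.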
