Question: Choose the false statements about gradient descent. (More than one statement is false.)
1. If the learning rate is set to be large, it can lead to divergence.
2. During training, the weight is adjusted in the direction of the positive gradient.
3. Training can be done very fast if the learning rate is set to be small.
4. The learning rate is greater than zero.
5. The loss function oscillates around the minimum if the learning rate is set to be large.
6. The gradient is evaluated again for the new weight vector at each step.
7. The weight is updated gradually in the direction of the negative gradient.
8. A small learning rate causes slow convergence.
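The behaviors these statements describe can be checked with a minimal sketch on a one-dimensional quadratic f(w) = w², assuming the standard update rule w ← w − η·f′(w) (the function, learning rates, and step counts below are illustrative choices, not from the question):

```python
def gradient_descent(lr, steps, w=1.0):
    """Minimize f(w) = w**2 via w <- w - lr * f'(w), where f'(w) = 2*w."""
    for _ in range(steps):
        w = w - lr * 2 * w  # step in the direction of the negative gradient
    return w

# A moderate learning rate converges quickly toward the minimum at w = 0.
near_zero = gradient_descent(lr=0.1, steps=50)

# A very small learning rate still converges, but slowly: after the same
# number of steps, w is still far from the minimum.
slow = gradient_descent(lr=0.001, steps=50)

# A learning rate that is too large makes |w| grow every step (divergence),
# since each update multiplies w by (1 - 2*lr), and |1 - 2*lr| > 1 here.
diverged = gradient_descent(lr=1.1, steps=50)
```

This directly illustrates why the true statements hold: the update subtracts the gradient (negative-gradient direction), the gradient is re-evaluated at the new weight each step, a small learning rate converges slowly, and an overly large one diverges.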
