Question: Choose the false statements about gradient descent. (More than one statement is false.)

1. If the learning rate is set to be large, it can lead to divergence.
2. During training, the weights are adjusted in the direction of the positive gradient.
3. Training can be done very fast if the learning rate is set to be small.
4. The learning rate is greater than zero.
5. The loss function oscillates around the minimum if the learning rate is set to be large.
6. The gradient is evaluated again for the new weight vector at each step.
7. The weights are updated gradually in the direction of the negative gradient.
8. Slow convergence is caused by a small learning rate.

Step-by-Step Solution

There are 3 steps involved.

Step 1: Recall the gradient-descent update rule, w := w − η∇L(w), with learning rate η > 0. The weights move in the direction of the negative gradient, and the gradient is re-evaluated for the new weight vector at each step. Therefore statement 2 (adjustment in the direction of the positive gradient) is false, while statements 4, 6, and 7 are true.

Step 2: The learning rate controls the step size. A small learning rate makes each update tiny, so convergence is slow (statement 8 is true) and training is not fast; therefore statement 3 is also false. A learning rate that is too large can overshoot the minimum, causing the loss to oscillate around it or even diverge, so statements 1 and 5 are true.

Step 3: The false statements are 2 and 3.
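To make the role of the learning rate concrete, here is a minimal Python sketch (not part of the original question) that runs plain gradient descent on the simple quadratic loss L(w) = w^2, whose gradient is 2w. The loss function, starting point, step count, and learning-rate values are illustrative assumptions.

def gradient_descent(lr, steps=20, w=5.0):
    """Run plain gradient descent on L(w) = w^2 and return the final weight."""
    for _ in range(steps):
        grad = 2.0 * w       # dL/dw for L(w) = w^2, re-evaluated at each step
        w = w - lr * grad    # update in the direction of the NEGATIVE gradient
    return w

print(gradient_descent(lr=0.01))  # small learning rate: slow convergence toward 0
print(gradient_descent(lr=0.4))   # moderate learning rate: reaches roughly 0 quickly
print(gradient_descent(lr=1.5))   # too-large learning rate: iterates blow up (divergence)

Each update multiplies the weight by (1 − 2·lr): with lr = 0.01 the factor is 0.98, so after 20 steps the weight is still far from 0 (slow convergence); with lr = 0.4 the factor is 0.2 and the weight shrinks to nearly 0; with lr = 1.5 the factor is −2, so the weight alternates sign and grows without bound.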
