Question: Which of these statements about mini-batch gradient descent do you agree with?
a. Training one epoch (one pass through the training set) using mini-batch gradient descent is faster than training one epoch using batch gradient descent.
b. One iteration of mini-batch gradient descent (computing on a single mini-batch) is faster than one iteration of batch gradient descent.
c. If the mini-batch size is m, you end up with stochastic gradient descent, which has to process the whole training set before making progress.
Step by Step Solution
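Only statement b is correct. One iteration of mini-batch gradient descent computes gradients on a single mini-batch, so it touches far fewer examples than one iteration of batch gradient descent, which must process the entire training set. Statement a is false: one epoch of mini-batch gradient descent still passes over all m examples (plus some per-batch overhead), so it is not faster than one epoch of batch gradient descent, even though it typically makes more progress per epoch. Statement c is also false as written: a mini-batch size of m recovers batch gradient descent, whereas stochastic gradient descent corresponds to a mini-batch size of 1 and updates the parameters after every single example.

To make the per-iteration cost concrete, here is a minimal NumPy sketch on synthetic linear-regression data. The sizes, learning rate, batch size, and the `gradient` helper are illustrative assumptions, not part of the original question:

```python
import time
import numpy as np

# Synthetic linear-regression data (illustrative sizes, not from the question).
rng = np.random.default_rng(0)
m, n = 100_000, 200                      # m training examples, n features
X = rng.standard_normal((m, n))
y = X @ rng.standard_normal(n) + 0.1 * rng.standard_normal(m)
w = np.zeros(n)
lr = 0.01

def gradient(X_sub, y_sub, w):
    """Mean-squared-error gradient for linear regression on a subset."""
    residual = X_sub @ w - y_sub
    return X_sub.T @ residual / len(y_sub)

# One iteration of batch gradient descent: processes all m examples.
t0 = time.perf_counter()
w_batch = w - lr * gradient(X, y, w)
t_batch = time.perf_counter() - t0

# One iteration of mini-batch gradient descent: processes one mini-batch.
batch_size = 512
idx = rng.choice(m, size=batch_size, replace=False)
t0 = time.perf_counter()
w_mini = w - lr * gradient(X[idx], y[idx], w)
t_mini = time.perf_counter() - t0

print(f"one batch GD iteration:      {t_batch:.4f} s (all {m} examples)")
print(f"one mini-batch GD iteration: {t_mini:.4f} s ({batch_size} examples)")
```

On typical hardware the mini-batch iteration finishes far sooner, simply because it reads 512 examples instead of 100,000 (statement b). An epoch of mini-batch gradient descent, however, runs m / batch_size such iterations and still visits every example, which is why statement a does not hold.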
