Question: How do I implement mini-batch SGD for a neural network in Python? Complete the function below to compute the mini-batch gradient.
def mini_batch_gradient(param, x_batch, y_batch, reg_lambda):
    """Compute the mini-batch gradient.

    Input:
        param      -- parameters dictionary (w, b)
        x_batch    -- a batch of x, shape (size, 784)
        y_batch    -- a batch of y, shape (size,)
        reg_lambda -- regularization parameter
    Output:
        dw         -- derivative for weight w
        db         -- derivative for bias b
        batch_loss -- average loss on the mini-batch samples
    """
    # Your code goes here
    return dw, db, batch_loss
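One way to fill in the stub is sketched below. The question does not state the model, so this assumes a single-layer softmax classifier on 784-dimensional inputs (an MNIST-style setup): logits are `x @ w + b`, the loss is average cross-entropy plus an L2 penalty `reg_lambda * ||w||^2` on the weights, and `y_batch` holds integer class labels. Those modeling choices are assumptions, not part of the original prompt.

```python
import numpy as np

def mini_batch_gradient(param, x_batch, y_batch, reg_lambda):
    """Average gradient and loss of a softmax classifier over one mini-batch.

    Assumes param = {'w': (784, C) weights, 'b': (C,) biases} and that
    y_batch contains integer class labels in [0, C).
    """
    w, b = param['w'], param['b']
    batch_size = x_batch.shape[0]

    # Forward pass: logits, then numerically stable softmax probabilities.
    logits = x_batch @ w + b                      # (size, C)
    logits -= logits.max(axis=1, keepdims=True)   # stability shift
    exp = np.exp(logits)
    probs = exp / exp.sum(axis=1, keepdims=True)  # (size, C)

    # Average cross-entropy loss plus the L2 regularization term.
    correct_logprobs = -np.log(probs[np.arange(batch_size), y_batch])
    batch_loss = correct_logprobs.mean() + reg_lambda * np.sum(w * w)

    # Backward pass: gradient of softmax cross-entropy w.r.t. the logits
    # is (probs - one_hot(y)) / batch_size.
    dlogits = probs.copy()
    dlogits[np.arange(batch_size), y_batch] -= 1.0
    dlogits /= batch_size

    dw = x_batch.T @ dlogits + 2 * reg_lambda * w  # (784, C)
    db = dlogits.sum(axis=0)                       # (C,)
    return dw, db, batch_loss
```

With this in place, one mini-batch SGD step is simply `param['w'] -= lr * dw; param['b'] -= lr * db`, applied to each shuffled mini-batch per epoch; a gradient check against a central finite difference is a good way to verify `dw` and `db` before training.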
