Question: Backpropagation Algorithm
Once we set up the architecture of our feedforward neural network, our goal will be to find weight parameters that minimize our loss function. We will use the stochastic gradient descent algorithm, covered in earlier lectures, to carry out the optimization. This involves computing the gradient of the loss function with respect to the weight parameters. Since the loss function is a long chain of compositions of activation functions, with the weight parameters entering at different stages, we will break the computation of the gradient into pieces via the chain rule; this way of computing the gradient is called the backpropagation algorithm.

In the following problems, we will explore the main step of the stochastic gradient descent algorithm for training the simple neural network from the video. This network is made up of L hidden layers, but each layer consists of only one unit, and each unit has activation function f. As usual, x is the input and z_i is the weighted combination of the inputs to the i-th hidden layer. In this one-dimensional case, the weighted combination reduces to a product:
z_1 = x w_1
for i = 2, ..., L:  z_i = f_{i-1} w_i,  where f_{i-1} = f(z_{i-1})
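To make the setup concrete, here is a minimal sketch of this forward pass in Python, assuming the tanh activation that the problem introduces below; the input value and weights are made-up numbers used only for illustration.

```python
import math

def forward(x, w):
    """Forward pass through the one-unit-per-layer chain above:
    z_1 = x * w_1, and z_i = f_{i-1} * w_i for i = 2, ..., L,
    where f_i = f(z_i) = tanh(z_i)."""
    z, f = [], []
    prev = x                      # the input feeds the first layer
    for w_i in w:                 # w = [w_1, ..., w_L]
        z_i = prev * w_i          # the one-dimensional "weighted combination"
        f_i = math.tanh(z_i)      # the unit's activation
        z.append(z_i)
        f.append(f_i)
        prev = f_i                # layer i's output feeds layer i+1
    return z, f

# Hypothetical input and weights, chosen only to exercise the code.
z, f = forward(x=0.5, w=[0.8, -1.2, 0.3])
print(f[-1])                      # f_L, the network's output
```

Keeping the z_i values around is what makes the backward pass cheap later: each δ_i reuses them.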
We will use the following loss function:
Loss(y, f_L) = (1/2) (y - f_L)^2
where y is the true value and f_L is the output of the neural network. Let δ_i = ∂Loss/∂z_i. In this problem, we derive a recurrence relation between δ_i and δ_{i+1}. Assume that f is the hyperbolic tangent function:
f(x) = tanh(x)
f'(x) = 1 - tanh^2(x)
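For concreteness, here is the base case of the recurrence (a sketch assuming the squared-error loss written above): by the chain rule,

δ_L = ∂Loss/∂z_L = ∂Loss/∂f_L · ∂f_L/∂z_L = (f_L - y) f'(z_L) = (f_L - y) (1 - tanh^2(z_L))

The question below asks for the analogous step that propagates δ_{i+1} back to δ_i through z_{i+1} = f_i w_{i+1}.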
Which of the following options is the correct expression for δ_i in terms of δ_{i+1}?
δ_i = f'(z_i) w_i δ_{i+1}
δ_i = f'(z_i) w_{i+1} δ_{i+1}
δ_i = f'(z_{i+1}) w_i δ_{i+1}
δ_i = f'(z_{i+1}) w_{i+1} δ_{i+1}
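Whichever expression you derive can be sanity-checked numerically. The sketch below (hypothetical input, weights, and label) computes the δ_i values with the chain-rule recurrence δ_i = f'(z_i) w_{i+1} δ_{i+1}, which is what the chain rule gives for this chain of compositions, and compares each one against a finite-difference estimate of ∂Loss/∂z_i; the two columns it prints should agree to several decimal places.

```python
import math

# Hypothetical input, weights, and true label, for illustration only.
x, w, y = 0.5, [0.8, -1.2, 0.3], 1.0
L = len(w)

# Forward pass: z_1 = x*w_1, z_i = tanh(z_{i-1}) * w_i for i >= 2.
z, prev = [], x
for w_i in w:
    z.append(prev * w_i)
    prev = math.tanh(z[-1])
f_L = prev

# Backward pass: delta_L = (f_L - y) * f'(z_L), then
# delta_i = f'(z_i) * w_{i+1} * delta_{i+1}, with f'(z) = 1 - tanh(z)^2.
delta = [0.0] * L
delta[L - 1] = (f_L - y) * (1 - math.tanh(z[L - 1]) ** 2)
for i in range(L - 2, -1, -1):
    delta[i] = (1 - math.tanh(z[i]) ** 2) * w[i + 1] * delta[i + 1]

# Finite-difference check: perturb z_i, rerun the rest of the forward
# pass, and estimate dLoss/dz_i directly from the loss (1/2)(y - f_L)^2.
def loss_from(z_i, i):
    out = math.tanh(z_i)
    for j in range(i + 1, L):
        out = math.tanh(out * w[j])
    return 0.5 * (y - out) ** 2

eps = 1e-6
for i in range(L):
    numeric = (loss_from(z[i] + eps, i) - loss_from(z[i] - eps, i)) / (2 * eps)
    print(f"delta_{i + 1}: recurrence={delta[i]:+.6f}  finite-diff={numeric:+.6f}")
```

Replacing w_{i+1} with w_i, or evaluating f' at z_{i+1} instead of z_i, makes the two columns disagree, which is a quick way to rule out the other candidate expressions.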
