Question: Problem 1. Consider a simple two-layer neural network with one input layer with two nodes, one hidden layer with two nodes, and one output layer with a single node, which we saw in the previous assignment. The activation function for the first layer is ReLU and for the second layer is the sigmoid function s(z) = 1/(1 + e^{-z}).
The information in the network propagates by the following rules:
At the input layer, you are given a 2-dimensional vector x^{(0)}.
Then the 2-dimensional vector z^{(1)} is computed by multiplying x^{(0)} by the 2x2 matrix W^{(1)} and adding a 2-dimensional vector of biases b^{(1)}:
z^{(1)} = W^{(1)} x^{(0)} + b^{(1)}.
Then the values of the second layer x^{(1)} are computed by applying the activation function ReLU(z) componentwise to the vector z^{(1)}:
x^{(1)} = ReLU(z^{(1)}).
Then a scalar value z^{(2)} is computed by multiplying x^{(1)} by the 1x2 matrix W^{(2)} and adding a scalar bias b^{(2)}:
z^{(2)} = W^{(2)} x^{(1)} + b^{(2)}.
Finally, the output value x^{(2)} is computed by applying the activation function s(z) to the variable z^{(2)}:
x^{(2)} = s(z^{(2)}).
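The propagation rules above can be sketched in NumPy. This is a minimal illustration of the forward pass; the parameter values below are made-up placeholders, not the values given in the assignment:

```python
import numpy as np

def relu(z):
    # ReLU applied componentwise: max(z_i, 0)
    return np.maximum(z, 0.0)

def sigmoid(z):
    # s(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

def forward(x0, W1, b1, W2, b2):
    """Propagate a 2-dimensional input x0 through the two-layer network."""
    z1 = W1 @ x0 + b1      # 2-vector: first linear layer
    x1 = relu(z1)          # hidden-layer activations
    z2 = W2 @ x1 + b2      # 1-vector (scalar value): second linear layer
    x2 = sigmoid(z2)       # network output, in (0, 1)
    return x2

# Placeholder parameters, for illustration only:
W1 = np.array([[1.0, -1.0], [0.5, 2.0]])
b1 = np.array([0.0, -0.5])
W2 = np.array([[1.0, -2.0]])
b2 = np.array([0.5])
x0 = np.array([1.0, 2.0])
print(forward(x0, W1, b1, W2, b2))
```

Because the last activation is a sigmoid, the prediction always lies strictly between 0 and 1.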
You are given the current values of the parameters W^{(1)}, b^{(1)}, W^{(2)}, b^{(2)} and the gradients of the loss function dC/dW^{(1)}, dC/db^{(1)}, dC/dW^{(2)}, dC/db^{(2)}.
You want to do one step of the gradient descent method with the learning rate k_1 for the parameters in the first layer, and the learning rate k_2 for the parameters in the second layer.
What are the new values of the weights and biases W^{(1)}, b^{(1)}, W^{(2)}, b^{(2)}?
What is the new prediction of the network for the initial input value x^{(0)}?
How do you find the new weights and biases?
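The update is the standard gradient descent rule, theta <- theta - k * dC/dtheta, applied with k_1 to the first-layer parameters and k_2 to the second-layer ones. A minimal sketch; the parameters, gradients, and learning rates below are placeholders, since the assignment's numerical values are not reproduced here:

```python
import numpy as np

def gradient_descent_step(W1, b1, W2, b2, dW1, db1, dW2, db2, k1, k2):
    """One gradient descent step with per-layer learning rates."""
    # First-layer parameters are updated with learning rate k1 ...
    W1_new = W1 - k1 * dW1
    b1_new = b1 - k1 * db1
    # ... and second-layer parameters with learning rate k2.
    W2_new = W2 - k2 * dW2
    b2_new = b2 - k2 * db2
    return W1_new, b1_new, W2_new, b2_new

# Placeholder values, for illustration only:
W1 = np.array([[1.0, -1.0], [0.5, 2.0]]); b1 = np.array([0.0, -0.5])
W2 = np.array([[1.0, -2.0]]);             b2 = np.array([0.5])
dW1 = np.array([[0.1, 0.0], [0.0, 0.2]]); db1 = np.array([0.1, -0.1])
dW2 = np.array([[0.3, -0.1]]);            db2 = np.array([0.2])
k1, k2 = 0.5, 0.1

W1n, b1n, W2n, b2n = gradient_descent_step(W1, b1, W2, b2,
                                           dW1, db1, dW2, db2, k1, k2)
```

After the update, the new prediction is obtained by feeding the same input x^{(0)} through the forward rules above with the updated parameters.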