Question:

Given a two-layer neural network and associated weights as shown below:
Assuming that the two hidden-layer processing neurons (i.e., neurons 3 and 4) use a ReLU activation function, and the output-layer processing neuron (i.e., neuron 5) uses a logistic (sigmoid) activation function:
(a) Calculate the output of neuron 5 for the input vector: neuron 1 = 2 and neuron 2 = 3.
(b) If the ground-truth label of the above input vector is 1, measure the cross-entropy loss at neuron 5 and update all weights in the network using backpropagation with learning rate = 0.1. Hint: please check the weight update formula in the last few slides of lecture 3.
(c) Calculate the output of neuron 5 for the same input vector using the new weights.
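Since the figure with the network weights is not reproduced here, the following is a minimal NumPy sketch of the three parts with hypothetical placeholder weights (`W_hidden`, `w_out` are assumptions, not values from the figure). It applies the standard formulas: sigmoid(z) = 1 / (1 + e^(-z)), cross-entropy L = -[y log(ŷ) + (1 - y) log(1 - ŷ)], and the gradient-descent update w ← w - η ∂L/∂w. Substitute the weights from the figure to obtain the actual answers.

```python
# Illustrative sketch only: the real weights come from the figure, which is not
# reproduced here, so the values below are hypothetical placeholders.
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Inputs from the question: neuron 1 = 2, neuron 2 = 3
x = np.array([2.0, 3.0])

# Hypothetical weights (placeholders for the values shown in the figure):
# W_hidden[i, j] = weight from input neuron i to hidden neuron j (neurons 3, 4)
W_hidden = np.array([[0.5, -0.2],
                     [0.3,  0.4]])
# w_out[j] = weight from hidden neuron j to output neuron 5
w_out = np.array([0.6, -0.1])

lr = 0.1      # learning rate from part (b)
y_true = 1.0  # ground-truth label from part (b)

# (a) Forward pass
z_hidden = x @ W_hidden    # pre-activations of neurons 3 and 4
h = relu(z_hidden)         # ReLU activations
z_out = h @ w_out          # pre-activation of neuron 5
y_hat = sigmoid(z_out)     # output of neuron 5
print("output of neuron 5:", y_hat)

# (b) Cross-entropy loss and backpropagation
loss = -(y_true * np.log(y_hat) + (1 - y_true) * np.log(1 - y_hat))
print("cross-entropy loss:", loss)

# For sigmoid output with cross-entropy loss, dL/dz_out simplifies to (y_hat - y_true)
delta_out = y_hat - y_true
grad_w_out = delta_out * h                          # dL/dw_out
delta_hidden = delta_out * w_out * (z_hidden > 0)   # ReLU derivative is 1 where z > 0
grad_W_hidden = np.outer(x, delta_hidden)           # dL/dW_hidden

# Gradient-descent weight update
w_out -= lr * grad_w_out
W_hidden -= lr * grad_W_hidden

# (c) Forward pass with the updated weights
y_hat_new = sigmoid(relu(x @ W_hidden) @ w_out)
print("output of neuron 5 after update:", y_hat_new)
```

Because the label is 1 and the update moves the weights in the direction that increases ŷ, the output in part (c) should be slightly larger than in part (a) when the correct figure weights are used.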
