Question 1

Given the following neural network with fully connected layers and ReLU activations, consisting of two input units (i1, i2) and four hidden units (h1, h2, h3, h4). The output units are indicated as (o1, o2) and their targets are indicated as (t1, t2). The weights and biases of the fully connected layers are called w and b with specific subscripts.
The values of the variables are given in the following table:
Variable: i1, i2, w1, w2, w3, w4, w5, w6, w7, w8, b1, b2, b3, b4, t1, t2
Value:
(a) Compute the output with the input and network parameters as specified above. Write down all calculations, including intermediate layer results.
(b) Compute the mean squared error between the output calculated above and the target.
(c) Update the weight w using gradient descent with the given learning rate and the loss computed previously. Please write down all your computations.
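Since the numeric values from the table are not reproduced above, the following sketch for part (a) uses made-up numbers purely for illustration. It assumes a 2-4-2 architecture (two inputs, four ReLU hidden units, and an output layer that is taken to be linear here); the arrays W_hidden, b_hidden, W_out, and b_out are hypothetical stand-ins for the w and b entries of the table.

```python
import numpy as np

def relu(x):
    """Element-wise ReLU: max(0, x)."""
    return np.maximum(0.0, x)

# Hypothetical inputs standing in for i1, i2 (the real values are in the table).
i = np.array([0.5, -1.0])

# Hypothetical input->hidden parameters standing in for w1..w8 and b1..b4.
W_hidden = np.array([[ 0.1, -0.2],
                     [ 0.4,  0.3],
                     [-0.5,  0.2],
                     [ 0.6, -0.1]])          # shape (4, 2): one row per hidden unit
b_hidden = np.array([0.0, 0.1, -0.1, 0.2])

# Hypothetical hidden->output parameters.
W_out = np.array([[ 0.2, -0.3, 0.1, 0.4],
                  [-0.1,  0.5, 0.2, 0.0]])   # shape (2, 4): one row per output unit
b_out = np.array([0.05, -0.05])

# Forward pass: write down the pre-activations (the intermediate layer results),
# apply ReLU on the hidden layer, then compute the outputs.
z_hidden = W_hidden @ i + b_hidden
h = relu(z_hidden)            # hidden activations h1..h4
o = W_out @ h + b_out         # outputs o1, o2 (linear output assumed here)

print("hidden pre-activations:", z_hidden)
print("hidden activations h:  ", h)
print("outputs o:             ", o)
```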
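For part (b), a minimal sketch of the mean squared error, again with hypothetical outputs and targets (the real o1, o2 come from part (a) and the real t1, t2 from the table). Note that some courses define this loss with a factor of 1/2 instead of averaging over the output units; use whichever convention your lecture uses.

```python
import numpy as np

# Hypothetical outputs and targets.
o = np.array([0.233, 0.105])   # stand-ins for o1, o2
t = np.array([0.5, -0.5])      # stand-ins for t1, t2

# MSE averaged over the two output units: (1/2) * ((t1 - o1)^2 + (t2 - o2)^2).
mse = np.mean((t - o) ** 2)
print("MSE:", mse)
```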
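For part (c), a sketch of a single gradient descent step on one output-layer weight, assuming the MSE loss from part (b) and a linear output unit. The learning rate and all other numbers below are hypothetical placeholders for the values given in the problem.

```python
# Hypothetical numbers; the real learning rate, target, output, and hidden
# activation come from the table and from parts (a) and (b).
eta = 0.1     # learning rate
N = 2         # number of output units averaged in the MSE
o_k = 0.233   # output of the unit this weight feeds into
t_k = 0.5     # its target
h_j = 0.25    # activation of the hidden unit this weight connects from
w_old = 0.2   # current value of the weight being updated

# Chain rule for an output-layer weight with MSE loss and a linear output:
#   dE/dw = dE/do_k * do_k/dw = (2/N) * (o_k - t_k) * h_j
grad_w = (2.0 / N) * (o_k - t_k) * h_j

# Gradient descent step: move the weight against the gradient.
w_new = w_old - eta * grad_w
print("dE/dw =", grad_w)
print("updated weight:", w_new)
```

If the weight being updated sits in the hidden layer instead, the chain rule gains an extra ReLU-derivative factor (1 where the hidden pre-activation is positive, 0 otherwise) and the final factor becomes the corresponding input value rather than a hidden activation.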
