Question 1
Given the following neural network with a fully connected layer and ReLU activations, including two input units (i1, i2), four hidden units (h1, h2) and (h3, h4). The output units are indicated as (o1, o2) and their targets are indicated as (t1, t2). The weights and biases of the fully connected layers are called w and b with specific sub-descriptors.
The values of variables are given in the following table:
| Variable | i1  | i2   | w11 | w12  | w21 | w22  | w31 | w32  | w41  | w42 | b1  | b2   | b3   | b4  | t1  | t2  |
|----------|-----|------|-----|------|-----|------|-----|------|------|-----|-----|------|------|-----|-----|-----|
| Value    | 2.0 | -1.0 | 1.0 | -0.5 | 0.5 | -1.0 | 0.5 | -1.0 | -0.5 | 1.0 | 0.5 | -0.5 | -1.0 | 0.5 | 1.0 | 0.5 |
(a) Compute the output (o1, o2) with the input (i1, i2) and network parameters as specified above. Write down all calculations, including intermediate layer results.
(b) Compute the mean squared error of the output (o1, o2) calculated above and the target (t1, t2).
(c) Update the weight w21 using gradient descent with learning rate 0.1 and the loss computed previously. (Please write down all your computations.)
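The three parts can be sketched in code. This is a minimal worked example under two assumptions the question leaves implicit: (h1, h2) are the pre-activation hidden values with (h3, h4) = ReLU(h1), ReLU(h2), the output layer is linear (no ReLU on o1, o2), and the MSE averages over the two outputs.

```python
# Assumed architecture: inputs -> (h1, h2) pre-activations -> ReLU -> (h3, h4)
# -> linear outputs (o1, o2). Values taken from the table above.
i1, i2 = 2.0, -1.0
w11, w12, w21, w22 = 1.0, -0.5, 0.5, -1.0   # input -> hidden weights
w31, w32, w41, w42 = 0.5, -1.0, -0.5, 1.0   # hidden -> output weights
b1, b2, b3, b4 = 0.5, -0.5, -1.0, 0.5
t1, t2 = 1.0, 0.5
lr = 0.1

relu = lambda x: max(x, 0.0)

# (a) forward pass
h1 = w11 * i1 + w12 * i2 + b1          # 2.0 + 0.5 + 0.5 = 3.0
h2 = w21 * i1 + w22 * i2 + b2          # 1.0 + 1.0 - 0.5 = 1.5
h3, h4 = relu(h1), relu(h2)            # 3.0, 1.5
o1 = w31 * h3 + w32 * h4 + b3          # 1.5 - 1.5 - 1.0 = -1.0
o2 = w41 * h3 + w42 * h4 + b4          # -1.5 + 1.5 + 0.5 = 0.5

# (b) mean squared error over the two outputs
mse = ((o1 - t1) ** 2 + (o2 - t2) ** 2) / 2   # (4.0 + 0.0) / 2 = 2.0

# (c) chain rule back to w21 (which only reaches the loss through h2 -> h4):
dE_do1 = o1 - t1                        # the 1/2 cancels the squared term's 2
dE_do2 = o2 - t2
dE_dh4 = dE_do1 * w32 + dE_do2 * w42    # -2.0 * -1.0 + 0.0 = 2.0
relu_grad = 1.0 if h2 > 0 else 0.0      # ReLU'(h2) = 1 since h2 = 1.5 > 0
dE_dw21 = dE_dh4 * relu_grad * i1       # 2.0 * 1.0 * 2.0 = 4.0
w21_new = w21 - lr * dE_dw21            # 0.5 - 0.1 * 4.0 = 0.1

print(o1, o2, mse, w21_new)             # -1.0 0.5 2.0 0.1
```

If the output layer also applied ReLU, o1 would clip to 0 and its gradient would vanish, so the linear-output assumption matters for part (c).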