Question:

Given the following neural network with a fully connected layer and ReLU activations, including two input units (i1, i2) and four hidden units (h1, h2) and (h3, h4). The output units are indicated as (o1, o2) and their targets are indicated as (t1, t2). The weights and biases of the fully connected layers are called w and b with specific subscripts.
[Figure: network diagram. Inputs i1, i2 feed hidden units h1, h2 through weights w11, w12, w21, w22 and biases b1, b2; ReLU activations produce h3, h4, which feed outputs o1, o2 through weights w31, w32, w41, w42 and biases b3, b4.]
The values of the variables are given in the following table:

Variable:  i1    i2    w11   w12   w21   w22   w31   w32   w41   w42   b1    b2    b3    b4    t1    t2
Value:     2.0  -1.0   1.0  -0.5   0.5  -1.0   0.5  -1.0  -0.5   1.0   0.5  -0.5  -1.0   0.5   1.0   0.5
(a) (3 points) Compute the output (o1, o2) with the input (i1, i2) and network parameters as specified above. Write down all calculations, including intermediate layer results.
Solution:
Forward pass:

h1 = i1 × w11 + i2 × w21 + b1 = 2.0 × 1.0 + (-1.0) × 0.5 + 0.5 = 2.0
h2 = i1 × w12 + i2 × w22 + b2 = 2.0 × (-0.5) + (-1.0) × (-1.0) + (-0.5) = -0.5
h3 = max(0, h1) = h1 = 2.0
h4 = max(0, h2) = 0
o1 = h3 × w31 + h4 × w41 + b3 = 2.0 × 0.5 + 0 × (-0.5) + (-1.0) = 0
o2 = h3 × w32 + h4 × w42 + b4 = 2.0 × (-1.0) + 0 × 1.0 + 0.5 = -1.5
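The hand calculation above can be checked with a short NumPy sketch. The weight matrices and variable names below are assembled from the problem's table; stacking the weights into matrices is an assumption of layout, not part of the original solution.

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: max(0, x)
    return np.maximum(0.0, x)

# Parameters from the table (names follow the problem statement)
i = np.array([2.0, -1.0])          # inputs (i1, i2)
W1 = np.array([[1.0, 0.5],         # row 1: (w11, w21) -> h1
               [-0.5, -1.0]])      # row 2: (w12, w22) -> h2
b1 = np.array([0.5, -0.5])         # hidden biases (b1, b2)
W2 = np.array([[0.5, -0.5],        # row 1: (w31, w41) -> o1
               [-1.0, 1.0]])       # row 2: (w32, w42) -> o2
b2 = np.array([-1.0, 0.5])         # output biases (b3, b4)

h_pre = W1 @ i + b1   # pre-activation hidden units (h1, h2)
h = relu(h_pre)       # ReLU outputs (h3, h4)
o = W2 @ h + b2       # network outputs (o1, o2)

print(h_pre)  # [ 2.  -0.5]
print(h)      # [2. 0.]
print(o)      # [ 0.  -1.5]
```

The printed values match the intermediate and final results of the manual forward pass.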
