Question: Given the following neural network with fully connected layers and ReLU activations, including two input units (i1, i2) and four hidden units (h1, h2) and (h3, h4). The output units are denoted (o1, o2) and their targets (t1, t2). The weights and biases of the fully connected layers are called w and b with specific subscripts.
[Figure: network diagram — inputs i1 and i2 feed a fully connected layer (weights w11, w12, w21, w22; biases b1, b2) producing h1 and h2; a ReLU applied to h1 and h2 gives h3 and h4; a second fully connected layer (weights w31, w32, w41, w42; biases b3, b4) produces the outputs o1 and o2.]
The values of the variables are given in the following table (reconstructed from the flattened original; the 16 values are matched, in order, to i1, i2, the eight weights, the four biases, and the two targets):

Variable  Value
i1         2.0
i2        -1.0
w11        1.0
w12       -0.5
w21        0.5
w22       -1.0
w31        0.5
w32       -1.0
w41       -0.5
w42        1.0
b1         0.5
b2        -0.5
b3        -1.0
b4         0.5
t1         1.0
t2         0.5
Compute the outputs (o1, o2) from the input (i1, i2) and the network parameters as specified above. Write down all calculations, including intermediate layer results.
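A minimal sketch of the forward pass, assuming the reconstructed table values above and the common convention that wjk labels the edge from input unit j to hidden unit k (so h1 = w11*i1 + w21*i2 + b1). If the course uses the transposed indexing, swap w12 and w21 (and w32/w41) accordingly:

```python
def relu(x):
    # ReLU activation: max(0, x)
    return max(0.0, x)

# assumed values from the reconstructed table
i1, i2 = 2.0, -1.0
w11, w12, w21, w22 = 1.0, -0.5, 0.5, -1.0
w31, w32, w41, w42 = 0.5, -1.0, -0.5, 1.0
b1, b2, b3, b4 = 0.5, -0.5, -1.0, 0.5

# hidden pre-activations
h1 = w11 * i1 + w21 * i2 + b1   # 1.0*2.0 + 0.5*(-1.0) + 0.5 =  2.0
h2 = w12 * i1 + w22 * i2 + b2   # -0.5*2.0 + 1.0*1.0 - 0.5   = -0.5

# ReLU layer
h3, h4 = relu(h1), relu(h2)     # 2.0, 0.0

# output layer
o1 = w31 * h3 + w41 * h4 + b3   # 0.5*2.0 - 0.5*0.0 - 1.0 =  0.0
o2 = w32 * h3 + w42 * h4 + b4   # -1.0*2.0 + 1.0*0.0 + 0.5 = -1.5

print(o1, o2)
```

Under these assumptions the intermediate results are (h1, h2) = (2.0, -0.5), (h3, h4) = (2.0, 0.0), and the outputs are (o1, o2) = (0.0, -1.5).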
