Question:

Note that:
The ReLU activation function is defined as follows:
For an input $x$, $\mathrm{ReLU}(x) = \max(0, x)$
The Softmax function is defined as follows:
Given inputs $x_i$, $i = 1, \dots, n$, the outputs are:
$$s(x_i) = \frac{e^{x_i}}{\sum_{j=1}^{n} e^{x_j}}$$
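For concreteness, here is a direct Python (NumPy) transcription of these two definitions; the function names relu and softmax are ours, and the max-shift in softmax is a standard numerical-stability trick that does not change the result:

```python
import numpy as np

def relu(x):
    """Element-wise ReLU(x) = max(0, x)."""
    return np.maximum(0.0, x)

def softmax(x):
    """Softmax: s(x_i) = e^{x_i} / sum_j e^{x_j}.
    Subtracting max(x) before exponentiating avoids overflow
    and leaves the output unchanged."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Quick checks against the definitions:
print(relu(np.array([-2.0, 3.0])))    # [0. 3.]
print(softmax(np.array([1.0, 1.0])))  # [0.5 0.5]
```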
Suppose that the weights of the network are:
$w_{11} = -0.5$, $w_{12} = -1.0$, $w_{21} = 1.2$, $w_{22} = 1.0$, $w_{31} = -0.6$, $w_{32} = 1.0$
$h_{11} = -0.5$, $h_{12} = -1.0$, $h_{21} = 1.0$, $h_{22} = 1.5$, $h_{31} = 0.4$, $h_{32} = 0.3$
For an input of $x_1 = 1.0$ and $x_2 = 1.0$:
i. Calculate the net inputs $a_1$ and $a_2$.
ii. Calculate the ReLU outputs.
iii. Calculate $y_1$ and $y_2$.
iv. Calculate $z_1$ and $z_2$.
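The question does not spell out the network layout, so the sketch below adopts one self-consistent reading: a 2-2-2 network in which the third row of each weight matrix ($w_{3j}$ and $h_{3j}$) acts as a bias, giving $a_j = w_{1j}x_1 + w_{2j}x_2 + w_{3j}$ for the hidden layer and the analogous expression for $y_k$. The variable names (W, H, a, r, y, z) and this bias interpretation are assumptions, not stated in the original question.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Weights as given; rows 1-2 multiply x1 and x2, and row 3 is
# treated as a bias term (an assumption -- the question does not
# state the architecture explicitly).
W = np.array([[-0.5, -1.0],
              [ 1.2,  1.0],
              [-0.6,  1.0]])
H = np.array([[-0.5, -1.0],
              [ 1.0,  1.5],
              [ 0.4,  0.3]])

x = np.array([1.0, 1.0])

# i.   Net inputs: a_j = w_{1j}*x1 + w_{2j}*x2 + w_{3j}
a = x @ W[:2] + W[2]
# ii.  ReLU outputs
r = relu(a)
# iii. Second-layer net inputs: y_k = h_{1k}*r1 + h_{2k}*r2 + h_{3k}
y = r @ H[:2] + H[2]
# iv.  Softmax outputs
z = softmax(y)

print("a =", a)  # [0.1 1. ]          under this reading
print("r =", r)  # [0.1 1. ]
print("y =", y)  # [1.35 1.7 ]
print("z =", z)  # ~[0.4134 0.5866]
```

Under this reading, parts i-iv come out as $a = (0.1, 1.0)$, ReLU outputs $(0.1, 1.0)$, $y = (1.35, 1.7)$, and $z \approx (0.413, 0.587)$; if the intended architecture differs, the same four-step structure still applies with the weights rearranged accordingly.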