Question: Consider a specific 2-hidden-layer ReLU network with inputs $x \in \mathbb{R}$, 1-dimensional outputs, and 2 neurons per hidden layer. This function is given by
$$h(x) = W^{(3)} \max\{0,\, W^{(2)} \max\{0,\, W^{(1)} x + b^{(1)}\} + b^{(2)}\} + b^{(3)}$$
where the max is element-wise, with weights:
$$W^{(1)} = \begin{bmatrix} 1.5 \\ 0.5 \end{bmatrix}, \quad b^{(1)} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \quad W^{(2)} = \begin{bmatrix} 1 & 2 \\ 1 & 2 \end{bmatrix}, \quad b^{(2)} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \quad W^{(3)} = \begin{bmatrix} 1 & 1 \end{bmatrix}, \quad b^{(3)} = -1$$
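To make the forward computation concrete, here is a minimal NumPy sketch of $h$ under the weights above (the code and its variable names are illustrative, not part of the original question):

    import numpy as np

    # Network parameters as stated in the problem.
    W1 = np.array([[1.5], [0.5]])   # shape (2, 1): R -> R^2
    b1 = np.array([0.0, 1.0])
    W2 = np.array([[1.0, 2.0],
                   [1.0, 2.0]])     # shape (2, 2)
    b2 = np.array([0.0, 1.0])
    W3 = np.array([[1.0, 1.0]])     # shape (1, 2): R^2 -> R
    b3 = -1.0

    def h(x):
        # h(x) = W3 max{0, W2 max{0, W1 x + b1} + b2} + b3, max element-wise
        a1 = np.maximum(0.0, W1[:, 0] * x + b1)   # first hidden layer
        a2 = np.maximum(0.0, W2 @ a1 + b2)        # second hidden layer
        return float(W3 @ a2 + b3)

Under these weights, for example, h(2.0) evaluates to 14.0.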
An interesting property of networks with piece-wise linear activations like the ReLU is that, as a whole, they compute piece-wise linear functions. At each of the following points $x = x_0$, determine the value of the new weight $W \in \mathbb{R}$ and bias $b \in \mathbb{R}$ such that
$$\left.\frac{dh(x)}{dx}\right|_{x=x_0} = W \qquad \text{and} \qquad W x_0 + b = h(x_0).$$
$x_0 = 2$
$x_0 = -1$
$x_0 = 1$
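In a neighborhood of any $x_0$ where no pre-activation is exactly zero, each ReLU is either the identity (active) or the zero map (inactive), so $h$ is affine there: the slope is the product of the weight matrices with the rows of inactive units zeroed out, and the bias is fixed by $b = h(x_0) - W x_0$. A sketch continuing the code above (the helper name local_affine is an illustrative choice, not from the problem):

    def local_affine(x0):
        # Pre-activations at x0 determine which ReLU units are active.
        z1 = W1[:, 0] * x0 + b1
        z2 = W2 @ np.maximum(0.0, z1) + b2
        m1 = (z1 > 0).astype(float)   # layer-1 active-unit mask
        m2 = (z2 > 0).astype(float)   # layer-2 active-unit mask
        # Chain rule: dh/dx = W3 diag(m2) W2 diag(m1) W1
        W = float(W3 @ (m2[:, None] * W2) @ (m1[:, None] * W1))
        b = h(x0) - W * x0            # enforce W*x0 + b = h(x0)
        return W, b

    for x0 in (2.0, -1.0, 1.0):
        W, b = local_affine(x0)
        print(f"x0 = {x0:+}: W = {W}, b = {b}")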
Step by Step Solution
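Assuming the weights exactly as given in the problem statement, the three points can be worked one per step: the slope at $x_0$ is the product of the weight matrices with the rows of inactive ReLU units zeroed out, and the bias follows from $b = h(x_0) - W x_0$.

Step 1 ($x_0 = 2$): The layer-1 pre-activations are $W^{(1)} \cdot 2 + b^{(1)} = (3, 2)^\top$, both positive, so $a^{(1)} = (3, 2)^\top$. Then $W^{(2)} a^{(1)} + b^{(2)} = (7, 8)^\top$, again both positive, so $h(2) = 7 + 8 - 1 = 14$. With every unit active,
$$W = W^{(3)} W^{(2)} W^{(1)} = \begin{bmatrix} 1 & 1 \end{bmatrix} \begin{bmatrix} 1 & 2 \\ 1 & 2 \end{bmatrix} \begin{bmatrix} 1.5 \\ 0.5 \end{bmatrix} = 5, \qquad b = 14 - 5 \cdot 2 = 4.$$

Step 2 ($x_0 = -1$): Now $W^{(1)} \cdot (-1) + b^{(1)} = (-1.5, 0.5)^\top$, so the first hidden unit is inactive and $a^{(1)} = (0, 0.5)^\top$. Then $W^{(2)} a^{(1)} + b^{(2)} = (1, 2)^\top$, both positive, so $h(-1) = 1 + 2 - 1 = 2$. Zeroing the inactive unit,
$$W = W^{(3)} W^{(2)} \operatorname{diag}(0, 1)\, W^{(1)} = 2, \qquad b = 2 - 2 \cdot (-1) = 4.$$

Step 3 ($x_0 = 1$): The pre-activations $(1.5, 1.5)^\top$ and $(4.5, 5.5)^\top$ are all positive, so the active set matches Step 1: $W = 5$, $h(1) = 4.5 + 5.5 - 1 = 9$, and $b = 9 - 5 \cdot 1 = 4$. Note that $x_0 = 1$ and $x_0 = 2$ lie in the same linear region, which is why they share $W = 5$, $b = 4$.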
