Question: A deep neural network with given weights and biases is shown in the figure below. The activation function is the ReLU function (see the inset). When (x1, x2) = (0, 1) and (x1, x2) = (1, -1), what will the outputs (z1, z2) be, respectively? (Note: to show your work, also write down the hidden-layer outputs (y1, y2) and (y3, y4) in your solution.)

[Figure: network diagram with inputs (x1, x2), hidden units (y1, y2) and (y3, y4), outputs (z1, z2), and the edge weights and biases; the diagram itself did not survive text extraction. Inset: the ReLU activation function.]
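Because the figure's weight and bias values are unreadable in this extraction, below is a minimal sketch of how the requested forward pass would be computed, assuming two hidden layers of two ReLU units each ((y1, y2) then (y3, y4)) followed by a linear output layer. All parameter values (W1, b1, W2, b2, W3, b3) are placeholders and must be replaced with the values shown in the figure; whether ReLU is also applied at the output depends on the diagram.

```python
import numpy as np

def relu(v):
    """Element-wise ReLU: max(0, v)."""
    return np.maximum(0.0, v)

def forward(x, W1, b1, W2, b2, W3, b3):
    """Forward pass: two ReLU hidden layers, then a linear output layer."""
    y12 = relu(W1 @ x + b1)    # first hidden layer  -> (y1, y2)
    y34 = relu(W2 @ y12 + b2)  # second hidden layer -> (y3, y4)
    z = W3 @ y34 + b3          # output layer        -> (z1, z2)
    return y12, y34, z

# PLACEHOLDER parameters -- substitute the actual values from the figure.
W1 = np.array([[1.0, 0.0], [2.0, -1.0]]);  b1 = np.array([0.0, -1.0])
W2 = np.array([[-2.0, -1.0], [0.0, 1.0]]); b2 = np.array([0.0, 0.0])
W3 = np.array([[3.0, -1.0], [-1.0, 4.0]]); b3 = np.array([-2.0, 2.0])

# Evaluate both inputs asked for in the question.
for x in (np.array([0.0, 1.0]), np.array([1.0, -1.0])):
    y12, y34, z = forward(x, W1, b1, W2, b2, W3, b3)
    print(f"x={x}: (y1,y2)={y12}, (y3,y4)={y34}, (z1,z2)={z}")
```

Printing the intermediate vectors (y1, y2) and (y3, y4) alongside (z1, z2) for each input mirrors the "show your work" requirement in the question.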
