Question: Could you solve the problem with the changed conditions? The input vector has changed. Please solve it quickly and accurately; it's urgent. This is the changed version of Problem 3 [Backpropagation].

3. [Backpropagation] (15 pts) Consider the following two-layer neural network (Input Layer → Hidden Layer → Output Layer). You will use the ReLU function as the activation function at the hidden layer, and the sigmoid activation function at the output layer. Thus, the overall neural network is

\hat{y} = \sigma\!\left(W^{(2)}\,\mathrm{ReLU}(W^{(1)} x)\right), \qquad W^{(1)} = \begin{bmatrix} 0.1 & 0.2 \\ -0.4 & 0.3 \end{bmatrix}, \quad W^{(2)} = \begin{bmatrix} 0.1 & 0.2 \end{bmatrix}.

In the original problem the input vector was x = [5 \ \ 4]^T; in the changed problem it is x = [10 \ \ 8]^T. All weights are displayed on the figure (the superscript denotes the layer). Suppose that the loss function is L(\text{output}) = \text{output}. Compute the gradient of this loss with respect to each of the weights. Be sure to show the details of your computational work.

As a reminder, the ReLU function and its derivative are

\mathrm{ReLU}(z) = \begin{cases} z & \text{if } z \ge 0 \\ 0 & \text{otherwise} \end{cases}, \qquad \nabla\mathrm{ReLU}(z) = \begin{cases} 1 & \text{if } z \ge 0 \\ 0 & \text{otherwise} \end{cases},

and the sigmoid function and its derivative are

\sigma(z) = \frac{1}{1 + \exp(-z)}, \qquad \frac{\partial \sigma(z)}{\partial z} = \sigma(z)\,\big(1 - \sigma(z)\big).

[Figure: network diagram (Input Layer → ReLU Hidden Layer → Sigmoid Output) with the weight values labelled on the edges.]

Forward propagation (with the changed input x = [10 \ \ 8]^T):

x_2 = W^{(1)} x = [2.6 \ \ {-1.6}]^T

x_3 = W^{(2)}\,\mathrm{ReLU}(x_2) = W^{(2)}\,[2.6 \ \ 0]^T = 0.26

\hat{y} = \frac{1}{1 + \exp(-x_3)} = 0.5646

Backpropagation. First, the individual gradients:

\frac{\partial \hat{y}}{\partial x_3} = \sigma(x_3)\,\big(1 - \sigma(x_3)\big) = 0.5646\,(1 - 0.5646) = 0.2458

\frac{\partial x_3}{\partial W^{(2)}} = \mathrm{ReLU}(x_2)^T = [2.6 \ \ 0]

\frac{\partial x_3}{\partial x_2} = W^{(2)T} \odot \nabla\mathrm{ReLU}(x_2) = [0.1 \ \ 0]^T
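Assembling these pieces with the chain rule (and using ∂L/∂ŷ = 1, since L(output) = output) gives ∂L/∂W^{(2)} = 0.2458 · [2.6 \ \ 0] ≈ [0.6391 \ \ 0] and ∂L/∂W^{(1)} = (0.2458 · [0.1 \ \ 0]^T)\,x^T ≈ [[0.2458 \ \ 0.1966], [0 \ \ 0]]. Below is a minimal NumPy sketch that reproduces the forward values and these gradients. Note that the weight matrices W1 and W2 in the code (and the helper names relu, relu_grad, sigmoid) are assumptions reconstructed from the forward-pass numbers above (2.6, −1.6, 0.26) rather than read directly from the original figure.

```python
import numpy as np

# Weights reconstructed from the forward-pass values in the worked solution
# (an assumption -- the original figure is not reproduced here).
W1 = np.array([[0.1, 0.2],
               [-0.4, 0.3]])      # hidden-layer weights W^(1)
W2 = np.array([[0.1, 0.2]])       # output-layer weights W^(2)
x = np.array([[10.0],
              [8.0]])             # changed input vector x = [10 8]^T

def relu(z):
    return np.maximum(z, 0.0)

def relu_grad(z):
    # Problem's convention: derivative is 1 for z >= 0, else 0.
    return (z >= 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass
x2 = W1 @ x                       # [[2.6], [-1.6]]
h = relu(x2)                      # [[2.6], [0.0]]
x3 = W2 @ h                       # [[0.26]]
y_hat = sigmoid(x3)               # [[0.5646]]

# Backward pass; the loss is L = y_hat, so dL/dy_hat = 1.
dL_dx3 = 1.0 * sigmoid(x3) * (1.0 - sigmoid(x3))   # [[0.2458]]
dL_dW2 = dL_dx3 @ h.T                              # [[0.6391, 0.0]]
dL_dx2 = (W2.T * relu_grad(x2)) * dL_dx3           # [[0.0246], [0.0]]
dL_dW1 = dL_dx2 @ x.T                              # [[0.2458, 0.1966], [0.0, 0.0]]

print("y_hat  =", y_hat.item())
print("dL/dW2 =", dL_dW2)
print("dL/dW1 =", dL_dW1)
```

Changing x to [[5.0], [4.0]] in the same sketch reproduces the original (unchanged) version of the problem, which differs only in the input vector.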
