Question:

Let a simple two layer neural network be defined as:

z = W(1)x

h = ReLU(z)

y = W(2)h

Find dJ/dW(1).

Find dJ/dW(1) using backpropagation, taking care that every matrix in the derivation is transposed (or not) correctly. Note that the input x is a 3x1 vector, the output y is a 3x1 vector, and each of the two weight matrices W(1) and W(2) is 3x3.
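One possible derivation sketch, following the standard backpropagation chain rule: the question does not specify the loss J, so the upstream gradient dJ/dy is written generically as a 3x1 vector delta, and only the shapes given in the question (3x1 inputs/outputs, 3x3 weights) are assumed.

```latex
\frac{\partial J}{\partial y} = \delta \in \mathbb{R}^{3\times 1}
\qquad
\frac{\partial J}{\partial h} = \left(W^{(2)}\right)^{\top} \delta \in \mathbb{R}^{3\times 1}
\qquad
\frac{\partial J}{\partial z} = \left(W^{(2)}\right)^{\top} \delta \odot \mathbf{1}[z > 0] \in \mathbb{R}^{3\times 1}
\qquad
\frac{\partial J}{\partial W^{(1)}} = \left( \left(W^{(2)}\right)^{\top} \delta \odot \mathbf{1}[z > 0] \right) x^{\top} \in \mathbb{R}^{3\times 3}
```

Here the symbol ⊙ denotes element-wise multiplication and 1[z > 0] is the element-wise ReLU derivative. The transposes follow from shape matching: (W(2))^T delta is 3x1, the ReLU gate keeps it 3x1, and the outer product with x^T produces the 3x3 gradient that matches the shape of W(1).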

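As a sanity check, here is a minimal NumPy sketch that computes dJ/dW(1) with the chain rule above and compares it against a finite-difference estimate. The loss J = 0.5 * ||y - t||^2 and the target t are assumptions made purely for illustration, since the original question leaves J unspecified.

```python
import numpy as np

# Assumed setup for illustration: the question does not define J,
# so we use J = 0.5 * ||y - t||^2 against an arbitrary target t.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 1))
t = rng.normal(size=(3, 1))
W1 = rng.normal(size=(3, 3))
W2 = rng.normal(size=(3, 3))

def forward(W1, W2, x, t):
    z = W1 @ x                 # (3, 1)
    h = np.maximum(z, 0.0)     # ReLU
    y = W2 @ h                 # (3, 1)
    J = 0.5 * np.sum((y - t) ** 2)
    return J, z, h, y

# Analytic gradient via backpropagation
J, z, h, y = forward(W1, W2, x, t)
dJ_dy = y - t                              # (3, 1), depends on the assumed loss
dJ_dh = W2.T @ dJ_dy                       # (3, 1)
dJ_dz = dJ_dh * (z > 0).astype(float)      # (3, 1), ReLU gate
dJ_dW1 = dJ_dz @ x.T                       # (3, 3)

# Finite-difference check of dJ/dW1, entry by entry
eps = 1e-6
num = np.zeros_like(W1)
for i in range(3):
    for j in range(3):
        Wp = W1.copy(); Wp[i, j] += eps
        Wm = W1.copy(); Wm[i, j] -= eps
        num[i, j] = (forward(Wp, W2, x, t)[0] - forward(Wm, W2, x, t)[0]) / (2 * eps)

print(np.max(np.abs(num - dJ_dW1)))  # should be on the order of 1e-9
```

If the analytic expression is correct, the printed maximum discrepancy should be at the level of finite-difference noise rather than order one.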