Question: Consider the neural network shown below. [Figure: network diagram showing the input, hidden units Z, and weight matrix W; the image did not survive extraction.] The weight matrix W is [1, 1, -1, 0.5, 1, 2]. Assume that the hidden layer uses the ReLU activation function and the output layer uses the sigmoid activation function. Assume squared-error loss, i.e., Loss = (y − ŷ)². The input is x = 4 and the target output is y = 0. Using this information, answer the questions below. (Show all work, and round all answers to 2 decimal places, or points will be taken off!)

(a) [2 points] Use forward propagation to compute the predicted output.
(b) [1 point] What is the loss (error) value?
(c) [4 points] Using backpropagation, compute the gradient of the weight vector, that is, the partial derivative of the error with respect to each of the weights.
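Because the figure is lost, the exact wiring of the six listed weights is uncertain. As a sketch of the method only, the code below assumes a hypothetical 1 → 2 → 1 network: input-to-hidden weights w = [1, 1], hidden biases b = [-1, 0.5], and hidden-to-output weights v = [1, 2]. That assignment is an assumption, not the figure's; what carries over regardless of topology is the chain-rule mechanics of forward propagation, the squared-error loss, and backpropagation through ReLU and sigmoid.

```python
import math

def relu(z):      return max(0.0, z)
def relu_grad(z): return 1.0 if z > 0 else 0.0
def sigmoid(z):   return 1.0 / (1.0 + math.exp(-z))

def forward_backward(x, y, w, b, v):
    """1 input -> 2 hidden (ReLU) -> 1 output (sigmoid), squared-error loss.

    NOTE: this wiring is a hypothetical reading of the lost figure; only the
    chain-rule method is general.
    """
    z = [w[i] * x + b[i] for i in range(2)]      # hidden pre-activations
    h = [relu(zi) for zi in z]                   # hidden activations
    zo = sum(v[i] * h[i] for i in range(2))      # output pre-activation
    yhat = sigmoid(zo)                           # predicted output (part a)
    loss = (y - yhat) ** 2                       # squared error (part b)

    # Part (c): backpropagate dLoss/dyhat * dyhat/dzo, then through each layer.
    d_out = 2 * (yhat - y) * yhat * (1 - yhat)
    grads = {
        "v": [d_out * h[i] for i in range(2)],
        "w": [d_out * v[i] * relu_grad(z[i]) * x for i in range(2)],
        "b": [d_out * v[i] * relu_grad(z[i]) for i in range(2)],
    }
    return yhat, loss, grads

# Assumed (hypothetical) assignment of the six listed weights:
yhat, loss, grads = forward_backward(x=4.0, y=0.0,
                                     w=[1.0, 1.0], b=[-1.0, 0.5], v=[1.0, 2.0])
print(round(yhat, 2), round(loss, 2))  # → 1.0 1.0
```

Under this wiring the output pre-activation is large (zo = 12), so the sigmoid saturates near 1, the loss is close to 1, and the sigmoid derivative ŷ(1 − ŷ) makes all gradients tiny, which is a useful sanity check when doing the arithmetic by hand.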

Step-by-Step Solution

Step 1

Part (a), forward propagation: compute the weighted sum of the inputs at the hidden layer, Z1 = 4(1) = 4, ...
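A standard way to verify hand-computed backpropagation gradients like those asked for in part (c) is a central-difference check: nudge one weight by ±ε and compare Δloss / (2ε) against the analytic derivative. The sketch below uses one hypothetical assignment of the six listed weights to a 1 → 2 → 1 network (two input weights, two hidden biases, two output weights); the true wiring comes from the lost figure, but the checking technique is general.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss_fn(params, x=4.0, y=0.0):
    # Hypothetical wiring (the real one comes from the figure):
    # params = [w1, w2, b1, b2, v1, v2] for a 1 -> 2 -> 1 network.
    w1, w2, b1, b2, v1, v2 = params
    h1 = max(0.0, w1 * x + b1)              # ReLU hidden unit 1
    h2 = max(0.0, w2 * x + b2)              # ReLU hidden unit 2
    yhat = sigmoid(v1 * h1 + v2 * h2)       # sigmoid output
    return (y - yhat) ** 2                  # squared-error loss

def numeric_grad(params, i, eps=1e-4):
    """Central-difference estimate of dLoss/dparams[i]."""
    up, down = list(params), list(params)
    up[i] += eps
    down[i] -= eps
    return (loss_fn(up) - loss_fn(down)) / (2 * eps)

params = [1.0, 1.0, -1.0, 0.5, 1.0, 2.0]
approx = [numeric_grad(params, i) for i in range(6)]
```

If a hand-derived partial derivative disagrees with the finite-difference estimate by more than a few decimal places, the backpropagation step (usually a missed ReLU mask or sign) is worth rechecking.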


