Question: A simple two-layer neural network with a hidden layer of three neurons, an output layer of one neuron, and two input neurons.

In both the hidden layer and the output layer, the sigmoid function is used as the activation function. X = [x1, x2] is the vector the network receives as input. A parameter vector W(1) = [w1, w2, w3, w4, w5, w6] connects the input layer to the hidden layer, and a parameter vector W(2) = [w7, w8, w9] connects the hidden layer to the output layer (as shown in the figure below).

(a) Write the mathematical expression for the output of the hidden layer, including the bias term. (10%)
(b) Write the mathematical expression for the output of the output layer, including the bias term. (10%)
(c) The true label for this input is y = 0. The cost function used in this network is the mean squared error (MSE). Write the mathematical expression for the cost for this input. (10%)
(d) Use backpropagation to derive the gradient of the cost with respect to the weights connecting the input layer to the hidden layer, [w1, w2, w3, w4, w5, w6], and the weights connecting the hidden layer to the output layer, [w7, w8, w9]. (10%)
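Since the figure is not reproduced here, the following is a minimal worked sketch of parts (a) through (d). It assumes that hidden neuron j (j = 1, 2, 3) receives x1 through w_{2j-1} and x2 through w_{2j}; the bias symbols b_j^{(1)} and b^{(2)} and the 1/2 factor in the MSE are conventions introduced for illustration, not given in the question.

\[
h_j = \sigma\left(w_{2j-1}\,x_1 + w_{2j}\,x_2 + b_j^{(1)}\right), \quad j = 1,2,3, \qquad \sigma(z) = \frac{1}{1 + e^{-z}}
\]
\[
\hat{y} = \sigma\left(w_7 h_1 + w_8 h_2 + w_9 h_3 + b^{(2)}\right)
\]
\[
J = \frac{1}{2}\left(\hat{y} - y\right)^2 = \frac{1}{2}\hat{y}^2 \quad \text{(since } y = 0\text{)}
\]

For part (d), defining the output-layer error \(\delta^{(2)} = (\hat{y} - y)\,\hat{y}(1 - \hat{y})\) and the hidden-layer errors \(\delta_j^{(1)} = \delta^{(2)}\, w_{6+j}\, h_j(1 - h_j)\), the gradients under this assumed layout are

\[
\frac{\partial J}{\partial w_{6+j}} = \delta^{(2)} h_j, \qquad
\frac{\partial J}{\partial w_{2j-1}} = \delta_j^{(1)} x_1, \qquad
\frac{\partial J}{\partial w_{2j}} = \delta_j^{(1)} x_2, \qquad j = 1,2,3.
\]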

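As a numerical sanity check of the backpropagation expressions above, here is a minimal NumPy sketch under the same assumed weight layout: W1 is stored as a 3x2 matrix whose row j is [w_{2j-1}, w_{2j}], and W2 = [w7, w8, w9]. The bias values, the random weight values, the example input x, and the 1/2 factor in the MSE are illustrative assumptions. The script compares the analytic gradient for w1 against a central finite difference.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    """Forward pass through the assumed 2-3-1 sigmoid network.
    W1: (3, 2) hidden weights, b1: (3,) hidden biases (assumed layout)
    W2: (3,)   output weights, b2: scalar output bias
    """
    z1 = W1 @ x + b1           # hidden pre-activations
    h = sigmoid(z1)            # hidden activations (part a)
    z2 = W2 @ h + b2           # output pre-activation
    y_hat = sigmoid(z2)        # network output (part b)
    return h, y_hat

def gradients(x, y, W1, b1, W2, b2):
    """Backpropagation for the cost J = 0.5 * (y_hat - y)^2 (part d)."""
    h, y_hat = forward(x, W1, b1, W2, b2)
    # Output-layer error: dJ/dz2 = (y_hat - y) * sigma'(z2)
    delta2 = (y_hat - y) * y_hat * (1.0 - y_hat)
    dW2 = delta2 * h                       # gradients for w7, w8, w9
    db2 = delta2
    # Hidden-layer errors: dJ/dz1_j = delta2 * w_{6+j} * sigma'(z1_j)
    delta1 = delta2 * W2 * h * (1.0 - h)
    dW1 = np.outer(delta1, x)              # gradients for w1..w6
    db1 = delta1
    return dW1, db1, dW2, db2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.array([0.5, -1.0])              # illustrative input
    y = 0.0                                # true label from part (c)
    W1 = rng.normal(size=(3, 2))           # [w1..w6] as a 3x2 matrix (assumed)
    b1 = rng.normal(size=3)
    W2 = rng.normal(size=3)                # [w7, w8, w9]
    b2 = rng.normal()

    dW1, db1, dW2, db2 = gradients(x, y, W1, b1, W2, b2)

    # Central finite-difference check on w1 to verify the derivation.
    eps = 1e-6
    def cost(W1_):
        _, y_hat = forward(x, W1_, b1, W2, b2)
        return 0.5 * (y_hat - y) ** 2
    W1p = W1.copy(); W1p[0, 0] += eps
    W1m = W1.copy(); W1m[0, 0] -= eps
    numeric = (cost(W1p) - cost(W1m)) / (2 * eps)
    print("analytic dJ/dw1:", dW1[0, 0], " numeric:", numeric)
```

The printed analytic and numeric values should agree to several decimal places, which is a quick way to confirm the chain-rule expressions in part (d) before substituting whatever weight-to-edge mapping the original figure actually specifies.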