Please help me with tasks 1 and 2!
Problem 2: Backpropagation
Consider a neural network with one input layer, one hidden layer, and one output layer. The input layer has two neurons, the hidden layer has two neurons (plus a bias neuron), and the output layer has one neuron. The activation function is the sigmoid function, $\sigma(x)=\frac{1}{1+e^{-x}}$.
Network Structure
Inputs: $x_1, x_2$.
Weights from input to hidden layer: $w_{11}, w_{12}, w_{21}, w_{22}$.
Weights from hidden to output layer: $w_{31}, w_{32}$.
Bias weights: $b_1, b_2$ for the hidden layer, $b_3$ for the output layer.
Desired output: $d$.
Loss function: $L = \tfrac{1}{2}(y-d)^2$, where $y$ is the network output.
If you are more comfortable with matrix-vector multiplication: the hidden layer maps $\mathbf{x}=(x_1,x_2)^T$ to $\mathbf{h}=\sigma(W\mathbf{x}+\mathbf{b})$, where $W$ is the $2\times 2$ matrix whose entries are the input-to-hidden weights and $\mathbf{b}=(b_1,b_2)^T$. The output layer then produces $y=\sigma\big((w_{31},w_{32})\,\mathbf{h}+b_3\big)$.
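For concreteness, the same forward pass written out neuron by neuron is given below. This is a minimal sketch assuming the sigmoid is applied at both the hidden neurons and the output neuron, since the problem names a single activation function for the whole network:

\begin{align*}
h_1 &= \sigma(w_{11}x_1 + w_{12}x_2 + b_1), \\
h_2 &= \sigma(w_{21}x_1 + w_{22}x_2 + b_2), \\
y   &= \sigma(w_{31}h_1 + w_{32}h_2 + b_3).
\end{align*}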
Tasks
1. Calculate the output of the network, $y$, as a function of the inputs, weights, and biases.
2. Derive expressions for the gradients of the loss function with respect to each weight, $\frac{\partial L}{\partial w_{ij}}$, using backpropagation.
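The following is a hedged sketch of task 2 under the forward pass written above (output neuron also sigmoidal, which is an assumption). Applying the chain rule through the pre-activations $z_3 = w_{31}h_1 + w_{32}h_2 + b_3$ and $z_i = w_{i1}x_1 + w_{i2}x_2 + b_i$ for $i=1,2$, and using $\sigma'(z)=\sigma(z)\bigl(1-\sigma(z)\bigr)$, gives

\begin{align*}
\delta_3 &= \frac{\partial L}{\partial z_3} = (y - d)\, y (1 - y), \\
\frac{\partial L}{\partial w_{31}} &= \delta_3 h_1, \qquad
\frac{\partial L}{\partial w_{32}} = \delta_3 h_2, \qquad
\frac{\partial L}{\partial b_3} = \delta_3, \\
\delta_1 &= \delta_3\, w_{31}\, h_1 (1 - h_1), \qquad
\delta_2 = \delta_3\, w_{32}\, h_2 (1 - h_2), \\
\frac{\partial L}{\partial w_{11}} &= \delta_1 x_1, \qquad
\frac{\partial L}{\partial w_{12}} = \delta_1 x_2, \qquad
\frac{\partial L}{\partial b_1} = \delta_1, \\
\frac{\partial L}{\partial w_{21}} &= \delta_2 x_1, \qquad
\frac{\partial L}{\partial w_{22}} = \delta_2 x_2, \qquad
\frac{\partial L}{\partial b_2} = \delta_2.
\end{align*}

The NumPy sketch below implements these formulas and checks one analytic gradient against a central finite difference. The variable names (`w_out` for $(w_{31}, w_{32})$, etc.) and the numeric values are illustrative choices, not taken from the problem.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W, b, w_out, b3):
    """Forward pass: 2 inputs -> 2 sigmoid hidden units -> 1 sigmoid output."""
    h = sigmoid(W @ x + b)        # hidden activations h1, h2
    y = sigmoid(w_out @ h + b3)   # network output
    return h, y

def gradients(x, d, W, b, w_out, b3):
    """Backpropagation for L = 0.5 * (y - d)**2."""
    h, y = forward(x, W, b, w_out, b3)
    delta_out = (y - d) * y * (1.0 - y)               # dL/dz3
    grad_w_out = delta_out * h                        # dL/dw31, dL/dw32
    grad_b3 = delta_out                               # dL/db3
    delta_hidden = delta_out * w_out * h * (1.0 - h)  # dL/dz1, dL/dz2
    grad_W = np.outer(delta_hidden, x)                # dL/dw11 ... dL/dw22
    grad_b = delta_hidden                             # dL/db1, dL/db2
    return grad_W, grad_b, grad_w_out, grad_b3

if __name__ == "__main__":
    # Illustrative values only (not from the problem statement).
    rng = np.random.default_rng(0)
    x = np.array([0.5, -1.0])
    d = 0.8
    W = rng.normal(size=(2, 2))
    b = rng.normal(size=2)
    w_out = rng.normal(size=2)
    b3 = rng.normal()

    # Central finite-difference check of dL/dw11 against the analytic gradient.
    loss = lambda W_: 0.5 * (forward(x, W_, b, w_out, b3)[1] - d) ** 2
    eps = 1e-6
    W_plus = W.copy()
    W_plus[0, 0] += eps
    W_minus = W.copy()
    W_minus[0, 0] -= eps
    numeric = (loss(W_plus) - loss(W_minus)) / (2 * eps)
    analytic = gradients(x, d, W, b, w_out, b3)[0][0, 0]
    print(f"dL/dw11 analytic={analytic:.8f} numeric={numeric:.8f}")
```

If the output neuron is instead taken to be linear, as the matrix-vector hint can also be read, drop the factor $y(1-y)$ from $\delta_3$ (and the outer `sigmoid` call in `forward`); the remaining expressions are unchanged.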