
Question: Consider a neural network with the following structure: an input layer with two neurons (x1 and x2), one hidden layer with two neurons (h1 and h2), and one output layer with one neuron (ŷ).

The weights between the input and hidden layer are:

w11 = 0.1, w12 = 0.2 (weights connecting x1 and x2 to h1)
w21 = 0.3, w22 = 0.4 (weights connecting x1 and x2 to h2)

The weights between the hidden layer and output layer are:

v1 = 0.5, v2 = 0.6 (weights connecting h1 and h2 to the output neuron)

The activation function for both the hidden layer and the output layer is the hyperbolic tangent function:

f(z) = tanh(z) = (e^z − e^(−z)) / (e^z + e^(−z))

The network is trained using binary cross-entropy as the loss function:

L = −(1/N) Σ [ y log(ŷ) + (1 − y) log(1 − ŷ) ]

where y is the target value.

You are given the following values for the input and target:

x1 = 1, y = 0 (target output)

Your task:
1. Draw the computational graph showing all nodes and edges.
2. Perform a forward pass through the network to compute the output ŷ.
3. Compute the loss using the given target value.
4. Perform backpropagation to compute the gradient of the loss with respect to the weights.
5. Update the weights using gradient descent with a learning rate η = 0.1.
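Under the weight names used above, the forward pass and the chain-rule factors needed for the gradient of the hidden-to-output weight v1 can be sketched as follows. This is a standard derivation outline for a single training sample (N = 1) under this naming, not a full worked answer.

```latex
% Forward pass (2-2-1 network, tanh activations, no biases given)
z_{1} = w_{11} x_1 + w_{12} x_2, \qquad h_1 = \tanh(z_{1})
z_{2} = w_{21} x_1 + w_{22} x_2, \qquad h_2 = \tanh(z_{2})
z_{o} = v_1 h_1 + v_2 h_2,       \qquad \hat{y} = \tanh(z_{o})

% Binary cross-entropy for a single sample
L = -\bigl[\, y \log \hat{y} + (1 - y)\log(1 - \hat{y}) \,\bigr]

% Chain rule for the hidden-to-output weight v_1
\frac{\partial L}{\partial v_1}
  = \frac{\partial L}{\partial \hat{y}} \cdot
    \frac{\partial \hat{y}}{\partial z_o} \cdot
    \frac{\partial z_o}{\partial v_1}
  = \Bigl(-\frac{y}{\hat{y}} + \frac{1 - y}{1 - \hat{y}}\Bigr)
    \bigl(1 - \hat{y}^{2}\bigr)\, h_1

% Gradient-descent update with learning rate \eta = 0.1
v_1 \leftarrow v_1 - \eta \,\frac{\partial L}{\partial v_1}
```

The same pattern extends one layer further for the input-to-hidden weights, picking up an extra factor (1 − h_i²) from the hidden-layer tanh.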

Step by Step Solution

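As a numerical companion to the outline above, here is a minimal Python (NumPy) sketch of the forward pass, the binary cross-entropy loss, backpropagation through all six weights, and one gradient-descent step with η = 0.1. The second input value is not given above, so x2 = 0.5 is a placeholder assumption; substitute the actual value before trusting the printed numbers.

```python
# Minimal sketch: forward pass, BCE loss, backprop, and one gradient-descent
# update for the 2-2-1 tanh network described above (no biases given).
import numpy as np

# Inputs and target
x1, x2 = 1.0, 0.5   # x2 = 0.5 is an assumed placeholder value
y = 0.0             # target output

# Weights (naming follows the reconstruction above)
w11, w12 = 0.1, 0.2  # x1, x2 -> h1
w21, w22 = 0.3, 0.4  # x1, x2 -> h2
v1, v2 = 0.5, 0.6    # h1, h2 -> output
lr = 0.1             # learning rate

# ----- Forward pass -----
z1 = w11 * x1 + w12 * x2
z2 = w21 * x1 + w22 * x2
h1, h2 = np.tanh(z1), np.tanh(z2)

zo = v1 * h1 + v2 * h2
y_hat = np.tanh(zo)          # network output

# ----- Binary cross-entropy loss (single sample) -----
loss = -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

# ----- Backward pass (chain rule) -----
dL_dyhat = -y / y_hat + (1 - y) / (1 - y_hat)   # dL/dŷ
dyhat_dzo = 1 - y_hat ** 2                      # tanh'(zo)
dL_dzo = dL_dyhat * dyhat_dzo

# Hidden-to-output weights
dL_dv1 = dL_dzo * h1
dL_dv2 = dL_dzo * h2

# Input-to-hidden weights
dL_dh1 = dL_dzo * v1
dL_dh2 = dL_dzo * v2
dL_dz1 = dL_dh1 * (1 - h1 ** 2)
dL_dz2 = dL_dh2 * (1 - h2 ** 2)
dL_dw11, dL_dw12 = dL_dz1 * x1, dL_dz1 * x2
dL_dw21, dL_dw22 = dL_dz2 * x1, dL_dz2 * x2

# ----- Gradient-descent update -----
v1 -= lr * dL_dv1
v2 -= lr * dL_dv2
w11 -= lr * dL_dw11
w12 -= lr * dL_dw12
w21 -= lr * dL_dw21
w22 -= lr * dL_dw22

print(f"y_hat = {y_hat:.4f}, loss = {loss:.4f}")
print(f"dL/dv1 = {dL_dv1:.4f}, updated v1 = {v1:.4f}")
```

The analytic gradients can be cross-checked with a finite-difference estimate, e.g. perturbing v1 by 1e-6, rerunning the forward pass, and comparing (L_perturbed − L) / 1e-6 against dL_dv1.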
