Question: Consider a fully connected neural network that has an input layer, one hidden layer, and an output layer. The input layer has inputs $x_1$ and $x_2$, while the output layer has one neuron that uses the Softmax activation function and produces two outputs, $y_1^{(2)}$ and $y_2^{(2)}$. The hidden layer has 3 neurons, all of which use the ReLU activation function. The notation used in the network is defined below:
$$z_1 = w_{11}x_1 + w_{21}x_2, \quad z_2 = w_{12}x_1 + w_{22}x_2, \quad z_3 = w_{13}x_1 + w_{23}x_2$$
$$y_1^{(1)} = \mathrm{ReLU}(z_1), \quad y_2^{(1)} = \mathrm{ReLU}(z_2), \quad y_3^{(1)} = \mathrm{ReLU}(z_3)$$
$$s_1 = y_1^{(1)} + y_2^{(1)}, \quad s_2 = y_2^{(1)} + y_3^{(1)}$$
$$y_1^{(2)} = \frac{\exp(s_1)}{\exp(s_1) + \exp(s_2)}, \quad y_2^{(2)} = \frac{\exp(s_2)}{\exp(s_1) + \exp(s_2)}$$
Suppose that $x_1 = 3$, $x_2 = 5$, $w_{11} = -10$, $w_{21} = 7$, $w_{12} = 2$, $w_{22} = 5$, $w_{13} = 4$, $w_{23} = -4$. Compute the following gradients: $\frac{\partial z_2}{\partial x_1}$, $\frac{\partial z_1}{\partial w_{21}}$, and $\frac{\partial s_1}{\partial y_1^{(1)}}$.
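
Since each $z_j$ is linear in the inputs and weights, and $s_1$ is a plain sum, the three gradients can be read off directly from the definitions: $\frac{\partial z_2}{\partial x_1} = w_{12} = 2$, $\frac{\partial z_1}{\partial w_{21}} = x_2 = 5$, and $\frac{\partial s_1}{\partial y_1^{(1)}} = 1$. The snippet below is a minimal sketch (assuming PyTorch is available; the tensor layout and variable names are illustrative, not part of the question) that rebuilds the forward pass from the definitions above and uses autograd to confirm the hand-derived values.

```python
import torch

# Given inputs and weights from the question
x = torch.tensor([3.0, 5.0], requires_grad=True)             # [x1, x2]
W = torch.tensor([[-10.0, 2.0,  4.0],                        # [w11, w12, w13]
                  [  7.0, 5.0, -4.0]], requires_grad=True)   # [w21, w22, w23]

# Forward pass, following the question's definitions
z = x @ W                         # [z1, z2, z3] = [5, 31, -8]
y_hidden = torch.relu(z)          # [y1(1), y2(1), y3(1)] = [5, 31, 0]
s1 = y_hidden[0] + y_hidden[1]    # 36
s2 = y_hidden[1] + y_hidden[2]    # 31
y_out = torch.softmax(torch.stack([s1, s2]), dim=0)   # [y1(2), y2(2)]

# dz2/dx1: z2 = w12*x1 + w22*x2 is linear in x1, so the derivative is w12 = 2
(g_x,) = torch.autograd.grad(z[1], x, retain_graph=True)
print("dz2/dx1    =", g_x[0].item())      # 2.0

# dz1/dw21: z1 = w11*x1 + w21*x2 is linear in w21, so the derivative is x2 = 5
(g_W,) = torch.autograd.grad(z[0], W, retain_graph=True)
print("dz1/dw21   =", g_W[1, 0].item())   # 5.0

# ds1/dy1(1): s1 = y1(1) + y2(1), so the derivative is 1
(g_y,) = torch.autograd.grad(s1, y_hidden)
print("ds1/dy1(1) =", g_y[0].item())      # 1.0
```

Because $y_1^{(1)}$ is an intermediate value rather than a leaf tensor, the last gradient is taken with `torch.autograd.grad` against `y_hidden` directly instead of via `.backward()`.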