Question: Consider the following network structure. You may assume the initial weights, and take all biases to be zero for easier computation. Given that $\langle x_1, x_2, y_1, y_2 \rangle = \langle 1, 1, 0, 1 \rangle$, where $y$ is the target, assume $\beta = 0.9$ and $\eta = 0.01$.
Network connections (a fully connected 2-2-2 feed-forward network):
x1 ====> h1, x1 ====> h2
x2 ====> h1, x2 ====> h2
h1 ====> y1, h1 ====> y2
h2 ====> y1, h2 ====> y2
(a) Compute the forward propagation and generate the outputs. Use ReLU for the hidden layer and the sigmoid activation function for the output layer.
(b) Compute the softmax loss function for both outputs.
(c) Let the assumed initial weights be the weights at time (t-1). Compute the weights v21, w12, and w22 at time t using SGD.
(d) Let the weights at time t be the ones computed in part (c). Compute the weights v21, w12, and w22 at time (t+1) when momentum is used.
A numerical sketch of parts (a)-(d) under assumed initial weights follows this list.
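Since the problem leaves the initial weights open, the Python sketch below works through parts (a)-(d) numerically under one possible choice of initial weights: w (input-to-hidden, with w_ij the weight from x_i to h_j) and v (hidden-to-output, with v_jk the weight from h_j to y_k). The specific weight values, the reading of the "softmax loss" as softmax over the two outputs followed by cross-entropy with the one-hot target, and the use of numerical gradients in place of hand-derived backpropagation are all assumptions made for illustration, not part of the original problem.

```python
import numpy as np

# Assumed setup: the problem says "you can assume the initial weights",
# so the specific numbers in w and v below are illustrative choices only.
x = np.array([1.0, 1.0])           # inputs  <x1, x2> = <1, 1>
t = np.array([0.0, 1.0])           # targets <y1, y2> = <0, 1>
eta, beta = 0.01, 0.9              # learning rate and momentum coefficient

w = np.array([[0.1, 0.2],          # w[i, j]: weight from x_(i+1) to h_(j+1)  (assumed values)
              [0.3, 0.4]])
v = np.array([[0.5, 0.6],          # v[j, k]: weight from h_(j+1) to y_(k+1)  (assumed values)
              [0.7, 0.8]])

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(w, v):
    """Part (a): forward pass, ReLU hidden layer, sigmoid output layer, zero biases."""
    h = relu(x @ w)                # hidden activations h1, h2
    o = sigmoid(h @ v)             # outputs o1, o2
    return h, o

def loss(w, v):
    """Part (b): 'softmax loss' read here as softmax over the two outputs
    followed by cross-entropy with the one-hot target (an interpretation)."""
    _, o = forward(w, v)
    p = np.exp(o) / np.exp(o).sum()
    return -np.sum(t * np.log(p))

def num_grad(f, params, eps=1e-6):
    """Central-difference gradient of f() w.r.t. the entries of params
    (used instead of hand-derived backpropagation to keep the sketch short)."""
    g = np.zeros_like(params)
    for idx in np.ndindex(params.shape):
        old = params[idx]
        params[idx] = old + eps; up = f()
        params[idx] = old - eps; dn = f()
        params[idx] = old
        g[idx] = (up - dn) / (2 * eps)
    return g

# Parts (a) and (b)
h, o = forward(w, v)
print("hidden:", h, "outputs:", o, "loss:", loss(w, v))

# Part (c): one plain SGD step, weights(t) = weights(t-1) - eta * gradient
gw, gv = num_grad(lambda: loss(w, v), w), num_grad(lambda: loss(w, v), v)
w_t, v_t = w - eta * gw, v - eta * gv
print("w12(t) =", w_t[0, 1], " w22(t) =", w_t[1, 1], " v21(t) =", v_t[1, 0])

# Part (d): one momentum step from time t,
#   update(t) = beta * update(t-1) - eta * gradient(t);  weights(t+1) = weights(t) + update(t)
gw_t, gv_t = num_grad(lambda: loss(w_t, v_t), w_t), num_grad(lambda: loss(w_t, v_t), v_t)
w_t1 = w_t + beta * (w_t - w) - eta * gw_t   # (w_t - w) is the previous SGD update
v_t1 = v_t + beta * (v_t - v) - eta * gv_t
print("w12(t+1) =", w_t1[0, 1], " w22(t+1) =", w_t1[1, 1], " v21(t+1) =", v_t1[1, 0])
```

With a different choice of initial weights the numbers change, but the sequence of steps (forward pass, loss, one SGD update, then one momentum update seeded by the previous update) is the same one the four parts ask for.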
