Question 2. Consider the following two-layer network with hidden units X1 and X2 and output Y. The weights and bias for X1 are (5, -3) and 2, respectively. The weights and bias for X2 are (-2, 6) and -1, respectively. The weights and bias for Y are (1, 1) and 1, respectively. The activation function is ReLU for X1 and X2 and the identity function for Y. We have two data points: (x11 = 1, x12 = 1, y1 = 5) and (x21 = 1, x22 = -1, y2 = 8).

(a) Calculate the prediction for the two data points using the above network, then calculate the total square loss. Show the details of your calculation.

(b) Now we add dropout with rate 0.5 to the first hidden layer. Assume X2 is dropped when making the prediction for the data point (x11 = 1, x12 = 1, y1 = 5), and X1 is dropped when making the prediction for the data point (x21 = 1, x22 = -1, y2 = 8). Calculate the prediction for the two data points and the total square loss. Show the details of your calculation.
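As a sanity check for the hand calculation, the following is a minimal Python sketch of the forward pass defined above. The names forward, relu, and drop are illustrative, not part of the question. It assumes a dropped unit is simply set to zero with no 1/(1-p) rescaling of the surviving unit; if the intended convention is inverted dropout, the part (b) predictions will differ.

def relu(z):
    return max(0.0, z)

def forward(x1, x2, drop=None):
    """Forward pass; drop is None, "X1", or "X2" (the hidden unit to zero out)."""
    h1 = relu(5 * x1 - 3 * x2 + 2)    # hidden unit X1: weights (5, -3), bias 2
    h2 = relu(-2 * x1 + 6 * x2 - 1)   # hidden unit X2: weights (-2, 6), bias -1
    if drop == "X1":
        h1 = 0.0
    elif drop == "X2":
        h2 = 0.0
    return 1 * h1 + 1 * h2 + 1        # output Y: weights (1, 1), bias 1, identity activation

data = [((1, 1), 5), ((1, -1), 8)]

# (a) No dropout: predictions and total square loss.
preds_a = [forward(x1, x2) for (x1, x2), _ in data]
loss_a = sum((p - y) ** 2 for p, (_, y) in zip(preds_a, data))
print("(a)", preds_a, loss_a)

# (b) Stated dropout pattern: X2 dropped for the first point, X1 for the second.
preds_b = [forward(1, 1, drop="X2"), forward(1, -1, drop="X1")]
loss_b = sum((p - y) ** 2 for p, (_, y) in zip(preds_b, data))
print("(b)", preds_b, loss_b)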
