Question:

a) [8 points] Say that the network is using linear units: that is, the output of a unit is C(W·A) for some fixed constant C. Let the weight values w_i be fixed. Re-design the neural network to compute the same function without using any hidden units. Express the new weights in terms of the old weights and the constant C.

b) [4 points] Is it always possible to express a neural network made up of only linear units without a hidden layer? Justify your answer.

c) [8 points] Another common activation function is a threshold, where the activation is t(W·A), where t(x) is 1 if x > 0 and 0 otherwise. Let the hidden units use sigmoid activation functions (the activation of a unit is then (1 + exp(−W·A))^−1) and let the output unit use a threshold activation function. Find weights which cause this network to compute the XOR of X1 and X2 for binary-valued X1 and X2. Keep in mind that there is no bias term for these units.
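The question's network figure is not shown, so the sketch below assumes the usual setup for a problem like this: two inputs, two hidden units, one output, and no bias terms anywhere. It numerically checks the two technical claims involved: for part (a), that a two-layer network of linear units with constant C collapses to a single layer with weights u_i = C·Σ_j v_j w_ji; and for part (c), that one hand-found candidate weight assignment (an assumption, not the official answer key) computes XOR. All specific weight values are illustrative.

```python
import math

C = 3.0  # arbitrary fixed constant for the linear units

def linear_unit(weights, inputs):
    # Linear activation: output is C * (W . A)
    return C * sum(w * a for w, a in zip(weights, inputs))

# Part (a): a two-layer linear network collapses to one layer.
V = [0.5, -1.0]                # hidden -> output weights (arbitrary)
W = [[1.0, 2.0], [3.0, -4.0]]  # input -> hidden weights (arbitrary)

def two_layer(x):
    hidden = [linear_unit(w, x) for w in W]
    return linear_unit(V, hidden)

# Collapsed weights u_i = C * sum_j v_j * w_ji, so that
# C * (U . X) == C * (V . (C * (W X))) for every input X.
U = [C * sum(V[j] * W[j][i] for j in range(2)) for i in range(2)]

def one_layer(x):
    return linear_unit(U, x)

for x in [(1.0, 0.0), (0.0, 1.0), (2.0, -3.0)]:
    assert abs(two_layer(x) - one_layer(x)) < 1e-9

# Part (c): XOR with sigmoid hidden units, threshold output, no bias.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def threshold(z):
    return 1 if z > 0 else 0

# Candidate weights (an assumption, found by hand): both hidden units
# see the sum x1 + x2 but at different scales, so sigmoid(0) = 0.5 at
# input (0,0) acts as the missing bias, and the output unit subtracts
# the two hidden activations.
W_h = [[10.0, 10.0], [2.0, 2.0]]  # input -> hidden
V_o = [10.0, -11.0]               # hidden -> output

def xor_net(x1, x2):
    h = [sigmoid(w[0] * x1 + w[1] * x2) for w in W_h]
    return threshold(V_o[0] * h[0] + V_o[1] * h[1])

for x1 in (0, 1):
    for x2 in (0, 1):
        assert xor_net(x1, x2) == (x1 ^ x2)
```

For intuition on the XOR check: at (0,0) the pre-threshold value is 10(0.5) − 11(0.5) = −0.5 (output 0), at (1,0) it is about 10(0.99995) − 11(0.8808) ≈ 0.31 (output 1), and at (1,1) it is about 10(1.0) − 11(0.9820) ≈ −0.80 (output 0).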
