Question:
![Network figure for the question](https://dsd5zvtm8ll6.cloudfront.net/si.experts.images/questions/2024/10/671391e196c0e_777671391e17f9ba.jpg)
a) [8 points] Say that the network is using linear units: that is, the output of a unit is C(W · A) for some fixed constant C. Let the weight values w_i be fixed. Re-design the neural network to compute the same function without using any hidden units. Express the new weights in terms of the old weights and the constant C.

b) [4 points] Is it always possible to express a neural network made up of only linear units without a hidden layer? Justify your answer.

c) [8 points] Another common activation function is a threshold, where the activation is t(W · A), where t(x) is 1 if x > 0 and 0 otherwise. Let the hidden units use sigmoid activation functions (the activation of a unit is then (1 + exp(−W · A))^(−1)), and let the output unit use a threshold activation function. Find weights which cause this network to compute the XOR of X1 and X2 for binary-valued X1 and X2. Keep in mind that there is no bias term for these units.
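For part a), the key observation is that a composition of linear maps is itself a linear map, so the layers collapse into one. The sketch below illustrates this numerically; the layer sizes (2 inputs, 2 hidden units, 1 output) are an illustrative assumption, not something given in the question.

```python
import numpy as np

rng = np.random.default_rng(0)
C = 3.0
W_hidden = rng.normal(size=(2, 2))   # old input-to-hidden weights
w_out = rng.normal(size=2)           # old hidden-to-output weights
x = rng.normal(size=2)               # an arbitrary input

# Original two-layer network: every unit outputs C * (W . A).
hidden = C * (W_hidden @ x)
out_two_layer = C * (w_out @ hidden)

# Equivalent network with no hidden units: fold the hidden layer into
# the output weights. The new weights are the product of the old weight
# matrices scaled by one extra factor of C, since out = C^2 * w_out W x.
w_new = C * (w_out @ W_hidden)
out_one_layer = C * (w_new @ x)

print(np.isclose(out_two_layer, out_one_layer))  # prints True
```

Part b) follows the same reasoning: the collapse above works for any number of linear layers, since the product of the weight matrices (with the accumulated powers of C) is again a single weight matrix.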
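For part c), it helps to have a checker that tests whether a candidate weight setting actually computes XOR on all four binary inputs. The architecture assumed below (two sigmoid hidden units feeding one threshold output, no bias anywhere) is one reading of the question; the exact topology comes from the figure. The candidate weights in the final line are hypothetical and chosen to show a constraint, not a solution.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def computes_xor(W_hidden, w_out):
    """W_hidden: (2, 2) input-to-hidden weights; w_out: (2,) hidden-to-output weights."""
    for x1 in (0, 1):
        for x2 in (0, 1):
            h = sigmoid(W_hidden @ np.array([x1, x2]))
            y = 1 if w_out @ h > 0 else 0   # threshold unit: t(z) = 1 iff z > 0
            if y != (x1 ^ x2):
                return False
    return True

# With no bias, the input (0, 0) always gives hidden activations (0.5, 0.5),
# so any valid output weights must satisfy w_out[0] + w_out[1] <= 0.
# This hypothetical candidate violates that and is correctly rejected:
print(computes_xor(np.array([[10., -10.], [-10., 10.]]),
                   np.array([1., 1.])))   # prints False
```

The no-bias constraint noted in the comment is the main thing to keep in mind when searching for weights by hand.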