Suppose you had a neural network with linear activation functions. That is, for each unit the output is some constant times the weighted sum of the inputs.
a. Assume that the network has one hidden layer. For a given assignment to the weights W, write down equations for the values of the units in the output layer as a function of W and the input layer I, without any explicit mention of the outputs of the hidden layer. Show that there is a network with no hidden units that computes the same function.
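One way to check the claim in part (a) numerically: with a linear activation g(x) = c·x in each layer, the two-layer output c2·W2·(c1·W1·I) equals (c1·c2·W2·W1)·I, a single weight matrix applied directly to the input. A minimal sketch with numpy (the layer sizes, constants, and variable names here are illustrative, not from the exercise):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 3, 2
c1, c2 = 0.5, 2.0                     # per-layer activation constants g(x) = c*x
W1 = rng.normal(size=(n_hid, n_in))   # input -> hidden weights
W2 = rng.normal(size=(n_out, n_hid))  # hidden -> output weights
I = rng.normal(size=n_in)             # an arbitrary input vector

# Forward pass through the network with one hidden layer
hidden = c1 * (W1 @ I)
output = c2 * (W2 @ hidden)

# Equivalent network with no hidden units: single weight matrix W'
W_prime = c1 * c2 * (W2 @ W1)
assert np.allclose(output, W_prime @ I)
```

The assertion holds for any input, since matrix multiplication is associative and the scalar constants commute with it.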
b. Repeat the calculation in part (a), this time for a network with any number of hidden layers. What can you conclude about linear activation functions?
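The same collapse extends to any depth: composing L linear layers yields the product of their weight matrices, so the whole network is equivalent to a single linear layer. A sketch of that check (layer sizes arbitrary; activation constants are folded into the weight matrices for brevity):

```python
import numpy as np
from functools import reduce

rng = np.random.default_rng(1)
sizes = [5, 4, 4, 3, 2]  # input, three hidden layers, output (illustrative)
Ws = [rng.normal(size=(m, n)) for n, m in zip(sizes, sizes[1:])]
I = rng.normal(size=sizes[0])

# Forward pass through all layers with linear activations
x = I
for W in Ws:
    x = W @ x

# Collapse to one matrix: W_total = W_L ... W_2 W_1 (note the order)
W_total = reduce(lambda acc, W: W @ acc, Ws)
assert np.allclose(x, W_total @ I)
```

This is the conclusion part (b) is driving at: stacking linear layers adds no representational power, which is why nonlinear activation functions are needed for hidden layers to be useful.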

Created February 14, 2011