Question:

Consider a neural network with a single hidden layer of $m$ neurons and a sigmoid nonlinearity (in particular, the nonlinearity is bounded, which excludes the ReLU). Suppose that all weights of this neural network (for simplicity, you may disregard bias units) are initialized independently as zero-mean Gaussian random variables. Each input-to-hidden weight has variance $\sigma^2$, while each hidden-to-output weight has variance $\sigma^2/m$. Prove that the corresponding regression function $f:\mathbb{R}^d \to \mathbb{R}$ tends to a Gaussian process as $m \to \infty$ (the neural network is not trained, only initialized). Write expressions for the mean and covariance functions of the Gaussian process. Is the Gaussian process stationary? Is the Gaussian assumption on the weights really necessary?
Hint: Use the multivariate Central Limit Theorem.
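
For orientation, here is a minimal sketch of the quantities involved, assuming the variance scaling above (input-to-hidden variance $\sigma^2$, hidden-to-output variance $\sigma^2/m$) and writing $\phi$ for the sigmoid:

$$
f(x) = \sum_{k=1}^{m} v_k\,\phi(w_k^\top x), \qquad w_k \sim \mathcal{N}(0, \sigma^2 I_d), \quad v_k \sim \mathcal{N}(0, \sigma^2/m),
$$

so each summand has mean zero, and for fixed inputs $x, x'$,

$$
\mathbb{E}[f(x)] = 0, \qquad \operatorname{Cov}\bigl(f(x), f(x')\bigr) = \sigma^2\,\mathbb{E}_{w}\bigl[\phi(w^\top x)\,\phi(w^\top x')\bigr],
$$

with the multivariate CLT applied to the $m$ i.i.d., bounded-variance summands giving the Gaussian-process limit over any finite set of inputs.

Below is a small NumPy simulation (a sketch only; the tanh nonlinearity, the value of sigma, and the helper name sample_outputs are illustrative assumptions, not part of the problem) that checks this empirically: the empirical mean stays near zero and the covariance between outputs at two fixed inputs stabilizes as the width m grows.

```python
import numpy as np

def sample_outputs(m, x, n_nets=20000, sigma=1.0, rng=None):
    """Outputs f(x) of n_nets random one-hidden-layer tanh networks of width m.

    x: array of shape (n_points, d) -- fixed inputs at which f is evaluated.
    Input-to-hidden weights ~ N(0, sigma^2); hidden-to-output ~ N(0, sigma^2 / m).
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n_points, d = x.shape
    W = rng.normal(0.0, sigma, size=(n_nets, d, m))            # input-to-hidden weights
    v = rng.normal(0.0, sigma / np.sqrt(m), size=(n_nets, m))  # hidden-to-output weights
    H = np.tanh(np.einsum('pd,ndm->npm', x, W))                # hidden activations
    return np.einsum('npm,nm->np', H, v)                       # f(x) for each sampled network

x = np.array([[0.5, -1.0], [1.5, 0.3]])  # two fixed test inputs in R^2
for m in (1, 10, 1000):
    f = sample_outputs(m, x)
    print(f"m={m:5d}  empirical mean {f.mean(axis=0)}  empirical cov\n{np.cov(f.T)}")
# As m grows, the mean stays near 0 and the 2x2 covariance matrix stabilizes,
# consistent with convergence to a zero-mean Gaussian process.
```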