a) In a simple Multi-Layer Perceptron neural network model with 8 neurons in the input layer, 5 neurons in the hidden layer, and 1 neuron in the output layer, what are the sizes of the weight matrices between the input and hidden layers, and between the hidden and output layers?
20%
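A minimal sketch of the weight-matrix shapes for part (a), assuming the common column-vector convention (output = W @ input, biases omitted):

```python
import numpy as np

# Weight shapes for an 8-5-1 MLP (convention: rows = destination layer,
# columns = source layer; biases not shown).
rng = np.random.default_rng(0)

W_input_hidden = rng.standard_normal((5, 8))    # hidden (5) x input (8) -> 40 weights
W_hidden_output = rng.standard_normal((1, 5))   # output (1) x hidden (5) -> 5 weights

x = rng.standard_normal(8)
hidden = W_input_hidden @ x          # shape (5,)
output = W_hidden_output @ hidden    # shape (1,)
print(W_input_hidden.shape, W_hidden_output.shape)  # (5, 8) (1, 5)
```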
b) Assume a simple Feed Forward model with 3 input neurons and inputs 1, 2, 3. The weights on the inputs are 4, 5, and 6 respectively. Assume the activation function is a linear scaling by a constant value of 3. What will be the output?
20%
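A worked sketch of part (b), assuming the "linear constant value of 3" activation multiplies the net input by 3:

```python
# Weighted sum: 1*4 + 2*5 + 3*6 = 32; the linear activation then
# scales this by the constant 3, giving 96.
inputs = [1, 2, 3]
weights = [4, 5, 6]

net = sum(x * w for x, w in zip(inputs, weights))  # 32
output = 3 * net                                   # 96
print(output)  # 96
```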
c) What steps can we take to prevent overfitting in a Neural Network?
20%
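Common answers to part (c) include collecting more training data, L1/L2 weight regularization, dropout, early stopping, data augmentation, and using a smaller model. A minimal illustrative sketch of two of these techniques (the function names are hypothetical, not from the question):

```python
import numpy as np

# Two common overfitting remedies, sketched in NumPy:
#  - inverted dropout: randomly zero units during training and rescale
#    the survivors so the expected activation is unchanged;
#  - an L2 weight penalty added to the training loss.
rng = np.random.default_rng(42)

def dropout(activations, keep_prob=0.8):
    """Inverted dropout: zero some units, rescale the rest by 1/keep_prob."""
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

def l2_penalty(weights, lam=1e-3):
    """L2 regularization term to add to the training loss."""
    return lam * np.sum(weights ** 2)

h = rng.standard_normal(5)
W = rng.standard_normal((5, 5))
print(dropout(h).shape, l2_penalty(W))
```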
d) The recurrent neural network shown in Figure Q2 has 1 hidden layer.
The weights on the connections between nodes are denoted by w as indicated in the following matrices:
w12 = [0.4 0.1; -0.6 0.6; 2.0 1.0], w23 = [0.1 0.5; 0.4 0.2], wf = [0.01 -0.02]
Assume the hidden and output layers neurons have a sigmoid activation function given as follows:
f(x) = 1 / (1 + exp(-x))
If the network is tested with the input vectors x(1) = [1.0, 2.0, 0.1] and x(2) = [2.0, 2.2, 0.4], calculate the predicted outputs of the network, Y', at t=1 and t=2.
40%
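An illustrative sketch only, since Figure Q2 is not reproduced here: the row/column grouping of the matrices and the feedback path are assumptions. It assumes a 3-2-2 Elman-style network in which wf feeds the previous output back into the hidden layer, with zero feedback at t=1:

```python
import numpy as np

# Assumed shapes (not confirmed by the question text):
#   w12: input (3) -> hidden (2), w23: hidden (2) -> output (2),
#   wf:  previous output fed back elementwise into the hidden net input.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

w12 = np.array([[0.4, 0.1], [-0.6, 0.6], [2.0, 1.0]])
w23 = np.array([[0.1, 0.5], [0.4, 0.2]])
wf = np.array([0.01, -0.02])

y_prev = np.zeros(2)  # no feedback contribution at t=1
for t, x in enumerate([np.array([1.0, 2.0, 0.1]),
                       np.array([2.0, 2.2, 0.4])], start=1):
    h = sigmoid(x @ w12 + wf * y_prev)  # hidden state with recurrent feedback
    y = sigmoid(h @ w23)                # predicted output Y' at time t
    print(f"t={t}: Y' = {y}")
    y_prev = y
```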
3.
a) Explain the meaning of Leaky ReLU activation function used in Deep Neural Networks. Give the mathematical expression of this function, and its graphical representation.
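As background for part 3 a): Leaky ReLU passes positive inputs through unchanged and scales negative inputs by a small slope alpha (commonly 0.01), i.e. f(x) = x for x > 0 and f(x) = alpha·x otherwise, so negative units keep a nonzero gradient rather than "dying" as with plain ReLU. A minimal sketch:

```python
# Leaky ReLU: identity for positive inputs, small linear slope alpha
# for negative inputs, avoiding the zero-gradient "dying ReLU" problem.
def leaky_relu(x, alpha=0.01):
    return x if x > 0 else alpha * x

print(leaky_relu(5.0))   # 5.0
print(leaky_relu(-2.0))  # -0.02
```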