Question: A sigmoid activation function, $\sigma(x) = 1/(1 + e^{-x})$, having a derivative $\sigma'(x) = \sigma(x)(1 - \sigma(x))$, is used in a Multi-Layer Perceptron (MLP) Neural Network. The MLP has 10 inputs, two hidden layers (the first has 5 neurons and the second has 3 neurons) and 2 outputs. The backpropagation (BP) algorithm is used for training.
(i) How many weights exist in this MLP neural network? [1 mark]

(ii) Explain the principles of the backpropagation (BP) learning algorithm that is used to train the weights of the MLP. Your explanation needs to show how the Perceptron Delta Rule is used in the MLP network. [3 marks]

(iii) What is the suggested minimum number of representative input-output training samples required in order to have a high probability of achieving good generalisation from the MLP network? [1 mark]
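
For part (i), the count follows directly from the layer sizes. A minimal Python check, assuming "weights" means the inter-layer connection weights of a fully connected network:

```python
# Weight count for a fully connected 10-5-3-2 MLP.
layer_sizes = [10, 5, 3, 2]  # inputs, hidden layer 1, hidden layer 2, outputs

# One connection weight for each pair of neurons in adjacent layers.
connection_weights = sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))

# Bias weights, if the marking scheme counts them: one per non-input neuron.
bias_weights = sum(layer_sizes[1:])

print(connection_weights)                 # 10*5 + 5*3 + 3*2 = 71
print(connection_weights + bias_weights)  # 71 + 10 = 81
```

That gives 71 connection weights, or 81 if bias weights are included.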

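For part (ii), the BP principle in brief: an input is propagated forward through the network, the output error against the target is computed, and error terms (deltas) are then propagated backward layer by layer. The Perceptron Delta Rule, $\Delta w = \eta\,\delta\,x$, is applied at every layer: at the output layer $\delta = (t - o)\,\sigma'(net)$, while at each hidden layer $\delta$ is the weighted sum of the downstream deltas scaled by $\sigma'(net)$, using $\sigma'(x) = \sigma(x)(1 - \sigma(x))$ from the question. This reuse of downstream deltas is what extends the single-layer Delta Rule to multi-layer networks. A minimal NumPy sketch for the 10-5-3-2 network (the learning rate, weight initialisation, and squared-error loss are illustrative assumptions; biases are omitted to match the 71-weight count above):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 10-5-3-2 architecture: W[k] connects layer k to layer k+1.
sizes = [10, 5, 3, 2]
W = [rng.normal(0.0, 0.5, (m, n)) for m, n in zip(sizes, sizes[1:])]

def train_step(x, target, lr=0.5):
    # Forward pass: keep every layer's activation for the backward pass.
    acts = [x]
    for Wk in W:
        acts.append(sigmoid(acts[-1] @ Wk))

    # Output-layer delta: (target - output) * sigma'(net),
    # where sigma'(net) = o * (1 - o) for a sigmoid output o.
    delta = (target - acts[-1]) * acts[-1] * (1.0 - acts[-1])

    # Backward pass: Delta Rule update (eta * input * delta) at each layer;
    # hidden deltas are propagated through the pre-update weights.
    for k in reversed(range(len(W))):
        grad = np.outer(acts[k], delta)
        if k > 0:
            delta = (delta @ W[k].T) * acts[k] * (1.0 - acts[k])
        W[k] += lr * grad

    return 0.5 * np.sum((target - acts[-1]) ** 2)

# Toy usage: repeated updates on one sample drive the error down.
x, t = rng.random(10), np.array([0.2, 0.8])
for _ in range(500):
    err = train_step(x, t)
print(f"squared error after training: {err:.6f}")
```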
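For part (iii), a widely quoted rule of thumb (Widrow's rule, also stated as $N \ge W/\varepsilon$ following Baum and Haussler) is roughly ten representative training samples per weight, i.e. $N \approx 10W$ for a tolerated error fraction $\varepsilon = 0.1$. Assuming the 71-weight count from part (i), that suggests a minimum of about $10 \times 71 = 710$ input-output training samples for a high probability of good generalisation.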
