Question: A sigmoid activation function, σ(x) = 1 / (1 + e^(−x)), having a derivative σ′(x) = σ(x)(1 − σ(x)), is used in a MultiLayer Perceptron (MLP) neural network. The MLP has … inputs, two hidden layers (the first has … neurons and the second has … neurons) and … outputs. The backpropagation (BP) algorithm is used for training.
(i) How many weights exist in this MLP neural network? (… mark)
(ii) Explain the principles of the backpropagation (BP) learning algorithm that is used to train the weights of the MLP. Your explanation needs to show how the Perceptron Delta Rule is used in the MLP network. (… marks)
(iii) What is the suggested minimum number of representative input-output training samples required in order to have a high probability of achieving good generalisation from the MLP network? (… mark)
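Since the actual layer sizes were lost from the question text, here is a minimal sketch with hypothetical sizes (4 inputs, hidden layers of 5 and 3, 2 outputs) showing how part (i) is counted and verifying the derivative identity σ′(x) = σ(x)(1 − σ(x)). For part (iii), a common rule of thumb is roughly W/ε training samples for a target error rate ε (about 10 × W for ~10% error), included here only as an illustrative assumption.

```python
import math

def sigmoid(x):
    """Logistic sigmoid, sigma(x) = 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    """Derivative via the identity sigma'(x) = sigma(x) * (1 - sigma(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

def weight_count(n_in, h1, h2, n_out, include_biases=True):
    """Connections between consecutive layers, optionally counting bias terms."""
    w = n_in * h1 + h1 * h2 + h2 * n_out
    if include_biases:
        w += h1 + h2 + n_out
    return w

# Hypothetical sizes -- the real figures are missing from the question.
n_weights = weight_count(4, 5, 3, 2)
print("weights (incl. biases):", n_weights)
# Illustrative rule of thumb for part (iii): about 10 * W samples
# for roughly 10% generalisation error.
print("rule-of-thumb sample count:", 10 * n_weights)
```

Whether biases are counted as "weights" depends on the marker's convention, hence the `include_biases` switch.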
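For part (ii), the Delta Rule can be sketched in code: each weight update is η × (local delta) × (input to that weight), where the output-layer delta is the error times σ′ of the net input, and each hidden delta is σ′ times the weighted sum of the deltas in the layer above. The sketch below assumes hypothetical layer sizes (2–4–3–1) and a toy XOR task, since the question's actual dimensions are missing.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy XOR task (assumed for illustration only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Hypothetical sizes: 2 inputs, hidden layers of 4 and 3, 1 output.
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 3)); b2 = np.zeros(3)
W3 = rng.normal(0, 1, (3, 1)); b3 = np.zeros(1)

# Error before training, for comparison.
y0 = sigmoid(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) @ W3 + b3)
mse0 = float(((y0 - T) ** 2).mean())

eta = 0.5
for epoch in range(5000):
    # Forward pass: activations layer by layer.
    a1 = sigmoid(X @ W1 + b1)
    a2 = sigmoid(a1 @ W2 + b2)
    y = sigmoid(a2 @ W3 + b3)
    # Backward pass: Delta Rule, using sigma'(net) = a * (1 - a).
    d3 = (y - T) * y * (1 - y)          # output-layer deltas
    d2 = (d3 @ W3.T) * a2 * (1 - a2)    # second hidden layer deltas
    d1 = (d2 @ W2.T) * a1 * (1 - a1)    # first hidden layer deltas
    # Gradient-descent updates: eta * delta * input to the weight.
    W3 -= eta * a2.T @ d3; b3 -= eta * d3.sum(0)
    W2 -= eta * a1.T @ d2; b2 -= eta * d2.sum(0)
    W1 -= eta * X.T @ d1;  b1 -= eta * d1.sum(0)

mse = float(((y - T) ** 2).mean())
print("MSE before:", mse0, "after:", mse)
```

The key point the question is after: the same Delta Rule used for a single perceptron is applied at every layer, with hidden-layer errors obtained by propagating the output deltas backwards through the weights.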
