Question: Consider a two-layer feedforward network with two inputs, one hidden neuron, and one output neuron. All neurons use the logistic activation function. Use the BP algorithm with momentum to update the weights of the network after each of the training examples {(1, 0), 1} and {(0, 1), 0}. Assume all weights are initially equal to 1, η = 0.2, and α = 0.9. Show your working.
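The updates asked for can be sketched numerically. The sketch below assumes the usual conventions for this exercise: squared error E = ½(d − y)², the momentum rule Δw(t) = η·δ·input + α·Δw(t−1), and no bias terms (the question only says the weights start at 1, so the bias-free layout is an assumption).

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Network: (x1, x2) -> one hidden unit h -> one output y, all logistic.
# Assumption: no bias weights; only w_in (input->hidden) and w_out (hidden->output).
w_in = [1.0, 1.0]   # input -> hidden weights, initially 1
w_out = 1.0         # hidden -> output weight, initially 1
eta, alpha = 0.2, 0.9

# Previous weight changes (momentum term), zero before the first update.
dw_in_prev = [0.0, 0.0]
dw_out_prev = 0.0

for x, d in [((1, 0), 1.0), ((0, 1), 0.0)]:
    # Forward pass
    net_h = w_in[0] * x[0] + w_in[1] * x[1]
    h = sigmoid(net_h)
    y = sigmoid(w_out * h)

    # Backward pass: delta = error signal * logistic derivative s(1 - s)
    delta_out = (d - y) * y * (1 - y)
    delta_h = delta_out * w_out * h * (1 - h)

    # Momentum update: dw(t) = eta * delta * input + alpha * dw(t-1)
    dw_out = eta * delta_out * h + alpha * dw_out_prev
    dw_in = [eta * delta_h * x[i] + alpha * dw_in_prev[i] for i in range(2)]

    w_out += dw_out
    for i in range(2):
        w_in[i] += dw_in[i]

    dw_out_prev, dw_in_prev = dw_out, dw_in
    print(f"after {x} -> {d}: y={y:.4f}, "
          f"w_in={[round(w, 4) for w in w_in]}, w_out={w_out:.4f}")
```

After the first example the output-layer change is Δw3 = 0.2·δ_out·h with zero momentum; after the second, 0.9 times each previous change is added on, which is why w1 keeps growing even though x1 = 0 in the second pattern.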
