Question: (b) Consider the multi-layer perceptron shown in Fig. 4.2. Use the backpropagation algorithm to find the updated values of the weights w_4 and w_8, given the inputs (x_1 = 0.5, x_2 = 0) and the corresponding desired outputs (d_1 = 0, d_2 = 1). y_o1 and y_o2 are the outputs of the two neurons in the output layer. Assume that the error function is E = (1/2) sum_{i=1}^{2} (e_i)^2, where e_1 = d_1 - y_o1 and e_2 = d_2 - y_o2, the learning-rate parameter is eta = 1, and the activation function is phi(x) = 1/(1 + e^(-x)).
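Since Fig. 4.2 (and in particular its initial weight values) is not reproduced here, a full numerical answer cannot be given, but the backpropagation procedure itself can be sketched. The code below assumes a standard 2-2-2 layout in which w_1..w_4 are the input-to-hidden weights, w_5..w_8 are the hidden-to-output weights, w_4 carries x_2 into the second hidden neuron, and w_8 carries the second hidden output into output neuron 2; the initial weight values are hypothetical placeholders, not the ones in Fig. 4.2.

```python
import math

def sigmoid(x):
    # Logistic activation phi(x) = 1 / (1 + e^(-x)), as given in the problem
    return 1.0 / (1.0 + math.exp(-x))

# Inputs, targets, and learning rate from the problem statement
x1, x2 = 0.5, 0.0
d1, d2 = 0.0, 1.0
eta = 1.0

# Hypothetical initial weights (Fig. 4.2's actual values are not given here)
w1, w2, w3, w4 = 0.1, 0.2, 0.3, 0.4   # input -> hidden
w5, w6, w7, w8 = 0.5, 0.6, 0.7, 0.8   # hidden -> output

# Forward pass
y_h1 = sigmoid(w1 * x1 + w2 * x2)
y_h2 = sigmoid(w3 * x1 + w4 * x2)
y_o1 = sigmoid(w5 * y_h1 + w6 * y_h2)
y_o2 = sigmoid(w7 * y_h1 + w8 * y_h2)

# Output-layer local gradients: delta_k = e_k * phi'(v_k) = (d_k - y_k) * y_k * (1 - y_k)
delta_o1 = (d1 - y_o1) * y_o1 * (1 - y_o1)
delta_o2 = (d2 - y_o2) * y_o2 * (1 - y_o2)

# w_8 feeds y_h2 into output neuron 2, so its update is eta * delta_o2 * y_h2
w8_new = w8 + eta * delta_o2 * y_h2

# Hidden-layer local gradient for neuron 2: back-propagate delta_o1, delta_o2
# through the weights (assumed w6 and w8) that leave this neuron
delta_h2 = (delta_o1 * w6 + delta_o2 * w8) * y_h2 * (1 - y_h2)

# w_4 feeds x2 into hidden neuron 2, so its update is eta * delta_h2 * x2.
# Because x2 = 0 for this input pattern, w_4 is left unchanged.
w4_new = w4 + eta * delta_h2 * x2

print(w8_new, w4_new)
```

Note the structural point this makes explicit: since x_2 = 0, the gradient of E with respect to w_4 is zero for this training pattern, so w_4 keeps its initial value regardless of what that value is; only w_8 actually changes.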
