Question: Consider a network with two inputs, x1 and x2, two hidden neurons, and one output neuron. I have worked this by hand and in code, and I just want to know what the FINAL weights should be; please give only the final weights. The network has 9 total parameters:
w11, the weight connecting input 1 to hidden neuron 1
w12, the weight connecting input 2 to hidden neuron 1
w10, the bias for hidden neuron 1, which has an implicit associated input of x0=1
w21, the weight connecting input 1 to hidden neuron 2
w22, the weight connecting input 2 to hidden neuron 2
w20, the bias for hidden neuron 2, which has an implicit associated input of x0=1
w31, the weight connecting hidden-layer neuron 1 to the output neuron
w32, the weight connecting hidden-layer neuron 2 to the output neuron
w30, the bias for the output neuron, which has an implicit associated input of o0=1
Diagram
Draw a diagram of the network including all of the connections (make it big), and label each connection and bias with its appropriate weight.
Forward-Calculation
Let's start with the following weights:
w11=1
w12=1
w10=0
w21=1
w22=1
w20=0
w31=1
w32=1
w30=0
(All of the connection weights are 1 and all of the biases are zero.)
We'll work on training the logical-AND function that we previously used for the single-layer perceptron. Here is its truth table:
x_1  x_2 | target label (t)
---------------------------
 0    0  |       0
 0    1  |       0
 1    0  |       0
 1    1  |       1
First, use the network as configured to attempt to classify each point. Perform a forward-pass through the network by hand for each of the four points. What results do you obtain?
for each hidden-layer node j:
    y_j = the weighted sum of the inputs to node j
    o_j = sigmoid(y_j)
for the output-layer node k:
    y_k = the weighted sum of the hidden-layer o_j values into node k
    o_k = sigmoid(y_k)
if o_k <= 0.5:
    assign class 0
else:
    assign class 1
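As a check on the hand calculation, here is a minimal Python sketch of that forward pass, assuming the standard logistic sigmoid and the initial weights above (the function name forward and the variable names are mine, not part of the original assignment):

import math

def sigmoid(y):
    return 1.0 / (1.0 + math.exp(-y))

# All connection weights start at 1 and all biases at 0, per the setup above.
w11, w12, w10 = 1.0, 1.0, 0.0   # hidden neuron 1
w21, w22, w20 = 1.0, 1.0, 0.0   # hidden neuron 2
w31, w32, w30 = 1.0, 1.0, 0.0   # output neuron

def forward(x1, x2):
    # Hidden layer: weighted sum of the inputs, then sigmoid.
    o1 = sigmoid(w11 * x1 + w12 * x2 + w10)
    o2 = sigmoid(w21 * x1 + w22 * x2 + w20)
    # Output layer: weighted sum of the hidden outputs, then sigmoid.
    ok = sigmoid(w31 * o1 + w32 * o2 + w30)
    return o1, o2, ok

for x1, x2, t in [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]:
    _, _, ok = forward(x1, x2)
    label = 0 if ok <= 0.5 else 1
    print(f"x=({x1},{x2})  o_k={ok:.4f}  assigned={label}  target={t}")

With these initial weights every point lands in class 1: even at (0, 0) the hidden outputs are sigmoid(0) = 0.5, so o_k = sigmoid(1) ≈ 0.73 > 0.5.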
Train
We'll now train the network by hand using the backpropagation update rules. I'm not repeating the update rules here, because they're hard to type, but you have them in your notes.
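For reference, a standard statement of those rules, assuming sigmoid activations and a squared-error loss with learning rate $\eta$ (which matches the forward pass above; if your notes define the error differently, follow your notes): at the output node, compute $\delta_k = o_k(1 - o_k)(t - o_k)$ and update each weight as $w_{3j} \leftarrow w_{3j} + \eta\,\delta_k\,o_j$ (with $o_0 = 1$ for the bias). At each hidden node $j$, compute $\delta_j = o_j(1 - o_j)\,\delta_k\,w_{3j}$ and update each weight as $w_{ji} \leftarrow w_{ji} + \eta\,\delta_j\,x_i$ (with $x_0 = 1$ for the bias).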
Iterate one point at a time, using a learning rate of 0.1. For each point:
1. Classify the point using the forward pass. In doing so, you'll obtain values for y_j and o_j at each hidden-layer node, and y_k and o_k at the output node.
2. Update the weights (and the bias) at the output node using its update rule.
3. Update the weights and bias at each of the two hidden-layer nodes using their update rules.
The updates take place even if the assigned class matches the target class: use the value of the output sigmoid function $o_k$, not the thresholded class label, to calculate the error.
Do one epoch, training over all four points of the logical-AND function and updating the weights after each point.
After you update the weights, verify that they have moved in a direction that makes the classification more correct, even if the output has not crossed the 0.5 threshold and actually changed its predicted class.
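Putting the pieces together, here is a self-contained Python sketch of one epoch of this procedure, assuming the squared-error update rules stated above; the dictionary layout and helper names are mine, and the printed weights are only as authoritative as those assumptions:

import math

eta = 0.1  # learning rate

def sigmoid(y):
    return 1.0 / (1.0 + math.exp(-y))

# Start from the initial weights given above.
w = {"w11": 1.0, "w12": 1.0, "w10": 0.0,   # hidden neuron 1
     "w21": 1.0, "w22": 1.0, "w20": 0.0,   # hidden neuron 2
     "w31": 1.0, "w32": 1.0, "w30": 0.0}   # output neuron

# One epoch: visit each point once, updating the weights after each point.
for x1, x2, t in [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]:
    # Forward pass.
    o1 = sigmoid(w["w11"] * x1 + w["w12"] * x2 + w["w10"])
    o2 = sigmoid(w["w21"] * x1 + w["w22"] * x2 + w["w20"])
    ok = sigmoid(w["w31"] * o1 + w["w32"] * o2 + w["w30"])

    # Deltas: output node first, then hidden nodes, using the OLD w31/w32.
    dk = ok * (1.0 - ok) * (t - ok)
    d1 = o1 * (1.0 - o1) * dk * w["w31"]
    d2 = o2 * (1.0 - o2) * dk * w["w32"]

    # Output-node updates (the bias sees an implicit input o0 = 1).
    w["w31"] += eta * dk * o1
    w["w32"] += eta * dk * o2
    w["w30"] += eta * dk

    # Hidden-node updates (each bias sees an implicit input x0 = 1).
    w["w11"] += eta * d1 * x1
    w["w12"] += eta * d1 * x2
    w["w10"] += eta * d1
    w["w21"] += eta * d2 * x1
    w["w22"] += eta * d2 * x2
    w["w20"] += eta * d2

for name, value in w.items():
    print(f"{name} = {value:.6f}")

Because the hidden deltas depend on the output weights, the sketch computes all deltas before applying any updates, which matches the usual by-hand ordering of the backpropagation steps.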
