Question: Consider a network with two inputs, $x_1$ and $x_2$, two hidden neurons, and one output neuron. The network has 9 total parameters. I just want to know the FINAL weights and what they should be; I have done it by hand and in code. Please give only the final weights.
$w_{11}$: the weight connecting input $x_1$ to hidden neuron 1
$w_{12}$: the weight connecting input $x_2$ to hidden neuron 1
$w_{10}$: the bias for hidden neuron 1, which has an implicit associated input of $x_0 = 1$
$w_{21}$: the weight connecting input $x_1$ to hidden neuron 2
$w_{22}$: the weight connecting input $x_2$ to hidden neuron 2
$w_{20}$: the bias for hidden neuron 2, which has an implicit associated input of $x_0 = 1$
$w_{o1}$: the weight connecting hidden-layer neuron 1 to the output neuron
$w_{o2}$: the weight connecting hidden-layer neuron 2 to the output neuron
$w_{o0}$: the bias for the output neuron, which has an implicit associated input of $o_0 = 1$
Diagram
Draw a diagram of the network including all of the connections. Make it big and label every connection with its appropriate weight.
Forward Calculation
Let's start with the following weights: the connection weights $w_{11}$, $w_{12}$, $w_{21}$, $w_{22}$, $w_{o1}$, and $w_{o2}$ are all set to the same initial value, and the biases $w_{10}$, $w_{20}$, and $w_{o0}$ are all zero. In other words, all of the connections are equal and all of the biases are zero.
We'll work on training the logical AND function that we previously used for the single-layer perceptron. Here is its truth table:

$x_1$   $x_2$   target label $t$
0       0       0
0       1       0
1       0       0
1       1       1
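For reference in code, the truth table above can be written as a small dataset (the tuple layout is my own choice):

```python
# Logical AND dataset: (x1, x2, target label t)
AND_DATA = [
    (0, 0, 0),
    (0, 1, 0),
    (1, 0, 0),
    (1, 1, 1),
]

# Sanity check: the target is 1 only when both inputs are 1.
assert all(t == (x1 and x2) for x1, x2, t in AND_DATA)
```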
First, use the network as configured to attempt to classify each point. Perform a forward pass through the network by hand for each of the four points. What results do you obtain?
for each hidden layer node j:
    y_j = the weighted sum of inputs to node j
    o_j = sigmoid(y_j)
for the output layer node k:
    y_k = the weighted sum of the hidden-layer o_j values to node k
    o_k = sigmoid(y_k)
if o_k >= 0.5:
    assign class 1
else:
    assign class 0
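A quick way to check the hand calculations is a short Python sketch of this forward pass. The initial connection weight of 1.0 is an illustrative assumption (the setup only says the connection weights are all equal); the biases start at zero as stated:

```python
import math

def sigmoid(y):
    """Logistic activation."""
    return 1.0 / (1.0 + math.exp(-y))

# Illustrative initial weights: connection weights 1.0 (an assumption),
# biases 0.0 (the setup says all biases start at zero).
w11 = w12 = w21 = w22 = wo1 = wo2 = 1.0
w10 = w20 = wo0 = 0.0

def forward(x1, x2):
    # Hidden layer: weighted sum of inputs (plus bias), then sigmoid.
    y1 = w11 * x1 + w12 * x2 + w10
    o1 = sigmoid(y1)
    y2 = w21 * x1 + w22 * x2 + w20
    o2 = sigmoid(y2)
    # Output layer: weighted sum of the hidden o_j values, then sigmoid.
    yk = wo1 * o1 + wo2 * o2 + wo0
    ok = sigmoid(yk)
    return o1, o2, ok

for x1, x2, t in [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]:
    o1, o2, ok = forward(x1, x2)
    assigned = 1 if ok >= 0.5 else 0
    print(f"({x1},{x2}) -> o_k={ok:.4f}, class {assigned}, target {t}")
```

With these illustrative weights, every point produces an output above the 0.5 threshold, so all four points are initially assigned class 1, which is wrong for three of them.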
Train
We'll now train the network by hand using the backpropagation update rules. I'm not repeating the update rules here because they're hard to type, but you have them in your notes.
Iterate one point at a time, using a fixed learning rate $\eta$.
Classify the point using the forward pass. In doing so you'll obtain values for each $y_j$ and $o_j$ at the hidden-layer nodes, and $y_k$ and $o_k$ at the output layer.
Update the weights and the bias at the output node using its update rule.
Update the weights and bias at each of the two hidden-layer nodes using their update rules.
The updates take place even if the assigned class matches the target class. Use the value of the output sigmoid function $o_k$ to calculate the error.
Do one epoch, training over all four points of the logical AND function, updating the weights after each point.
After you update the weights, verify that they have moved in a direction that makes the classification more correct, even if the output has not crossed the 0.5 threshold to actually change its predicted class.
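The epoch described above can be sketched in Python. This is not the assignment's literal update rules (those are in the course notes); it assumes the standard squared-error backpropagation deltas for sigmoid units, a learning rate of 0.5, and initial connection weights of 1.0, all of which are illustrative choices:

```python
import math

def sigmoid(y):
    return 1.0 / (1.0 + math.exp(-y))

# Assumed hyperparameters and initial weights (illustrative only).
eta = 0.5
w = {"w11": 1.0, "w12": 1.0, "w10": 0.0,
     "w21": 1.0, "w22": 1.0, "w20": 0.0,
     "wo1": 1.0, "wo2": 1.0, "wo0": 0.0}

def train_point(w, x1, x2, t, eta):
    # Forward pass (same computation as the pseudocode above).
    o1 = sigmoid(w["w11"] * x1 + w["w12"] * x2 + w["w10"])
    o2 = sigmoid(w["w21"] * x1 + w["w22"] * x2 + w["w20"])
    ok = sigmoid(w["wo1"] * o1 + w["wo2"] * o2 + w["wo0"])
    # Output-node delta: squared-error loss with sigmoid derivative ok*(1-ok).
    dk = (t - ok) * ok * (1.0 - ok)
    # Hidden-node deltas, computed with the OLD output weights.
    d1 = o1 * (1.0 - o1) * dk * w["wo1"]
    d2 = o2 * (1.0 - o2) * dk * w["wo2"]
    # Output-layer updates (the bias sees an implicit input of 1).
    w["wo1"] += eta * dk * o1
    w["wo2"] += eta * dk * o2
    w["wo0"] += eta * dk
    # Hidden-layer updates.
    w["w11"] += eta * d1 * x1
    w["w12"] += eta * d1 * x2
    w["w10"] += eta * d1
    w["w21"] += eta * d2 * x1
    w["w22"] += eta * d2 * x2
    w["w20"] += eta * d2
    return ok  # pre-update output, useful for tracking the error

# One epoch over the AND truth table, updating after each point.
for x1, x2, t in [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]:
    train_point(w, x1, x2, t, eta)
```

Under these assumed values, after one epoch the output bias has moved negative and the output for (0, 0) has dropped below its initial value of about 0.7311, i.e. the weights moved toward the correct classification even though no point has yet crossed the 0.5 threshold to change its predicted class.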
