Question: We’ll use a simple example of training a neural network to function as an “Exclusive OR” (“XOR”). The truth table of an XOR is given below.
| i1 | i2 | Output |
|----|----|--------|
| 0  | 0  | 0      |
| 0  | 1  | 1      |
| 1  | 0  | 1      |
| 1  | 1  | 0      |
Let’s use a simple ANN to represent the XOR function. The ANN has three layers:
- An input layer with two input neurons
- One hidden layer with three neurons
- An output layer with a single neuron
The initial weights are {0.8, 0.4, 0.3} from input neuron 1 to hidden neurons 1, 2, and 3; {0.2, 0.9, 0.5} from input neuron 2 to hidden neurons 1, 2, and 3; and {0.3, 0.5, 0.9} from hidden neurons 1, 2, and 3 to the output neuron.
To train our neural network, we need to find the weights that minimize the prediction error. To answer this question, start with forward propagation and then apply back propagation. Use the sigmoid function S(x) = 1 / (1 + e^(-x)) as the activation function. Show your steps for two rounds of output predictions and calculate the error in each case.
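For concreteness, here is a minimal NumPy sketch of a single forward pass through this 2-3-1 network with the weights given above. The absence of bias terms and the choice of the input (1, 1) with a squared-error measure are assumptions made for illustration; the question does not fix these details.

```python
import numpy as np

def sigmoid(x):
    # S(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

# Weights from the question: rows are input neurons, columns are hidden neurons
W_ih = np.array([[0.8, 0.4, 0.3],   # input neuron 1 -> hidden neurons 1, 2, 3
                 [0.2, 0.9, 0.5]])  # input neuron 2 -> hidden neurons 1, 2, 3
W_ho = np.array([0.3, 0.5, 0.9])    # hidden neurons 1, 2, 3 -> output neuron

x = np.array([1.0, 1.0])            # example input pattern, XOR target is 0
h = sigmoid(x @ W_ih)               # hidden-layer activations
y = sigmoid(h @ W_ho)               # network output
error = 0.5 * (0.0 - y) ** 2        # assumed squared-error measure against target 0

print("hidden activations:", h)
print("output:", y, "error:", error)
```

The same forward pass applies unchanged to the other three input patterns.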
Marking Criteria
- 5 marks for executing forward propagation correctly
- 5 marks for executing back propagation correctly
- 5 marks for adjusting the weights correctly
- 5 marks for using the activation function correctly
Step by Step Solution
Alright, let’s build the XOR neural network using forward and backward propagation. Since we are using the sigmoid function as the activation function, our ...
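The worked answer above is cut off, so as a rough guide only, here is a minimal sketch of two training rounds (forward pass, error, backward pass, weight adjustment) under common assumptions: squared-error loss, a learning rate of 0.5, no bias terms, and training on the single pattern (1, 1) with target 0. A solution using different conventions would produce different numbers.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Initial weights from the question
W_ih = np.array([[0.8, 0.4, 0.3],
                 [0.2, 0.9, 0.5]])
W_ho = np.array([0.3, 0.5, 0.9])

lr = 0.5                       # assumed learning rate
x = np.array([1.0, 1.0])       # assumed training pattern
t = 0.0                        # XOR target for (1, 1)

for rnd in range(1, 3):        # two rounds of prediction and update
    # Forward propagation
    h = sigmoid(x @ W_ih)
    y = sigmoid(h @ W_ho)
    err = 0.5 * (t - y) ** 2
    print(f"round {rnd}: output = {y:.4f}, error = {err:.4f}")

    # Back propagation of the squared error through the sigmoids
    delta_o = (y - t) * y * (1 - y)          # output-layer delta
    delta_h = delta_o * W_ho * h * (1 - h)   # hidden-layer deltas (uses the old W_ho)

    # Weight adjustments (gradient descent)
    W_ho -= lr * delta_o * h
    W_ih -= lr * np.outer(x, delta_h)
```

In a written answer, each of these quantities would be worked out by hand for the two rounds rather than computed in code.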