Need help modifying this code to implement the XOR function.
import math
import colorama
from colorama import Fore, Style
"""Backpropagation Learning Algorithm Homework with the XOR function.

Name: Emily Dogbatse
CMS: Artificial Intelligence
"""

colorama.init(autoreset=True)
# Steps to do
# activation function sigmoid DONE
# sigmoid derivative DONE
# forward propagation DONE
# Training function to calculate the error for the hidden layers DONE
def print_header(message):
    """More cool ways to print for aesthetics."""
    print(Fore.GREEN + Style.BRIGHT + "=" * 60)
    print(f"{message}")
    print(Fore.GREEN + Style.BRIGHT + "=" * 60 + Style.RESET_ALL)
def print_subheader(message):
    """Even more cool ways to print for aesthetics."""
    print(Fore.YELLOW + Style.BRIGHT + "-" * 60)
    print(f"{message}")
    print(Fore.YELLOW + Style.BRIGHT + "-" * 60 + Style.RESET_ALL)
def print_progress_bar(iteration, total, prefix='', suffix='', decimals=1,
                       length=50, fill='█', print_end="\r"):
    """This function just prints out a cool loading bar; it doesn't have
    anything to do with the actual program itself."""
    percent = ("{0:." + str(decimals) + "f}").format(100 * (iteration / float(total)))
    filled_length = int(length * iteration // total)
    bar = fill * filled_length + '-' * (length - filled_length)
    print(f'\r{prefix} |{bar}| {percent}% {suffix}', end=print_end)
    # Print New Line on Complete
    if iteration == total:
        print()
def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def sigmoid_derivative(x):
    # note: x here is the sigmoid *output*, so this computes s(x) * (1 - s(x))
    return x * (1 - x)
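A quick sanity check of the two functions above (a standalone sketch; the variable names are illustrative, not from the assignment). The key point is that `sigmoid_derivative` expects the sigmoid *output*, not the raw input:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def sigmoid_derivative(y):
    # y is assumed to be a sigmoid output, so this is s(x) * (1 - s(x))
    return y * (1 - y)

y = sigmoid(0)                 # midpoint of the sigmoid: 0.5
slope = sigmoid_derivative(y)  # the steepest slope of the sigmoid: 0.25
```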
# Next steps : Forward propagation
# This step is basically adding the hidden layer
def forward_propagation(w_hidden1, w_hidden2, w_output, x_input):
    # hidden neuron 1 (each weight list is [bias, w1, w2])
    hidden1_sum = w_hidden1[0] + w_hidden1[1] * x_input[0] + w_hidden1[2] * x_input[1]
    hidden1_output = sigmoid(hidden1_sum)
    # hidden neuron 2
    hidden2_sum = w_hidden2[0] + w_hidden2[1] * x_input[0] + w_hidden2[2] * x_input[1]
    hidden2_output = sigmoid(hidden2_sum)
    # output layer
    output_sum = w_output[0] + w_output[1] * hidden1_output + w_output[2] * hidden2_output
    output = sigmoid(output_sum)
    # now return all of the outputs
    return hidden1_output, hidden2_output, output
    # what this returns is the output of the hidden neurons and the output layer
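To see the forward pass in isolation, here is a self-contained sketch that calls it on one input. The weight values and the `[bias, w1, w2]` layout are assumptions for illustration, not values from the assignment:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def forward_propagation(w_hidden1, w_hidden2, w_output, x_input):
    # each weight list is assumed to be laid out as [bias, w1, w2]
    h1 = sigmoid(w_hidden1[0] + w_hidden1[1] * x_input[0] + w_hidden1[2] * x_input[1])
    h2 = sigmoid(w_hidden2[0] + w_hidden2[1] * x_input[0] + w_hidden2[2] * x_input[1])
    out = sigmoid(w_output[0] + w_output[1] * h1 + w_output[2] * h2)
    return h1, h2, out

# example weights and the XOR input (1, 0)
h1, h2, out = forward_propagation([0.1, 0.2, 0.3],
                                  [0.4, 0.5, 0.6],
                                  [0.7, 0.8, 0.9],
                                  [1, 0])
```

Because every unit ends in a sigmoid, all three returned values are guaranteed to lie strictly between 0 and 1.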
# training function we did this by hand
def training(w_hidden1, w_hidden2, w_output, x_input, target, eta):
    # forward pass
    hidden1_output, hidden2_output, output = forward_propagation(
        w_hidden1, w_hidden2, w_output, x_input)
    # printing the results to make sure this works
    print(f"Input: {x_input}  Actual Output: {output:.4f}  Target: {target}")
    # calculate the error delta at the output (includes the sigmoid derivative);
    # we want to minimize this in order to classify correctly
    output_error = (target - output) * sigmoid_derivative(output)
    # error for hidden neuron 1
    hidden1_error = output_error * w_output[1] * sigmoid_derivative(hidden1_output)
    # error for hidden neuron 2
    hidden2_error = output_error * w_output[2] * sigmoid_derivative(hidden2_output)
    # NEXT STEP: UPDATE THE OUTPUT LAYER WEIGHTS (index 0 is the bias)
    w_output[0] += eta * output_error
    w_output[1] += eta * output_error * hidden1_output
    w_output[2] += eta * output_error * hidden2_output
    # update the hidden layer weights at each hidden neuron
    w_hidden1[0] += eta * hidden1_error
    w_hidden1[1] += eta * hidden1_error * x_input[0]
    w_hidden1[2] += eta * hidden1_error * x_input[1]
    w_hidden2[0] += eta * hidden2_error
    w_hidden2[1] += eta * hidden2_error * x_input[0]
    w_hidden2[2] += eta * hidden2_error * x_input[1]
    return w_hidden1, w_hidden2, w_output
# train one full epoch
def epoch_XOR(w_hidden1, w_hidden2, w_output, eta):
    # XOR function this time: all four patterns of the XOR truth table
    w_hidden1, w_hidden2, w_output = training(w_hidden1, w_hidden2, w_output, [0, 0], 0, eta)
    w_hidden1, w_hidden2, w_output = training(w_hidden1, w_hidden2, w_output, [0, 1], 1, eta)
    w_hidden1, w_hidden2, w_output = training(w_hidden1, w_hidden2, w_output, [1, 0], 1, eta)
    w_hidden1, w_hidden2, w_output = training(w_hidden1, w_hidden2, w_output, [1, 1], 0, eta)
    return w_hidden1, w_hidden2, w_output
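One way to finish the training is a loop over the four XOR patterns with a stopping criterion on the sum of squared errors. The sketch below inlines the forward and backward passes so it runs on its own; the random seed, `eta`, `max_epochs`, and the `0.01` threshold are assumed values, not ones taken from the assignment:

```python
import math
import random

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

random.seed(0)  # assumed seed, just for reproducibility
w_h1 = [random.uniform(-1, 1) for _ in range(3)]  # [bias, w1, w2]
w_h2 = [random.uniform(-1, 1) for _ in range(3)]
w_out = [random.uniform(-1, 1) for _ in range(3)]
eta = 0.5            # assumed learning rate
max_epochs = 20000   # assumed epoch cap
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

for epoch in range(max_epochs):
    sse = 0.0  # sum of squared errors over this epoch
    for x, t in data:
        # forward pass
        h1 = sigmoid(w_h1[0] + w_h1[1] * x[0] + w_h1[2] * x[1])
        h2 = sigmoid(w_h2[0] + w_h2[1] * x[0] + w_h2[2] * x[1])
        o = sigmoid(w_out[0] + w_out[1] * h1 + w_out[2] * h2)
        # backward pass: each delta includes the sigmoid derivative
        d_o = (t - o) * o * (1 - o)
        d_h1 = d_o * w_out[1] * h1 * (1 - h1)
        d_h2 = d_o * w_out[2] * h2 * (1 - h2)
        # weight updates (index 0 is the bias)
        w_out[0] += eta * d_o
        w_out[1] += eta * d_o * h1
        w_out[2] += eta * d_o * h2
        w_h1[0] += eta * d_h1
        w_h1[1] += eta * d_h1 * x[0]
        w_h1[2] += eta * d_h1 * x[1]
        w_h2[0] += eta * d_h2
        w_h2[1] += eta * d_h2 * x[0]
        w_h2[2] += eta * d_h2 * x[1]
        sse += (t - o) ** 2
    if sse < 0.01:  # assumed stopping threshold
        break
```

Note that a 2-2-1 network on XOR can occasionally get stuck in a local minimum depending on the initial weights, which is why keeping the epoch cap alongside the error threshold matters.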
# driver code
def main():
    print_header("Backpropagation Learning Algorithm for XOR")
    # Initialize the weights as [w0 (bias), w1, w2] per neuron;
    # any small starting values work, these are just examples
    w_hidden1 = [0.1, 0.2, 0.3]   # w0 w1 w2
    w_hidden2 = [0.4, 0.5, 0.6]
    w_output = [0.7, 0.8, 0.9]
    eta = 0.5  # learning rate
    training_data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    # Introducing a stopping criterion
    max_epochs = 10000
    error_threshold = 0.05
    print_subheader("Training Neural Network...")
    print("Progress: ", end="")
    # Assume there is the rest of the main here