Question: PLEASE HELP IN PYTHON
Multilayer perceptrons are a type of artificial neural network (ANN) that use simple models to solve difficult computational tasks such as prediction. For example, if you use facial recognition to unlock your phone, you input the data (the image of your face) when setting up the program. Then, the phone makes a decision based on your input: it either recognizes your face and opens the application, or it does not.
There are a multitude of free, open-source tools that manage complex machine learning tasks and help you acquire skills faster. These tools are called libraries. In this assignment, you use high-level mainstream AI libraries to complete computations for machine learning. The code created with these libraries can serve as a template for future work problems and can be used by anyone with access to the library when making inferences from data, mining data, or classifying data. All of these libraries are useful in real-world applications, as they reduce the amount of coding work for programmers.
For this assignment, fill in the missing code and explain the code in the sections that require it below.
# BackPropagation:
#The following class, neuralnetwork, creates neural-network objects that can be trained and tested using the back-propagation algorithm. Be aware that this is very inefficient code, meant to show the details of the algorithm.
#Weights are not initialized completely randomly, which is why this class has a method called initialize_weights with two different options for initializing them. These methods are beyond the scope of this course, so do not worry about understanding them.
#Also, there are multiple options for the transfer functions of the neurons. By and large, we can select one type for the neurons in all inner layers and one for the neurons in the last (output) layer. The options are: tangh (tansig), lineal, relu, and sigmoid.
#The main method is train, and that is precisely the one you are asked to contribute to. There are two lines of code missing (marked 'YOUR CODE TO FIND DELTA Z HERE!!!') and you have to complete them. They both have to do with the calculation of the delta (sensitivity) Z in the output and inner layers. Please complete them so that the rest of the code can be run and the results resemble those already given.
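As background for the two missing lines, here is a hedged, standalone sketch of the textbook back-propagation sensitivity (delta) recursion for a squared-error cost. The function name backprop_deltas and its argument conventions are illustrative assumptions for this example only; they are not part of the neuralnetwork class below.

```python
import numpy as np

# Illustrative standalone sketch of the standard delta recursion.
# backprop_deltas and its calling conventions are assumptions, not class code.
def backprop_deltas(Ws, Zs, e, dF_out, dF_in):
    """Ws[i]: weight matrix of layer i; Zs[i]: pre-activation of layer i;
    e = T - a_last (output error); dF_out/dF_in: elementwise derivatives
    of the output-layer and inner-layer transfer functions."""
    L = len(Ws)
    deltas = [None] * L
    # output layer: delta_L = -2 * F'(z_L) * e   (squared-error cost)
    deltas[L - 1] = -2 * dF_out(Zs[L - 1]) * e
    # inner layers: delta_m = F'(z_m) * (W_{m+1}.T @ delta_{m+1})
    for m in range(L - 2, -1, -1):
        deltas[m] = dF_in(Zs[m]) * (Ws[m + 1].T @ deltas[m + 1])
    return deltas
```

The recursion walks backwards: the output-layer delta is driven by the error, and each inner-layer delta propagates the next layer's delta back through that layer's weights, scaled by the local transfer-function derivative.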
import numpy as np

class neuralnetwork:
    def __init__(self, P, T, layers, tol, alpha, ite,
                 infun='tangh', outfun='lineal', weightini=1):
        self.P = P                  # input patterns, one column per sample
        self.T = T                  # target outputs
        self.layers = layers        # list with the number of neurons per layer
        self.tol = tol              # error tolerance to stop training
        self.alpha = alpha          # learning rate
        self.ite = ite              # maximum number of iterations
        self.infun = infun          # transfer function of the inner layers
        self.outfun = outfun        # transfer function of the output layer
        self.weightini = weightini  # weight-initialization option
        self.Ws = list()
        self.Bs = list()
        self.initialize_weights()
    def initialize_weights(self):
        size_entry = self.P.shape[0]
        # The option values 1 and 2 are assumed here; the original selector
        # values were lost in the extraction.
        if self.weightini == 1:
            for i in range(len(self.layers)):  # random weight matrices for all layers
                self.Ws.append(np.random.rand(self.layers[i], size_entry))
                self.Bs.append(np.random.rand(self.layers[i], 1))  # bias
                size_entry = self.Ws[i].shape[0]
        elif self.weightini == 2:
            for i in range(len(self.layers)):  # scaled random weight matrices for all layers
                self.Ws.append(np.random.rand(self.layers[i], size_entry) / np.sqrt(size_entry))
                self.Bs.append(np.random.rand(self.layers[i], 1))  # bias
                size_entry = self.Ws[i].shape[0]
        else:
            print("Not recognized method for weights initialization")
    def tansig(self, x, derivative=False):
        arg = (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))
        return 1 - np.square(arg) if derivative else arg  # d/dx tanh(x) = 1 - tanh(x)**2

    def lineal(self, x, derivative=False):
        return 1 if derivative else x

    def relu(self, x, derivative=False):
        # derivative of relu is 1 where x > 0 and 0 elsewhere
        return (x > 0).astype(float) if derivative else np.maximum(np.zeros(x.shape), x)

    def sigmoid(self, x, derivative=False):
        arg = 1 / (1 + np.exp(-x))
        return np.multiply(arg, 1 - arg) if derivative else arg
    def activation_fun(self, arg, fun, derivative=False):
        if fun == 'lineal':
            net = self.lineal(arg, derivative)  # linear output
        elif fun == 'tangh':
            net = self.tansig(arg, derivative)  # hyperbolic-tangent output
        elif fun == 'relu':
            net = self.relu(arg, derivative)
        elif fun == 'sigmoid':
            net = self.sigmoid(arg, derivative)
        else:
            print("Not recognized function", fun)
        return net
    def test(self):
        As = []  # A is the output a of every layer
        Zs = []  # Z is the net input (pre-activation) of every layer
        As.append(self.P)
        Zs.append(False)
        for i in range(len(self.Ws)):
            arg = np.dot(self.Ws[i], As[i]) + self.Bs[i]
            if i == len(self.Ws) - 1:  # if it is the output layer
                net = self.activation_fun(arg, self.outfun, derivative=False)
            else:  # it is an inner layer
                net = self.activation_fun(arg, self.infun, derivative=False)
            As.append(net)
            Zs.append(arg)
        return As, Zs
    def calculate_mse(self, e):
        # mean squared error over all outputs and samples
        error = np.square(e).sum(axis=0).sum(axis=0) / e.shape[1]
        return error
    def train(self):
        As, Zs = self.test()  # try the net with the new random weights
        e = self.T - As[-1]
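The listing breaks off inside train. After the deltas have been computed, the usual continuation is a steepest-descent weight update. The following standalone sketch shows its typical form; the name update_weights and the bias-update convention (summing each delta over the samples) are assumptions for this example, not the class's own method.

```python
import numpy as np

# Illustrative sketch of the gradient-descent step that typically follows
# the delta computation. Names and conventions here are assumptions.
def update_weights(Ws, Bs, deltas, As, alpha):
    """As[m] is the output of layer m-1 (As[0] is the input P);
    deltas[m] is the sensitivity of layer m; alpha is the learning rate."""
    for m in range(len(Ws)):
        Ws[m] = Ws[m] - alpha * deltas[m] @ As[m].T  # W <- W - alpha * delta a^T
        Bs[m] = Bs[m] - alpha * deltas[m].sum(axis=1, keepdims=True)
    return Ws, Bs
```

Each weight matrix moves against the gradient of the squared error, with the bias gradient obtained by summing the layer's delta across the sample columns.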