Question: PLEASE HELP IN PYTHON

Multilayer perceptrons are a type of artificial neural network (ANN) that uses simple models to solve difficult computational tasks such as prediction. For example, if you use facial recognition to unlock your phone, you input the data (the image of your face) when setting up the program. Then, the phone makes a decision based on your input; it either recognizes your face and opens the application or it does not.
There is a multitude of free, open-source tools, called libraries, for managing complex machine learning tasks and acquiring skills faster. In this assignment, you use high-level mainstream AI libraries to complete machine learning computations. Code written with these libraries can serve as templates, accessible to anyone using the library, for future problems involving inference from data, data mining, and data classification. All of these libraries are useful in real-world applications because they reduce the amount of coding work for programmers.
For this assignment, fill out the missing code and explain the code in the sections that require it below.
#1. BackPropagation:
#The following class (neuralnetwork) creates neural network objects that can be trained and tested using the backpropagation algorithm. Be aware that this is very inefficient code, meant to show the details of the algorithm.
#Weights are not initialized completely at random, which is why this class has a method called initialize_weights with two different options for initializing them. These methods are beyond the scope of this course, so do not worry about understanding them.
#There are also multiple options for the transfer functions of the neurons. We can select one type for the neurons in all inner layers and one for the neurons in the last (output) layer. The options are: tansig, lineal, relu and sigmoid.
#The main method is train, and it is precisely the one you are asked to complete. Two lines of code are missing (marked 'YOUR CODE TO FIND DELTA (Z) HERE!!!'), and both have to do with the calculation of Z (delta) in the output and inner layer(s). Please complete them so that the rest of the code can run and the results resemble those already given.
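Before filling in the class, it may help to see the four transfer functions and their derivatives in isolation. The following standalone snippet (using plain NumPy arrays rather than the np.matrix convention of the class below) mirrors the class's function definitions and checks each analytic derivative against a central finite-difference estimate; the sample points deliberately avoid 0, where relu is not differentiable:

```python
import numpy as np

def tansig(x, derivative=False):
    a = np.tanh(x)  # same as (e^x - e^-x)/(e^x + e^-x)
    return 1 - np.square(a) if derivative else a

def lineal(x, derivative=False):
    return np.ones_like(x) if derivative else x

def relu(x, derivative=False):
    return (x > 0).astype(float) if derivative else np.maximum(0.0, x)

def sigmoid(x, derivative=False):
    a = 1 / (1 + np.exp(-x))
    return a * (1 - a) if derivative else a

x = np.array([-2.0, -1.0, -0.5, 0.25, 1.0, 2.0])  # no 0: relu kink
h = 1e-6
for f in (tansig, lineal, relu, sigmoid):
    numeric = (f(x + h) - f(x - h)) / (2 * h)  # central finite difference
    assert np.allclose(f(x, derivative=True), numeric, atol=1e-4), f.__name__
```

If any derivative formula were wrong, the corresponding assertion would fail, so this is a quick sanity check before using the functions inside backpropagation.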
import numpy as np

class neuralnetwork:
    def __init__(self, P, T, layers, tol=0.001, alpha=0.01, ite=5000,
                 in_fun='tangh', out_fun='lineal', weight_ini=0):
        self.P = P            # input patterns (expected as np.matrix: features x samples)
        self.T = T            # targets (np.matrix: outputs x samples)
        self.layers = layers  # number of neurons per layer
        self.tol = tol        # error tolerance to stop training
        self.alpha = alpha    # learning rate
        self.ite = ite        # maximum number of iterations
        self.in_fun = in_fun    # transfer function for inner layers
        self.out_fun = out_fun  # transfer function for the output layer
        self.weight_ini = weight_ini
        self.Ws = list()
        self.Bs = list()
        self.initialize_weights()

    def initialize_weights(self):
        size_entry = self.P.shape[0]
        if self.weight_ini == 0:
            for i in range(len(self.layers)):  # create random weight matrices for all layers
                self.Ws.append(2 * np.random.rand(self.layers[i], size_entry) - 1)
                self.Bs.append(2 * np.random.rand(self.layers[i], 1) - 1)  # bias
                size_entry = self.Ws[i].shape[0]
        elif self.weight_ini == 1:
            for i in range(len(self.layers)):  # same, but scaled by sqrt(1/fan_in)
                self.Ws.append((2 * np.random.rand(self.layers[i], size_entry) - 1) * np.sqrt(1 / size_entry))
                self.Bs.append(2 * np.random.rand(self.layers[i], 1) - 1)  # bias
                size_entry = self.Ws[i].shape[0]
        else:
            print("Not recognized method for weights initialization")

    def tansig(self, x, derivative=False):
        arg = (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))
        return 1 - np.square(arg) if derivative else arg  # equivalently 2/(1+np.exp(-2*x))-1

    def lineal(self, x, derivative=False):
        return 1 if derivative else x

    def relu(self, x, derivative=False):
        return (0 + (x > 0)) if derivative else np.maximum(np.zeros(x.shape), x)

    def sigmoid(self, x, derivative=False):
        arg = 1 / (1 + np.exp(-x))
        return np.multiply(arg, 1 - arg) if derivative else arg

    def activation_fun(self, arg, fun, derivative):
        if fun == 'lineal':
            net = self.lineal(arg, derivative)  # linear output
        elif fun == 'tangh':
            net = self.tansig(arg, derivative)  # hyperbolic tangent output
        elif fun == 'relu':
            net = self.relu(arg, derivative)
        elif fun == 'sigmoid':
            net = self.sigmoid(arg, derivative)
        else:
            print("Not recognized function", fun)
        return net

    def test(self):
        As = []  # A is the output (a) of every layer
        Zs = []  # Z is the pre-activation (net input) of every layer
        As.append(self.P)
        Zs.append('False')  # placeholder: the input layer has no pre-activation
        for i in range(len(self.Ws)):
            arg = self.Ws[i] * As[i] + self.Bs[i]  # matrix product (P is an np.matrix)
            if i == len(self.Ws) - 1:  # if it is the output layer
                net = self.activation_fun(arg, self.out_fun, derivative=False)
            else:  # it is an inner layer
                net = self.activation_fun(arg, self.in_fun, derivative=False)
            As.append(net)
            Zs.append(arg)
        return As, Zs

    def calculate_mse(self, e):
        error = 0.5 * ((np.square(e).sum(axis=0)).sum(axis=1) / e.shape[1])
        return error

    def train(self):
        As, Zs = self.test()  # try the net with the new random weights
        e = self.T - As[-1]   # error; the original line referenced an undefined name A
        # ... (the remainder of train(), including the backpropagation loop with the
        # two 'YOUR CODE TO FIND DELTA (Z) HERE!!!' placeholders, is truncated in the source)
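The listing above stops short of the backpropagation loop itself, which is where the two missing delta lines go. As a standalone sketch of what those lines compute, here is a minimal one-hidden-layer network trained with the standard delta rules for the 0.5*MSE cost used in calculate_mse. The variable names (W1, Z1, D1, and so on) and the toy data are illustrative assumptions, not the names used in the class:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy regression problem: 2 inputs x 100 samples, 1 linear target
P = rng.uniform(-1, 1, (2, 100))
T = 0.7 * P[0:1] - 0.3 * P[1:2]

def tansig(x, derivative=False):
    a = np.tanh(x)
    return 1 - a ** 2 if derivative else a

# one hidden layer (tansig) + linear output, mirroring in_fun/out_fun above
W1 = rng.uniform(-1, 1, (5, 2)); B1 = rng.uniform(-1, 1, (5, 1))
W2 = rng.uniform(-1, 1, (1, 5)); B2 = rng.uniform(-1, 1, (1, 1))

alpha, m = 0.05, P.shape[1]
losses = []
for _ in range(500):
    # forward pass (the role played by test() in the class)
    Z1 = W1 @ P + B1;  A1 = tansig(Z1)
    Z2 = W2 @ A1 + B2; A2 = Z2           # linear output layer: f(x) = x
    e = T - A2
    losses.append(0.5 * np.mean(np.sum(e ** 2, axis=0)))
    # --- the two "missing" delta lines, in this sketch's notation ---
    D2 = -e * 1.0                        # output layer: -e .* f'(Z2); f' of lineal is 1
    D1 = (W2.T @ D2) * tansig(Z1, True)  # inner layer: (W2^T D2) .* f'(Z1)
    # gradient-descent weight and bias updates
    W2 -= alpha * (D2 @ A1.T) / m; B2 -= alpha * D2.mean(axis=1, keepdims=True)
    W1 -= alpha * (D1 @ P.T)  / m; B1 -= alpha * D1.mean(axis=1, keepdims=True)
```

The sign convention (delta = -e elementwise-times f'(Z)) follows from differentiating the 0.5*MSE definition in calculate_mse with e = T - A; some texts fold the minus sign into the update step instead, so adapt it to whatever convention the rest of the template expects.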
