Question: Assignment 7 & 8: Training MLP

In this assignment, you are asked to train two different MLPs and apply them to the created XOR dataset. Additionally, you need to discuss your results. Answer each task in the designated place and run all the code.

For coding parts, fill in your answer at:
#Your code goes to here
For text answers, fill in your answer at:
Your answer goes to here
and answer in plain text.

Consider an XOR dataset as below:

```python
import numpy as np

# Create XOR dataset: a classical non-linearly separable dataset.
# NOTE: the original means, covariances, and sample counts were garbled in
# this copy; the values below are assumptions in the usual XOR layout.

# Generate data for class A (clusters on one diagonal)
mean_1, cov_1 = [0, 0], [[0.1, 0], [0, 0.1]]
class_a_data_1 = np.random.multivariate_normal(mean_1, cov_1, 100)
mean_2, cov_2 = [1, 1], [[0.1, 0], [0, 0.1]]
class_a_data_2 = np.random.multivariate_normal(mean_2, cov_2, 100)
class_a_data = np.concatenate((class_a_data_1, class_a_data_2), axis=0)
class_a_labels = np.zeros(len(class_a_data))

# Generate data for class B (clusters on the other diagonal)
mean_3, cov_3 = [0, 1], [[0.1, 0], [0, 0.1]]
class_b_data_1 = np.random.multivariate_normal(mean_3, cov_3, 100)
mean_4, cov_4 = [1, 0], [[0.1, 0], [0, 0.1]]
class_b_data_2 = np.random.multivariate_normal(mean_4, cov_4, 100)
class_b_data = np.concatenate((class_b_data_1, class_b_data_2), axis=0)
class_b_labels = np.ones(len(class_b_data))

# Combine data and labels
X = np.concatenate((class_a_data, class_b_data), axis=0)
y = np.concatenate((class_a_labels, class_b_labels), axis=0)
```

Visualize the data:

```python
import matplotlib.pyplot as plt

plt.scatter(class_a_data[:, 0], class_a_data[:, 1])
plt.scatter(class_b_data[:, 0], class_b_data[:, 1])
```

Convert each class to a float tensor:

```python
import torch

c1_tensor = torch.from_numpy(class_a_data).float()
c2_tensor = torch.from_numpy(class_b_data).float()
```

Task 1: Training an MLP model with a d-dimensional hidden layer

Recall that the example shown in class can be described as follows: given a data point x, we (1) transfer it into a hidden feature h, (2) pass h through a sigmoid function, and (3) transfer the result to the output o, again converting it via a sigmoid. Now, in this task, you need to build an MLP model that consists of two linear layers; given a data point x, it carries out the same process: transfer x into a hidden feature, pass it through a sigmoid function, then transfer it to the output via a sigmoid.
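The two-linear-layer process just described can be sketched as follows. This is a hedged illustration, not the official solution: the class name `TwoLayerMLP`, the hidden width of 8, and the use of `nn.Linear` with `torch.sigmoid` are assumptions about one reasonable way to realize the described pipeline.

```python
import torch
import torch.nn as nn

class TwoLayerMLP(nn.Module):
    """Sketch of the described process: linear -> sigmoid -> linear -> sigmoid."""

    def __init__(self, in_dim=2, hidden_dim=8):  # hidden_dim is an assumption
        super().__init__()
        self.layer1 = nn.Linear(in_dim, hidden_dim)  # x -> hidden feature h
        self.layer2 = nn.Linear(hidden_dim, 1)       # h -> scalar output o

    def forward(self, x):
        h = torch.sigmoid(self.layer1(x))  # nonlinearity on the hidden feature
        o = torch.sigmoid(self.layer2(h))  # squash output into (0, 1)
        return o

# Quick shape check on a batch of 5 random points.
mlp = TwoLayerMLP()
out = mlp(torch.randn(5, 2))
print(out.shape)  # torch.Size([5, 1])
```

Because both layers end in a sigmoid, the output can be read directly as a class confidence in (0, 1), which is what the loss function in the assignment expects.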
In symbols, both pipelines have the form x ∈ R^2 → h ∈ R^d → σ(h) ∈ R^d → o ∈ R, with a final sigmoid so the output lies in (0, 1).

```python
import torch
import torch.nn as nn

class MultilayerPerceptronQ(nn.Module):
    def __init__(self):
        super(MultilayerPerceptronQ, self).__init__()
        #Your code goes to here

    def forward(self, x):
        #Your code goes to here
        return None
```

```python
def negative_likeihood(model, katydid_tensor, grasshopper_tensor):
    o_1 = model(katydid_tensor)   # confidence that a data point belongs to the katydid class
    likeihood_1 = torch.log(o_1)
    o_2 = model(grasshopper_tensor)
    likeihood_2 = torch.log(1 - o_2)
    overall_loss = -(likeihood_1.sum() + likeihood_2.sum())
    return overall_loss
```

```python
# randomly hold out samples to evaluate the model
n_holdout = 20  # assumed; the original holdout count was garbled in this copy
indices_holdout_c1 = torch.randperm(len(c1_tensor))[:n_holdout]
indices_holdout_c2 = torch.randperm(len(c2_tensor))[:n_holdout]
c1_tensor_holdout = c1_tensor[indices_holdout_c1]
c2_tensor_holdout = c2_tensor[indices_holdout_c2]

# remove holdout samples from training data c1_tensor and c2_tensor
vector_c1 = torch.zeros(len(c1_tensor))
for i in indices_holdout_c1:
    vector_c1[i] = 1
c1_tensor_train = c1_tensor[vector_c1 == 0]
vector_c2 = torch.zeros(len(c2_tensor))
for i in indices_holdout_c2:
    vector_c2[i] = 1
c2_tensor_train = c2_tensor[vector_c2 == 0]
```

```python
# gradient descent
import torch.optim as optim

model = MultilayerPerceptronQ()
op = optim.SGD(model.parameters(), lr=0.01)  # learning rate assumed; original value garbled
loss_lst = []
loss_holdout_lst = []
n_epoch = 1000  # assumed; original value garbled
for i in range(n_epoch):
    # training
    loss = negative_likeihood(model, c1_tensor_train, c2_tensor_train)  # compute loss on training data
    op.zero_grad()   # clean cached gradients
    loss.backward()  # compute gradient
    op.step()        # gradient descent step
    # validation: see the performance on data unseen by the model
    with torch.no_grad():
        loss_holdout = negative_likeihood(model, c1_tensor_holdout, c2_tensor_holdout)  # loss on holdout data
    loss_lst.append(loss.item())
    loss_holdout_lst.append(loss_holdout.item())
print(loss)
print(loss_holdout)
```

Task 2: Write down your conclusion after visualizing the loss change and the decision boundary below:

```python
plt.plot(np.array(loss_lst))
plt.plot(np.array(loss_holdout_lst))
```

```python
# prompt: draw decision boundary of logistic regression
# Generate a grid of points for plotting the decision boundary
# (grid ranges and step are assumed; the original values were garbled)
x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.02), np.arange(y_min, y_max, 0.02))

# Create a tensor from the grid points
grid_tensor = torch.tensor(np.c_[xx.ravel(), yy.ravel()])
```
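The decision-boundary cell above stops after building the grid tensor. A hedged sketch of a likely continuation is below: evaluate the model over the grid and draw the 0.5 contour. The helper name `plot_decision_boundary`, the grid range, and the step size are assumptions (the sketch rebuilds its own grid so it runs standalone), and the model is assumed to map an (N, 2) float tensor to (N, 1) confidences in (0, 1).

```python
import numpy as np
import torch
import matplotlib
matplotlib.use("Agg")  # render off-screen so the sketch runs headless
import matplotlib.pyplot as plt

# Assumed grid over a region containing the XOR clusters.
xx, yy = np.meshgrid(np.arange(-1.5, 2.5, 0.05), np.arange(-1.5, 2.5, 0.05))
grid_tensor = torch.tensor(np.c_[xx.ravel(), yy.ravel()], dtype=torch.float32)

def plot_decision_boundary(model):
    """Shade each grid point by model confidence and draw the 0.5 contour."""
    with torch.no_grad():
        zz = model(grid_tensor).numpy().reshape(xx.shape)
    plt.contourf(xx, yy, zz, levels=[0.0, 0.5, 1.0], alpha=0.3)
    plt.contour(xx, yy, zz, levels=[0.5])  # the decision boundary itself
    return zz

# Usage with a stand-in (untrained) model of the expected input/output shape:
demo_model = torch.nn.Sequential(torch.nn.Linear(2, 1), torch.nn.Sigmoid())
zz = plot_decision_boundary(demo_model)
```

After training, passing the fitted MLP instead of `demo_model` (and overlaying the scatter plots of the two classes) shows how the learned boundary separates the four XOR clusters.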
