Question:
Build a multilayer perceptron using the backpropagation algorithm in PyTorch.
Given a fully connected neural network (MLP) as follows:
- Input [x1, x2, ..., xN]: N nodes;
- M hidden fully connected layers (excluding the output layer), each with bias and 2N + 1 nodes;
- Output (Prediction): 1 node;
- Use the ReLU activation function for all hidden layers.
You are required to implement the following:
- Build this NN model using PyTorch modules (backpropagation via autograd);
- Build this NN model from scratch: implement both the forward and backward passes yourself.
- For both NN models:
  a) Generate input data [x1, x2, ..., xN] with each xi drawn from a uniform distribution over [0, 1);
  b) Generate the label y* = (1*x1 + 2*x2 + ... + N*xN)/N;
  c) Calculate the MSE loss = (Prediction - y*)^2;
  d) Use a batch size of 1, i.e., do a single forward propagation with one data point;
  e) Use N = 10, M = 10.
Step by Step Solution
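A minimal sketch of the PyTorch-module version, assuming the label formula y* = (1*x1 + ... + N*xN)/N, hidden width 2N + 1, and N = M = 10 as stated in the question (the learning rate and step count are illustrative choices, not part of the assignment):

```python
import torch
import torch.nn as nn

N, M = 10, 10  # input size and number of hidden layers, per e)

class MLP(nn.Module):
    def __init__(self, n_in=N, n_hidden_layers=M):
        super().__init__()
        width = 2 * n_in + 1  # each hidden layer has 2N + 1 nodes
        layers = []
        in_features = n_in
        for _ in range(n_hidden_layers):
            layers += [nn.Linear(in_features, width, bias=True), nn.ReLU()]
            in_features = width
        layers.append(nn.Linear(in_features, 1, bias=True))  # output: 1 node
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

model = MLP()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)  # illustrative lr
loss_fn = nn.MSELoss()

coeff = torch.arange(1, N + 1, dtype=torch.float32)  # coefficients 1..N

for step in range(1000):
    x = torch.rand(1, N)  # batch size 1, inputs in [0, 1)
    y = (x * coeff).sum(dim=1, keepdim=True) / N  # y* = (1*x1 + ... + N*xN)/N
    pred = model(x)
    loss = loss_fn(pred, y)
    optimizer.zero_grad()
    loss.backward()  # backpropagation handled by autograd
    optimizer.step()
```

Here autograd builds the computation graph during the forward pass, so `loss.backward()` computes all parameter gradients without any hand-written backward code.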
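For the from-scratch version, one possible sketch implements the forward and backward passes manually in NumPy, under the same assumptions (label y* = (1*x1 + ... + N*xN)/N, hidden width 2N + 1, N = M = 10); the initialization scheme and learning rate are illustrative:

```python
import numpy as np

N, M = 10, 10
H = 2 * N + 1  # hidden width 2N + 1
rng = np.random.default_rng(0)

# Weights and biases for M hidden layers plus the output layer.
sizes = [N] + [H] * M + [1]
Ws = [rng.normal(0, np.sqrt(2.0 / sizes[i]), (sizes[i], sizes[i + 1]))
      for i in range(len(sizes) - 1)]
bs = [np.zeros(sizes[i + 1]) for i in range(len(sizes) - 1)]

def forward(x):
    """Forward pass; caches pre-activations and activations for backprop."""
    a = x
    cache = [(None, a)]
    for i in range(len(Ws)):
        z = a @ Ws[i] + bs[i]
        a = np.maximum(z, 0.0) if i < len(Ws) - 1 else z  # ReLU on hidden layers only
        cache.append((z, a))
    return a, cache

def backward(cache, y, lr=1e-3):
    """Manual backprop of the MSE loss (pred - y)^2 for one data point."""
    _, pred = cache[-1]
    grad = 2.0 * (pred - y)  # dL/dpred (linear output, no activation)
    for i in reversed(range(len(Ws))):
        z, _ = cache[i + 1]
        if i < len(Ws) - 1:
            grad = grad * (z > 0)  # ReLU derivative on hidden layers
        _, a_prev = cache[i]
        dW = np.outer(a_prev, grad)
        db = grad
        grad = grad @ Ws[i].T  # propagate gradient to the previous layer
        Ws[i] -= lr * dW       # SGD update
        bs[i] -= lr * db

coeff = np.arange(1, N + 1)
for step in range(1000):
    x = rng.random(N)              # batch size 1, inputs in [0, 1)
    y = (coeff * x).sum() / N      # y* = (1*x1 + ... + N*xN)/N
    pred, cache = forward(x)
    loss = (pred[0] - y) ** 2
    backward(cache, y)
```

The backward pass mirrors what autograd does in the PyTorch version: the chain rule is applied layer by layer, masking the gradient with the ReLU derivative on hidden layers, and the old weights are used to propagate the gradient before each update.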
