Question: Build a multi-layer perceptron using the backpropagation algorithm in PyTorch.

Build a multi-layer perceptron using the backpropagation algorithm in PyTorch.

Given a fully connected Neural Network (MLP) as follows:

  1. Input [x_1, x_2, ..., x_d]: d nodes;
  2. K hidden fully connected layers (excluding the output layer) with bias, each with 2d + 1 nodes;
  3. Output (Prediction): 1 node;
  4. Use the ReLU activation function for all hidden layers.

You are required to implement the following (illustrative sketches for each part are given after the list):

  1. Build this NN model using PyTorch Modules (backprop by autograd);
  2. Build this NN model from scratch; implement both the forward and backward functions yourself.

  3. For both NN models:
     a) Generate input data [x_1, x_2, ..., x_d] in [0, 1) drawn from a uniform distribution;
     b) Generate the label y = (1*x_1 + 2*x_2 + ... + d*x_d) / d;
     c) Calculate the MSE loss = (Prediction - y)^2;
     d) Use a batch size of 1, i.e., do a one-time forward propagation with one data point;
     e) Use d = 10, K = 10.
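
As an illustration of requirement 1, here is a minimal sketch of the network described above built from torch.nn modules, so that backpropagation is handled by autograd. The hidden width of 2*d + 1 units per layer follows the reconstructed spec in item 2, and the names MLP, d, K, and hidden are illustrative choices, not given in the question.

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """d input nodes, K hidden layers of (2d + 1) units each with ReLU, 1 output node."""
    def __init__(self, d=10, K=10, hidden=None):
        super().__init__()
        hidden = hidden if hidden is not None else 2 * d + 1
        layers = []
        in_features = d
        for _ in range(K):
            layers.append(nn.Linear(in_features, hidden, bias=True))  # hidden layer with bias
            layers.append(nn.ReLU())                                   # ReLU on every hidden layer
            in_features = hidden
        layers.append(nn.Linear(in_features, 1, bias=True))            # output layer: 1 node, no activation
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)
```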
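
As an illustration of requirement 2, here is a sketch of the same architecture with the forward and backward passes written by hand on plain tensors, never calling autograd. The class name ScratchMLP, the He-style initialization, the learning rate, and the in-place SGD update inside backward() are assumptions made for the sketch.

```python
import torch

class ScratchMLP:
    """Same architecture, but with forward and backward passes written by hand (no autograd)."""
    def __init__(self, d=10, K=10, hidden=None, lr=0.01):
        hidden = hidden if hidden is not None else 2 * d + 1
        sizes = [d] + [hidden] * K + [1]                     # widths: input, K hidden layers, output
        self.W = [torch.randn(sizes[i + 1], sizes[i]) * (2.0 / sizes[i]) ** 0.5
                  for i in range(len(sizes) - 1)]            # He-style init (an assumption)
        self.b = [torch.zeros(sizes[i + 1]) for i in range(len(sizes) - 1)]
        self.lr = lr

    def forward(self, x):
        """Forward pass for a single sample x of shape (d,); caches activations for backward."""
        self.a, self.z = [x], []
        a = x
        for i in range(len(self.W)):
            z = self.W[i] @ a + self.b[i]
            self.z.append(z)
            a = torch.relu(z) if i < len(self.W) - 1 else z  # ReLU on hidden layers only
            self.a.append(a)
        return a                                             # shape (1,): the prediction

    def backward(self, y):
        """Backprop the squared error (prediction - y)^2 for one sample, then take an SGD step."""
        delta = 2.0 * (self.a[-1] - y)                       # dL/d(prediction)
        for i in reversed(range(len(self.W))):
            if i < len(self.W) - 1:
                delta = delta * (self.z[i] > 0).float()      # ReLU derivative on hidden layers
            grad_W = delta.unsqueeze(1) @ self.a[i].unsqueeze(0)
            grad_b = delta
            delta = self.W[i].t() @ delta                    # propagate with the pre-update weights
            self.W[i] -= self.lr * grad_W
            self.b[i] -= self.lr * grad_b
```

Note that the gradient is propagated to the previous layer before the current layer's weights are overwritten, so every layer backpropagates through the weights that were used in the forward pass.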
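
Finally, a sketch of the training setup from item 3: one uniformly sampled data point per step (batch size 1), the label y = (1*x_1 + 2*x_2 + ... + d*x_d) / d as reconstructed above (the original formula was garbled, so the coefficients are an assumption), the squared-error loss, and d = K = 10. It reuses the MLP and ScratchMLP classes from the two sketches above; the learning rate and the number of steps are arbitrary.

```python
import torch
import torch.nn as nn

d, K = 10, 10
torch.manual_seed(0)

def sample_point(d):
    """One data point: x ~ U[0, 1)^d and a label built as a weighted average of x.
    The coefficients 1..d are a guess at the garbled label formula in item 3(b)."""
    x = torch.rand(d)                                        # uniform in [0, 1)
    coeff = torch.arange(1, d + 1, dtype=torch.float32)
    y = (coeff * x).sum() / d
    return x, y

# Train the nn.Module version (requirement 1), batch size 1.
model = MLP(d=d, K=K)                                        # class from the first sketch
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()
for step in range(1000):
    x, y = sample_point(d)
    pred = model(x).squeeze()                                # forward pass on one sample
    loss = loss_fn(pred, y)                                  # (prediction - y)^2
    optimizer.zero_grad()
    loss.backward()                                          # backprop via autograd
    optimizer.step()
    if step % 200 == 0:
        print(f"autograd model, step {step}: loss {loss.item():.6f}")

# Train the from-scratch version (requirement 2) the same way.
scratch = ScratchMLP(d=d, K=K)                               # class from the second sketch
for step in range(1000):
    x, y = sample_point(d)
    pred = scratch.forward(x)                                # manual forward pass
    loss = (pred - y) ** 2                                   # MSE for this single sample
    scratch.backward(y)                                      # manual backprop + SGD update
    if step % 200 == 0:
        print(f"from-scratch model, step {step}: loss {loss.item():.6f}")
```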
