Question: In this programming assignment, please write code from scratch (do not use preexisting libraries from anywhere) for (1) linear models and (2) the gradient descent algorithm.
Task 1: Classifying MNIST data
Use the MNIST data provided along with this assignment. There are two files: MNIST_training_HW.csv and MNIST_test_HW.csv. The two datasets contain samples for each of two labels; consequently, we will solve a binary classification problem.
You will train a linear regression model using the training data (MNIST_training_HW.csv) and will compute accuracy on the test data (MNIST_test_HW.csv).
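The exact layout of the CSV files is not reproduced here, so the loader below is only a minimal sketch: it assumes there is no header row, the label sits in the first column, and the pixel values follow. Adjust the parsing to the actual files.

```python
import numpy as np

def load_mnist_csv(path):
    """Load one of the assignment's MNIST CSV files.
    Assumption: no header row, label in the first column, pixels afterwards."""
    data = np.loadtxt(path, delimiter=",")   # add skiprows=1 here if a header row exists
    y = data[:, 0]                           # binary labels
    X = data[:, 1:]                          # pixel features
    X = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend a bias column of ones
    return X, y

X_train, y_train = load_mnist_csv("MNIST_training_HW.csv")
X_test,  y_test  = load_mnist_csv("MNIST_test_HW.csv")
```

Prepending the bias column keeps the intercept inside θ, so the Normal-Equation and Gradient-Descent solutions below can use the same X.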
For Task 1, please follow this procedure (a code sketch follows the list):
Train a linear regression model
Find the optimal parameters θ₁ by the Normal Equation: θ₁ = (XᵀX)⁻¹Xᵀy. It will cause an error/warning.
You can use numpy.linalg.pinv, or pinv from MATLAB, for the matrix inverse. The function computes the Moore-Penrose pseudoinverse X⁺, which is then used to find valid optimal parameters via θ₁ = X⁺y. This is a solution to the rank-deficient (degenerate) system.
Display the optimal coefficients, denoted by θ₁.
Classify the binary test data (MNIST_test_HW.csv) with a threshold, as described below:
ŷ = X_test θ₁
if ŷ ≥ threshold, class = 1; otherwise, class = 0
Display the accuracy.
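A minimal sketch of the Task 1 steps, assuming the X_train/y_train/X_test/y_test arrays from the loading sketch above; the classification threshold did not survive in the problem text, so 0.5 is an assumption for 0/1 labels:

```python
import numpy as np

# The Normal Equation theta_1 = (X^T X)^(-1) X^T y raises a warning/error because
# X^T X is rank-deficient (many MNIST pixel columns are all zeros), so use the
# Moore-Penrose pseudoinverse instead: theta_1 = pinv(X) y.
theta_1 = np.linalg.pinv(X_train) @ y_train
print("Optimal coefficients theta_1:", theta_1)

# Classify the test data and report accuracy (0.5 threshold assumed).
y_pred = X_test @ theta_1
y_class = (y_pred >= 0.5).astype(int)
print("Task 1 test accuracy:", np.mean(y_class == y_test))
```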
Task 2: Implementation of Gradient Descent with MNIST data
For Task 2 we will use the same data as in Task 1. However, we will find the optimal coefficients by using the "Gradient Descent" algorithm and then compare them with the solution found in Task 1.
The procedure of Task 2 is almost the same as Task 1, but you need to implement the "Gradient Descent" algorithm instead of the single-line least-squares solution θ₁ = (XᵀX)⁻¹Xᵀy or, more appropriately, θ₁ = pinv(X)·y.
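The assignment only names the cost J, so the standard least-squares cost is assumed here; with that choice the gradient and the update rule used in the procedure below are:

```latex
J(\theta) = \frac{1}{2m}\lVert X\theta - y\rVert^2, \qquad
\nabla_\theta J(\theta) = \frac{1}{m} X^\top (X\theta - y), \qquad
\theta \leftarrow \theta - \alpha\,\nabla_\theta J(\theta)
```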
For the Gradient Descent algorithm, please follow this procedure (see the code sketch after the Task 2 steps):
Set the initial coefficients to zeros (they can be any random values, though).
Think about what the dimension of the coefficient vector is.
Determine the hyperparameters, such as the learning rate α and the number of iterations k.
Run "gradient descent" algorithm with the hyperparameters and check "Learning Curve" as
shown:
Learning curve shows whether it converges or not. Xaxis shows the number of iterations,
while yaxis shows cost J
Learning curve must be showing as "converged", otherwise the solution may not be good.
Display the estimated coefficients, denoted by θ₂.
Classify the test data (MNIST_test_HW.csv) with a threshold, as described below:
ŷ = X_test θ₂
if ŷ ≥ threshold, class = 1; otherwise, class = 0
Display the accuracy.
Display the aggregate difference between θ₁ and θ₂.
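A minimal gradient-descent sketch for Task 2, again assuming the arrays and 0.5 threshold from the Task 1 sketch, the least-squares cost above, and placeholder hyperparameters. The aggregate difference is computed as the sum of absolute coefficient differences, which is an assumed definition since the assignment's own formula was not preserved here.

```python
import numpy as np
import matplotlib.pyplot as plt

def gradient_descent(X, y, alpha, k):
    """Batch gradient descent on the least-squares cost J(theta).
    Returns the fitted coefficients and the cost recorded at every iteration
    so the learning curve can be plotted."""
    m, n = X.shape
    theta = np.zeros(n)                                 # initial coefficients set to zeros
    costs = []
    for _ in range(k):
        residual = X @ theta - y
        costs.append((residual @ residual) / (2 * m))   # cost J before this update
        theta -= alpha * (X.T @ residual) / m           # gradient step
    return theta, costs

# Placeholder hyperparameters; tune alpha and k until the learning curve flattens.
theta_2, costs = gradient_descent(X_train, y_train, alpha=1e-6, k=2000)

# Learning curve: iterations on the x-axis, cost J on the y-axis.
plt.plot(costs)
plt.xlabel("iteration")
plt.ylabel("cost J")
plt.title("Learning curve")
plt.show()

print("Estimated coefficients theta_2:", theta_2)

# Classify the test data and report accuracy (0.5 threshold assumed).
y_class = (X_test @ theta_2 >= 0.5).astype(int)
print("Task 2 test accuracy:", np.mean(y_class == y_test))

# Aggregate difference between theta_1 and theta_2 (sum of absolute
# differences is an assumed definition).
print("Aggregate difference:", np.sum(np.abs(theta_1 - theta_2)))
```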
Submission:
Please submit the following to the DL by the stipulated deadline given therein:
Summary: an MS Word or PDF file summarizing your results and discussion only. Describe what you did and how, as well as your important results.
Code: one Python/MATLAB/Octave/R/Java/C original executable source code file, or a zip file if there are multiple code files, with a clear ReadMe file. It must be well organized (comments, indentation).
Code PDF: also upload a PDF version of the original source code files, in case we would just like to have a look at the code but do not want to run it.
Please submit these THREE files SEPARATELY. DO NOT compress them into a ZIP file.