1. [Linear Algebra] Consider the 2-dimensional data points [2, 1], [1, 0], [2, 2], [3, 2], [0, 0], [0, 2], [-1, 0].

(a) All the data points can be gathered together and shown as one matrix. Let [X]_{2x7} be that matrix. Fill in X.

(b) What point is the center of these points? Hint: calculate the mean of the values in each dimension; call the result [μ]_{2x1}.

(c) Calculate [Y]_{2x2} = (X − μ)(X − μ)^T, in which (·)^T denotes the transpose. To compute X − μ, simply subtract μ from every data point.

(d) Solve |Y − λI| = 0 to extract the values of λ, where |·| is the determinant and I is the identity matrix. The λ values are called eigenvalues.

(e) Calculate the eigenvector [v]_{2x1} corresponding to the largest eigenvalue.

(f) Compute [X̂]_{1x7} = v^T X.

Congratulations, you have performed the Principal Component Analysis (PCA) procedure, a well-known dimensionality reduction method in machine learning. In other words, you projected your 2-dimensional data onto one dimension while preserving as much of the variance as possible (i.e., losing the least information).
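The steps (a)-(f) can be checked numerically. Below is a minimal sketch using NumPy, assuming the garbled symbols in the original stand for the mean vector μ and the eigenvalue λ, and that the final projection uses the top eigenvector v as a row vector:

```python
import numpy as np

# (a) Stack the seven 2-D points as the columns of a 2x7 matrix X.
X = np.array([[2, 1, 2, 3, 0, 0, -1],
              [1, 0, 2, 2, 0, 2, 0]], dtype=float)

# (b) The center is the per-dimension mean, a 2x1 vector.
mu = X.mean(axis=1, keepdims=True)

# (c) Scatter matrix Y = (X - mu)(X - mu)^T, a 2x2 symmetric matrix.
Xc = X - mu
Y = Xc @ Xc.T

# (d)-(e) Eigen-decomposition of Y. eigh returns eigenvalues in
# ascending order, so the last column is the top eigenvector v.
lams, vecs = np.linalg.eigh(Y)
v = vecs[:, -1]

# (f) Project onto v: a 1x7 row of 1-D coordinates (the PCA scores).
X_hat = v @ X

print(mu.ravel())   # [1. 1.]
print(Y)            # [[12.  5.] [ 5.  6.]]
print(lams)         # 9 ± sqrt(34), roughly [3.17, 14.83]
```

Solving part (d) by hand gives the same result: |Y − λI| = (12 − λ)(6 − λ) − 25 = λ² − 18λ + 47 = 0, so λ = 9 ± √34.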
