Dimensionality Reduction Using PCA
Project due Jul 10, 2024 17:29 IST
PCA finds (orthogonal) directions of maximal variation in the data. In this problem we're going to project our
data onto the principal components and explore the effects on performance.
You will be working in the files part1/main.py and part1/features.py in this problem.
Project onto Principal Components
3.0 points (graded)
Fill in the function in features.py that implements PCA dimensionality reduction of the dataset x.
Note that to project a given $n \times d$ dataset $x$ into its $k$-dimensional PCA representation, one can use matrix multiplication after first centering $x$:

$\widetilde{x}V$

where $\widetilde{x}$ is the centered version of the original data $x$, using the mean learned from the training data, and $V$ is the $d \times k$ matrix whose columns are the top $k$ eigenvectors of $\widetilde{x}^T\widetilde{x}$. Because the eigenvectors have unit norm, there is no need to divide by their length.
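As an illustration only (this helper and its name are hypothetical, not part of the assignment stubs), the principal components and feature means described above could be derived from a training set like this:

```python
import numpy as np

def principal_components(X):
    """Hypothetical helper: compute the feature means and the full principal
    component matrix V from a training set X of shape (n, d), following the
    formulas above. Not part of the assignment files."""
    feature_means = X.mean(axis=0)       # per-feature mean, shape (d,)
    centered = X - feature_means         # the centered data, x-tilde
    scatter = centered.T @ centered      # d x d matrix x-tilde^T x-tilde
    # eigh returns eigenvalues of a symmetric matrix in ascending order,
    # so reverse the columns to put the top eigenvectors first.
    _, eigenvectors = np.linalg.eigh(scatter)
    pcs = eigenvectors[:, ::-1]
    return pcs, feature_means
```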
Function input: You are given the full principal component matrix $V$ as pcs and the feature means computed from the training data set as feature_means. Note that pcs and feature_means are learned from the training data set and must not be recomputed inside this function from x.
Available Functions: You have access to the NumPy python library as np. The function returns a new data array in which each sample in x has been centered: centered_x = x - feature_means.
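Putting the pieces together, the required projection can be written in a few lines. The sketch below assumes the inputs described above; the exact function name and argument order in features.py may differ:

```python
import numpy as np

def project_onto_pc(x, pcs, n_components, feature_means):
    """Sketch of PCA projection: center x with the training-set means,
    then multiply by the top n_components principal components."""
    centered_x = x - feature_means   # center using means learned from training
    V = pcs[:, :n_components]        # d x k matrix of the top k eigenvectors
    return centered_x @ V            # (n, k) PCA representation of x
```

For example, projecting a test matrix of shape (n, d) with n_components = k returns an (n, k) array whose columns are the coordinates along the top k principal components.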