Question: Suppose the data set is X = [x_1, x_2, ..., x_N] with class labels y in {C_1, C_2}. The task is to show that the Fisher criterion J(w) can be considered a special case of least squares.

a. [20 pts, paper] Adopt the target coding t_n = N/N_1 for class C_1 and t_n = -N/N_2 for class C_2, and define the sum-of-squares error function

    E = (1/2) * sum_{n=1}^{N} (w^T x_n + w_0 - t_n)^2.

Show that setting the gradient of E to zero leads to

    (S_W + (N_1 N_2 / N) S_B) w = N (m_1 - m_2),    (1)

where N_1 and N_2 are the numbers of points in classes C_1 and C_2, and N is the total number of observations. Hint: with this target coding the total sum of the targets is sum_{n=1}^{N} t_n = N_1 (N/N_1) - N_2 (N/N_2) = 0, and the mean of the total data set is m = (1/N) (N_1 m_1 + N_2 m_2).

b. [2 pts] Use the following code to generate the training data set. It generates a random data set with four features and two classes.

from sklearn import datasets
X, y = datasets.make_blobs(n_samples=200, n_features=4, centers=2, cluster_std=2, random_state=100)

c. [3 pts] Using scikit-learn's LDA, fit the training data and determine the weight vector w.

d. [15 pts] Note that S_B w in Equation (1) is always in the direction of (m_1 - m_2). Compute the w vector from Equation (1) and compare its direction with the w vector from part c. For this part, use NumPy only.
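For part (a), one way the derivation can go (this is a sketch of the standard argument for this classic problem, as in Bishop's PRML Section 4.1.5, not a full solution):

```latex
% Sketch of part (a). With targets t_n = N/N_1 for class C_1 and
% t_n = -N/N_2 for class C_2, the target sum vanishes:
%   \sum_n t_n = N_1 (N/N_1) - N_2 (N/N_2) = 0.
\begin{align}
E &= \frac{1}{2}\sum_{n=1}^{N}\left(\mathbf{w}^{\top}\mathbf{x}_n + w_0 - t_n\right)^2 \\
% Setting the derivative with respect to the bias to zero and using
% \sum_n t_n = 0 gives the bias in terms of the total mean:
\frac{\partial E}{\partial w_0} = 0
  &\;\Longrightarrow\; w_0 = -\mathbf{w}^{\top}\mathbf{m},
  \qquad \mathbf{m} = \frac{1}{N}\left(N_1\mathbf{m}_1 + N_2\mathbf{m}_2\right) \\
% Setting the derivative with respect to w to zero, substituting w_0,
% and regrouping terms into the within-class scatter S_W and the
% between-class scatter S_B = (m_1 - m_2)(m_1 - m_2)^T yields Eq. (1):
\frac{\partial E}{\partial \mathbf{w}} = 0
  &\;\Longrightarrow\;
  \left(\mathbf{S}_W + \frac{N_1 N_2}{N}\,\mathbf{S}_B\right)\mathbf{w}
  = N\left(\mathbf{m}_1 - \mathbf{m}_2\right)
\end{align}
```

Since S_B w lies along (m_1 - m_2), the matrix on the left only rescales w's component in that direction, which is what part (d) exploits: w is proportional to S_W^{-1}(m_1 - m_2), the Fisher direction.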
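Parts (b)-(d) can be sketched as follows. This is an illustrative outline, not the graded solution: it generates the data with the given `make_blobs` call, fits scikit-learn's `LinearDiscriminantAnalysis` for part (c), and for part (d) computes the Fisher direction w proportional to S_W^{-1}(m_1 - m_2) with NumPy, then compares the two directions by cosine similarity.

```python
import numpy as np
from sklearn import datasets
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Part (b): random data set with four features and two classes.
X, y = datasets.make_blobs(n_samples=200, n_features=4, centers=2,
                           cluster_std=2, random_state=100)

# Part (c): fit LDA and read off the weight vector.
lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
w_lda = lda.coef_.ravel()

# Part (d): since S_B w is always in the direction of (m1 - m2),
# Equation (1) implies w is proportional to S_W^{-1} (m1 - m2).
X1, X2 = X[y == 0], X[y == 1]
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
# Within-class scatter: sum of outer products of centered points.
S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
w_fisher = np.linalg.solve(S_W, m1 - m2)

# Compare directions: cosine similarity should be close to +/-1,
# since both vectors are defined only up to scale (and sign).
cos = w_lda @ w_fisher / (np.linalg.norm(w_lda) * np.linalg.norm(w_fisher))
print(abs(cos))
```

Both vectors point along the same line because scikit-learn's binary LDA weight vector is itself proportional to the pooled-covariance inverse times the mean difference, and S_W differs from the pooled covariance only by a scalar factor.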
