Question 2. [k-Nearest Neighbors] (10 pts)

Consider properties of k-NN models:
a. (pts) Suppose that we are using k-NN with just two training points, which have different binary labels. Assuming we use k = 1 and Euclidean distance, what is the decision boundary? Include a drawing with a brief explanation.
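As a quick sanity check on part (a), the sketch below (hypothetical training points `a` and `b`, not from the question) classifies a query by whichever of the two points is closer under Euclidean distance; queries equidistant from both lie on the perpendicular bisector of the segment between them:

```python
import numpy as np

# Hypothetical two training points with opposite binary labels.
a = np.array([0.0, 0.0])  # label 0
b = np.array([2.0, 0.0])  # label 1

def predict_1nn(x):
    """1-NN with Euclidean distance: return the label of the closer point."""
    return 0 if np.linalg.norm(x - a) < np.linalg.norm(x - b) else 1

# Points closer to a get label 0; points closer to b get label 1.
print(predict_1nn(np.array([0.5, 3.0])))   # 0
print(predict_1nn(np.array([1.5, -3.0])))  # 1

# A point on the perpendicular bisector (here the line x = 1) is
# equidistant from both training points, so it sits on the boundary.
q = np.array([1.0, 5.0])
print(np.isclose(np.linalg.norm(q - a), np.linalg.norm(q - b)))  # True
```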
b. (pts) For binary classification, given infinite data points, can k-NN with k = 1 express any decision boundary? If yes, describe the infinite dataset you would use to realize a given classification decision boundary. If no, give an example of a decision boundary that cannot be achieved.
c. (pts) Suppose we take k → ∞; what type of function does the resulting model family become?
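A minimal sketch of the limiting case in part (c), using a small made-up label array: once k equals the full dataset size, every query is assigned the same neighbor set, so the prediction no longer depends on the input at all:

```python
import numpy as np

# Made-up binary labels for the whole training set.
y = np.array([0, 1, 1, 0, 1])

def knn_all(x_query):
    # With k = len(y), the "nearest" set is the entire dataset regardless
    # of x_query, so majority vote reduces to the global majority label.
    return int(y.mean() >= 0.5)

# The output is constant in the query point.
print(knn_all(-100.0), knn_all(0.0), knn_all(100.0))  # 1 1 1
```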
d. (pts) What effect does increasing the number of nearest neighbors k have on the bias-variance tradeoff? Explain your answer. Hint: use parts (b) and (c) in your explanation.
e. (pts) In logistic regression, we learned that we can tune the threshold of the linear classifier to trade off the true negative rate and the true positive rate. Explain how we can do so for k-NN for binary classification. Hint: by default, k-NN uses a majority vote to aggregate the labels of the k nearest neighbors; consider another option.
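One way to make the hint in part (e) concrete is to replace the strict majority vote with a tunable threshold on the fraction of positive neighbors. The helper below (`knn_predict` is a hypothetical name, and the tiny dataset is made up for illustration) predicts 1 when that fraction is at least `threshold`; lowering the threshold trades true negatives for true positives, and raising it does the reverse:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k, threshold=0.5):
    """k-NN for binary labels with a tunable vote threshold.

    Instead of a strict majority vote, predict 1 when the fraction of
    positive labels among the k nearest neighbors is at least `threshold`.
    """
    dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
    nearest = np.argsort(dists)[:k]               # indices of k closest
    frac_positive = y_train[nearest].mean()
    return int(frac_positive >= threshold)

# Made-up 1-D dataset for illustration.
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0]])
y = np.array([0, 0, 1, 1, 1])

# For x = 2.5 with k = 3, the neighbor labels are [1, 1, 0] -> fraction 2/3.
print(knn_predict(X, y, np.array([2.5]), k=3, threshold=0.5))  # 1
print(knn_predict(X, y, np.array([2.5]), k=3, threshold=0.9))  # 0
```

Sweeping the threshold over {0/k, 1/k, ..., k/k} traces out an ROC-style curve, analogous to sweeping the probability threshold in logistic regression.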
