2. [k Nearest Neighbors] (10 pts)
Consider properties of \( k \)-NN models:
a. (2 pts) Suppose that we are using \( k \)-NN with just two training points, which have different (binary) labels. Assuming we are using \( k = 1 \) and Euclidean distance, what is the decision boundary? Include a drawing with a brief explanation. (A numerical sketch of this two-point setup appears after this list.)
b. (2 pts) For binary classification, given infinitely many data points, can \( k \)-NN with \( k = 1 \) express any decision boundary? If yes, describe the (infinite) dataset you would use to realize a given classification decision boundary. If no, give an example of a decision boundary that cannot be achieved.
c. (2 pts) Suppose we take \( k \rightarrow \infty \); what type of function does the resulting model family become?
d. (2 pts) What effect does increasing the number of nearest neighbors \( k \) have on the bias-variance tradeoff? Explain your answer. [Hint: Use parts (b) and (c) in your explanation.]
e. (2 pts) In logistic regression, we learned that we can tune the threshold of the linear classifier to trade off the true negative rate and the true positive rate. Explain how we can do so for \( k \)-NNs for binary classification. [Hint: By default, \( k \)-NN uses majority vote to aggregate the labels of the \( k \) nearest neighbors; consider another option. A runnable sketch of one such option follows this list.]
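
As a quick numerical companion to part (a) (not a substitute for the requested drawing), here is a minimal Python sketch. It assumes a hypothetical two-point training set; the names `a`, `b`, and `predict_1nn` are illustrative, not part of the original problem. It checks the standard fact that any query point equidistant from the two training points, i.e. any point on the perpendicular bisector of the segment joining them, is exactly where the 1-NN label can flip.

```python
import numpy as np

# Hypothetical two-point training set with opposite binary labels.
a = np.array([0.0, 0.0])   # label 0
b = np.array([2.0, 2.0])   # label 1

def predict_1nn(x):
    """1-NN under Euclidean distance: return the label of the closer point."""
    return 0 if np.linalg.norm(x - a) <= np.linalg.norm(x - b) else 1

print(predict_1nn(np.array([0.0, 1.0])))  # 0: this query is closer to `a`

# Points on the perpendicular bisector of the segment ab are equidistant
# from both training points, so the 1-NN decision flips exactly there.
midpoint = (a + b) / 2
direction = np.array([-(b - a)[1], (b - a)[0]])   # perpendicular to a -> b
for t in (-1.0, 0.0, 1.0):
    p = midpoint + t * direction
    assert np.isclose(np.linalg.norm(p - a), np.linalg.norm(p - b))
```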
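
For part (e), the hint gestures at replacing the majority vote with a tunable rule. One minimal sketch, assuming the "other option" is thresholding the fraction of positive-labeled neighbors (the function name `knn_predict`, the toy dataset, and the choice `k=15` are all illustrative assumptions): a threshold of 0.5 recovers the majority vote, raising it makes positive predictions rarer (favoring the true negative rate), and lowering it makes them more common (favoring the true positive rate).

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=15, threshold=0.5):
    """k-NN binary classifier with a tunable decision threshold.

    threshold=0.5 recovers the default majority vote (ties broken
    toward the positive class); sweeping the threshold trades off
    the true negative rate against the true positive rate.
    """
    dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances to x
    nearest = np.argsort(dists)[:k]               # indices of the k closest points
    pos_fraction = y_train[nearest].mean()        # fraction of neighbors labeled 1
    return int(pos_fraction >= threshold)

# Toy data (an assumption for illustration): label 1 iff coordinates sum > 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X.sum(axis=1) > 0).astype(int)
query = np.array([0.1, -0.2])
print(knn_predict(X, y, query, k=15, threshold=0.3))  # permissive positive calls
```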