Question: Hi, I have a problem with the same question. At first, I got the answer from Chegg here, but after submitting twice I still get the same error.
Please take note of the DecisionTreeClassifier max_depth part. I changed max_depth=None (first submission) to max_depth=2 (second submission), but I still get the same error. Please help me with the code. Thank you.
The question and my answer:
Part B [5 points]: Build a decision tree classifier using the sklearn toolbox. Then compute performance metrics such as precision and recall. This is a binary classification problem, so we can label all points as either positive (SPAM) or negative (NOT SPAM).
def build_dt(data_X, data_y, max_depth=None, max_leaf_nodes=None):
    '''
    This function builds the decision tree classifier and fits it to the provided data.

    Arguments:
        data_X - a np.ndarray
        data_y - a np.ndarray
        max_depth - None if unrestricted, otherwise an integer for the maximum depth the tree can reach.

    Returns:
        A trained DecisionTreeClassifier
    '''
    # your code here
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import precision_recall_fscore_support

    clf = DecisionTreeClassifier(max_depth=2)
    clf.fit(data_X, data_y)

    # Get the metrics
    precision, recall, fscore, support = precision_recall_fscore_support(data_y, clf.predict(data_X))

    # Print the results
    print("Precision:", precision)
    print("Recall:", recall)
    print("F-score:", fscore)
    print("Support:", support)

    return clf

The error after both submissions:
AssertionError                            Traceback (most recent call last)
Traceback Redacted
AssertionError: Look at Problem 2, part B. Is your decision tree the proper depth?
Please help me with the code. Thank you so much.
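For reference, here is a minimal sketch of the likely fix, assuming the autograder calls build_dt with different max_depth values and then checks the depth of the returned tree. The key change is to forward the function's max_depth and max_leaf_nodes arguments to the classifier instead of hardcoding max_depth=2. The separate evaluate_dt helper is hypothetical, shown only to keep the metric printing out of the graded function.

from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import precision_recall_fscore_support

def build_dt(data_X, data_y, max_depth=None, max_leaf_nodes=None):
    '''Build and fit a decision tree, honoring the max_depth and max_leaf_nodes arguments.'''
    # Pass the arguments through instead of hardcoding a depth,
    # so the tree depth matches whatever the grader requests.
    clf = DecisionTreeClassifier(max_depth=max_depth, max_leaf_nodes=max_leaf_nodes)
    clf.fit(data_X, data_y)
    return clf

def evaluate_dt(clf, data_X, data_y):
    '''Compute precision, recall, F-score, and support on the given data (hypothetical helper).'''
    precision, recall, fscore, support = precision_recall_fscore_support(
        data_y, clf.predict(data_X))
    print("Precision:", precision)
    print("Recall:", recall)
    print("F-score:", fscore)
    print("Support:", support)
    return precision, recall, fscore, support

With this version, build_dt(data_X, data_y, max_depth=2) produces a tree of depth at most 2, and build_dt(data_X, data_y) leaves the depth unrestricted, which is presumably what the depth assertion is testing.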
