Question:

The following table summarizes a data set with three attributes A, B, C and two class labels +, −. Build a two-level decision tree.

A  B  C  | Number of + | Number of −
T  T  T  |      5      |      0
F  T  T  |      0      |     10
T  F  T  |     10      |      0
F  F  T  |      0      |      5
T  T  F  |      0      |     10
F  T  F  |     25      |      0
T  F  F  |     10      |      0
F  F  F  |      0      |     25

a. According to the classification error rate, which attribute would be chosen as the first splitting attribute? For each attribute, show the contingency table and the gain in classification error rate.
b. Repeat part (a) for the two children of the root node.
c. How many instances are misclassified by the resulting decision tree?
d. Repeat parts (a), (b), and (c) using C as the first splitting attribute.
e. Use the results in parts (c) and (d) to comment on the greedy nature of the decision tree induction algorithm.
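For part (a), a short sketch can check the weighted classification error of each candidate root split. This assumes the counts in the table above were recovered correctly from the original problem; each child node predicts its majority class, so its error count is its minority count.

```python
# Data set from the table: (A, B, C, #plus, #minus)
ROWS = [
    ("T", "T", "T", 5, 0),
    ("F", "T", "T", 0, 10),
    ("T", "F", "T", 10, 0),
    ("F", "F", "T", 0, 5),
    ("T", "T", "F", 0, 10),
    ("F", "T", "F", 25, 0),
    ("T", "F", "F", 10, 0),
    ("F", "F", "F", 0, 25),
]

def split_error(attr_index):
    """Weighted classification error after splitting on one attribute.

    Each child predicts its majority class, so the instances it
    misclassifies are its minority count; the weighted error is the
    total minority count divided by the total number of instances.
    """
    counts = {}  # attribute value -> [plus, minus]
    for row in ROWS:
        c = counts.setdefault(row[attr_index], [0, 0])
        c[0] += row[3]
        c[1] += row[4]
    total = sum(p + m for p, m in counts.values())
    misclassified = sum(min(p, m) for p, m in counts.values())
    return misclassified / total

total_plus = sum(r[3] for r in ROWS)
total_minus = sum(r[4] for r in ROWS)
parent_error = min(total_plus, total_minus) / (total_plus + total_minus)

for name, idx in (("A", 0), ("B", 1), ("C", 2)):
    err = split_error(idx)
    print(f"split on {name}: error = {err:.2f}, gain = {parent_error - err:.2f}")
```

With these counts the parent error is 0.5 (50 instances of each class), and the splits give errors of 0.35 for A, 0.40 for B, and 0.50 for C, i.e. gains of 0.15, 0.10, and 0.00, so the error-rate criterion would pick A at the root.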
