Question: The following table summarizes a data set with three attributes A, B, C and two class labels +, −. Build a two-level decision tree.

(a) According to the classification error rate, which attribute would be chosen as the first splitting attribute? For each attribute, show the contingency table and the gains in classification error rate.
(b) Repeat for the two children of the root node.
(c) How many instances are misclassified by the resulting decision tree?
(d) Repeat parts (a), (b), and (c) using C as the splitting attribute.
(e) Use the results in parts (c) and (d) to conclude about the greedy nature of the decision tree induction algorithm.

A  B  C  Number of + instances  Number of − instances
T  T  T   5   0
F  T  T   0  20
T  F  T  20   0
F  F  T   0   5
T  T  F   0   0
F  T  F  25   0
T  F  F   0   0
F  F  F   0  25
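Part (a) can be checked mechanically. A minimal Python sketch, assuming the table above encoded as rows of (A, B, C, + count, − count); the row encoding and helper names are my own:

```python
# Row encoding (hypothetical): (A, B, C, # of + instances, # of - instances)
ROWS = [
    (True,  True,  True,   5,  0),
    (False, True,  True,   0, 20),
    (True,  False, True,  20,  0),
    (False, False, True,   0,  5),
    (True,  True,  False,  0,  0),
    (False, True,  False, 25,  0),
    (True,  False, False,  0,  0),
    (False, False, False,  0, 25),
]
IDX = {"A": 0, "B": 1, "C": 2}

def leaf_errors(rows):
    # A leaf predicts the majority class, so it misclassifies the minority.
    plus = sum(r[3] for r in rows)
    minus = sum(r[4] for r in rows)
    return min(plus, minus)

def split_errors(rows, attr):
    # Total errors after a one-level split on attr (each child is a leaf).
    i = IDX[attr]
    return sum(leaf_errors([r for r in rows if r[i] == v]) for v in (True, False))

total = sum(r[3] + r[4] for r in ROWS)   # 100 instances
base = leaf_errors(ROWS)                 # 50 errors with no split at all
for attr in IDX:
    e = split_errors(ROWS, attr)
    print(f"{attr}: {e} errors, gain in error rate = {(base - e) / total}")
```

Running this reproduces the choice of A as the first splitting attribute: the gains in error rate are 0.25 for A, 0.10 for B, and 0 for C.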

Step by Step Solution

(a) The error rate for the data without partitioning on any attribute is min(50, 50)/100 = 0.5. After splitting on attribute ...
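The remaining parts compare the tree grown greedily from A with the tree rooted at C. A self-contained sketch of that comparison, computing misclassification counts directly from the table (row encoding and function names are my own):

```python
# Row encoding (hypothetical): (A, B, C, # of + instances, # of - instances)
ROWS = [
    (True,  True,  True,   5,  0),
    (False, True,  True,   0, 20),
    (True,  False, True,  20,  0),
    (False, False, True,   0,  5),
    (True,  True,  False,  0,  0),
    (False, True,  False, 25,  0),
    (True,  False, False,  0,  0),
    (False, False, False,  0, 25),
]
IDX = {"A": 0, "B": 1, "C": 2}

def leaf_errors(rows):
    # A leaf predicts the majority class, so it misclassifies the minority.
    plus = sum(r[3] for r in rows)
    minus = sum(r[4] for r in rows)
    return min(plus, minus)

def best_second_level(rows):
    # Errors at a child node after choosing its best split (or staying a leaf).
    candidates = [leaf_errors(rows)]
    for i in IDX.values():
        candidates.append(sum(leaf_errors([r for r in rows if r[i] == v])
                              for v in (True, False)))
    return min(candidates)

def two_level_tree_errors(root_attr):
    # Split the root on root_attr, then give each child its best split.
    i = IDX[root_attr]
    return sum(best_second_level([r for r in ROWS if r[i] == v])
               for v in (True, False))

print(two_level_tree_errors("A"))   # greedy root from part (a)
print(two_level_tree_errors("C"))   # root forced to C, as in part (d)
```

Rooting at A (the greedy choice) leaves 20 misclassified instances, while rooting at C yields a perfect two-level tree with 0 errors. This is the point of part (e): the greedy split criterion optimizes one level at a time and does not guarantee the best overall tree.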

