Question:
7. Consider the following set of training examples.

X | Y | Z | No. of Class C1 Examples | No. of Class C2 Examples
[The per-row example counts are illegible in this copy of the question.]

(a) Compute a two-level decision tree using the greedy approach described in this chapter. Use the classification error rate as the criterion for splitting. What is the overall error rate of the induced tree?

(b) Repeat part (a) using X as the first splitting attribute and then the best remaining attribute for splitting at each of the two successor nodes. What is the error rate of the induced tree?

(c) Compare the results of parts (a) and (b). Comment on the suitability of the greedy heuristic used for splitting attribute selection.

8. The following table summarizes a data set with three attributes A, B, C and two class labels + and −. Build a two-level decision tree.

A | B | C | Number of + Instances | Number of − Instances
[The per-row instance counts are illegible in this copy of the question.]

(a) According to the classification error rate, which attribute would be chosen as the first splitting attribute? For each attribute, show the contingency table and the gains in classification error rate.

(b) Repeat for the two children of the root node.

(c) How many instances are misclassified by the resulting decision tree?

(d) Repeat parts (a), (b), and (c) using C as the splitting attribute.

(e) Use the results in parts (c) and (d) to conclude about the greedy nature of the decision tree induction algorithm.
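Both exercises rest on the same two computations: the classification error rate of a node and the weighted error (and hence the gain) of a candidate split. The Python sketch below shows how the greedy choice of the first splitting attribute would be made. It is a minimal illustration, not the textbook's solution: the class counts are placeholders, since the actual table values are illegible in this copy, and the function and variable names are ours.

```python
# Sketch of greedy split selection by classification error rate.
# All counts below are PLACEHOLDERS; substitute the per-row class
# counts from the exercise's table.

def error_rate(counts):
    """Classification error of a node: 1 - (majority count / total)."""
    total = sum(counts)
    return 0.0 if total == 0 else 1.0 - max(counts) / total

def split_error(partitions):
    """Weighted error of a split.

    `partitions` maps each attribute value to a tuple of class counts,
    e.g. {0: (n_C1, n_C2), 1: (n_C1, n_C2)}.
    """
    total = sum(sum(c) for c in partitions.values())
    return sum(sum(c) / total * error_rate(c) for c in partitions.values())

# Hypothetical root-level contingency tables, one per candidate attribute:
# attribute value -> (No. of Class C1 examples, No. of Class C2 examples).
candidate_splits = {
    "X": {0: (60, 60), 1: (40, 40)},   # placeholder counts
    "Y": {0: (40, 60), 1: (60, 40)},   # placeholder counts
    "Z": {0: (30, 70), 1: (70, 30)},   # placeholder counts
}

parent_error = error_rate((100, 100))  # placeholder root class counts
for attr, parts in candidate_splits.items():
    gain = parent_error - split_error(parts)
    print(f"split on {attr}: weighted error {split_error(parts):.3f}, "
          f"gain {gain:.3f}")

# Greedy choice: the attribute with the largest gain, i.e. the lowest
# weighted error after the split.
best = min(candidate_splits, key=lambda a: split_error(candidate_splits[a]))
print("first splitting attribute:", best)
```

Exercise 8(a) asks for exactly these contingency tables and gains; applying the same two functions again to each child's subset of instances yields the second level of the tree in both exercises.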
