Question: Consider the following data set for a binary class problem.

Instance:  1  2  3  4  5  6  7  8  9  10
A:         T  T  T  T  T  F  F  F  T  T
B:         F  T  T  F  T  F  F  F  T  F
Class:     +  +  +  -  +  -  -  -  -  -
(a) Calculate the information gain when splitting on A and B. Which attribute would the decision tree induction algorithm choose?
(b) Calculate the gain in the Gini index when splitting on A and B. Which attribute would the decision tree induction algorithm choose?
(c) Figure 4.13 shows that entropy and the Gini index are both monotonically increasing on the range [0, 0.5] and both monotonically decreasing on the range [0.5, 1]. Is it possible for information gain and the gain in the Gini index to favor different attributes? Explain.
Step by Step Solution
(a) The contingency tables after splitting on attributes A and B are:

A = T: 4 +, 3 -     A = F: 0 +, 3 -
B = T: 3 +, 1 -     B = F: 1 +, 5 -

The overall entropy before splitting is E = -0.4 log2(0.4) - 0.6 log2(0.6) = 0.9710. Splitting on A gives a weighted entropy of (7/10)(0.9852) + (3/10)(0) = 0.6897, so the information gain for A is 0.9710 - 0.6897 = 0.2813. Splitting on B gives (4/10)(0.8113) + (6/10)(0.6500) = 0.7145, so the information gain for B is 0.9710 - 0.7145 = 0.2564. Information gain therefore chooses attribute A.

(b) The overall Gini index before splitting is 1 - 0.4^2 - 0.6^2 = 0.48. Splitting on A gives a weighted Gini of (7/10)(0.4898) + (3/10)(0) = 0.3429, a gain of 0.48 - 0.3429 = 0.1371. Splitting on B gives (4/10)(0.3750) + (6/10)(0.2778) = 0.3167, a gain of 0.48 - 0.3167 = 0.1633. The Gini gain therefore chooses attribute B.

(c) Yes. Parts (a) and (b) show that the two criteria can favor different attributes even though both impurity measures increase monotonically on [0, 0.5] and decrease monotonically on [0.5, 1]: the measures weight impure partitions differently (entropy penalizes mixed nodes more heavily than the Gini index does), so the weighted averages after splitting can rank the attributes differently.
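The calculations above can be checked with a short script. This is a minimal sketch, assuming the data set as reconstructed in the question; the helper names (`entropy`, `gini`, `gain`) are illustrative, not from any particular library.

```python
import math

# Data set as reconstructed in the question (10 instances).
A = ['T', 'T', 'T', 'T', 'T', 'F', 'F', 'F', 'T', 'T']
B = ['F', 'T', 'T', 'F', 'T', 'F', 'F', 'F', 'T', 'F']
y = ['+', '+', '+', '-', '+', '-', '-', '-', '-', '-']

def entropy(labels):
    """Entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def gini(labels):
    """Gini index of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def gain(attr, labels, impurity):
    """Impurity gain from splitting `labels` on attribute values `attr`."""
    n = len(labels)
    after = sum(
        (attr.count(v) / n) *
        impurity([c for a, c in zip(attr, labels) if a == v])
        for v in set(attr)
    )
    return impurity(labels) - after

print(f"Info gain  A: {gain(A, y, entropy):.4f}  B: {gain(B, y, entropy):.4f}")
print(f"Gini gain  A: {gain(A, y, gini):.4f}  B: {gain(B, y, gini):.4f}")
```

Running it reproduces the numbers in parts (a) and (b): information gain is higher for A (0.2813 vs. 0.2564), while the Gini gain is higher for B (0.1633 vs. 0.1371).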
