Question: 4. [Decision Trees] (6 pts) In class, we covered how to learn binary decision trees. In this question, you will modify the decision tree learning algorithm to obtain ternary decision trees (i.e., each internal node has three children). Assume all features are real-valued, and the labels are categorical (i.e., classification).

a. (2 pts) Each internal node should split a single feature $x_j$ into three intervals: $x_j \le t_1$, $t_1 < x_j \le t_2$, and $x_j > t_2$. Write down all parameters for this split.

b. (2 pts) Write down the information gain for a single split. You do not have to expand the entropy (e.g., for traditional decision trees, the information gain is $\mathrm{IG}(j, t) = H(Z) - H(Z[x_j \le t])\,P(x_j \le t) - H(Z[x_j > t])\,P(x_j > t)$).
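For context before the solution steps, here is a minimal LaTeX sketch of how the binary information-gain formula quoted above might generalize to the three-way split. The parameters $j$, $t_1$, and $t_2$ and the interval boundaries are assumptions inferred from part (a), not the expert's unlocked answer.

```latex
\[
\mathrm{IG}(j, t_1, t_2) = H(Z)
  - H\bigl(Z[x_j \le t_1]\bigr)\, P(x_j \le t_1)
  - H\bigl(Z[t_1 < x_j \le t_2]\bigr)\, P(t_1 < x_j \le t_2)
  - H\bigl(Z[x_j > t_2]\bigr)\, P(x_j > t_2)
\]
```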

Step by Step Solution

a. For a ternary decision tree split at an internal node for a feature $x_j$, the three...
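To make the fragment above concrete, the following is a minimal Python sketch of scoring such a ternary split. It assumes the split parameters are a feature index j and two thresholds t1 < t2, as in part (a); the helper names entropy and ternary_information_gain and the toy data are illustrative choices, not taken from the locked solution steps.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (base 2) of a 1-D array of categorical labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def ternary_information_gain(X, y, j, t1, t2):
    """Information gain of splitting feature j at thresholds t1 < t2 into
    the three intervals x_j <= t1, t1 < x_j <= t2, and x_j > t2."""
    col = X[:, j]
    children = [col <= t1, (col > t1) & (col <= t2), col > t2]
    gain = entropy(y)  # H(Z) at the parent node
    for mask in children:
        if mask.any():  # skip empty children so entropy of an empty set is never taken
            gain -= mask.mean() * entropy(y[mask])  # P(interval) * H(child labels)
    return gain

# Toy usage: four 1-D points whose labels change around x = 0.5
X = np.array([[0.1], [0.4], [0.6], [0.9]])
y = np.array([0, 0, 1, 1])
print(ternary_information_gain(X, y, j=0, t1=0.3, t2=0.7))
```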
