Question: First, please compute the (1) entropy, (2) Gini index, and (3) classification error for cases (a) and (b). Assume each case gives the class counts at a node of a decision tree.
a. C0: 8 and C1: 12
b. C0: 6 and C1: 14
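The three impurity measures asked for in part one can be computed directly from the class proportions at a node. A minimal sketch in Python (helper names are my own, not from the question):

```python
import math

def entropy(counts):
    """Shannon entropy (base 2) of the class counts at a node."""
    n = sum(counts)
    return -sum((c / n) * math.log2(c / n) for c in counts if c > 0)

def gini(counts):
    """Gini index: 1 minus the sum of squared class proportions."""
    n = sum(counts)
    return 1 - sum((c / n) ** 2 for c in counts)

def classification_error(counts):
    """Error rate if the node predicts its majority class."""
    return 1 - max(counts) / sum(counts)

# Case (a): C0 = 8, C1 = 12; case (b): C0 = 6, C1 = 14
for label, counts in [("a", [8, 12]), ("b", [6, 14])]:
    print(label,
          round(entropy(counts), 4),
          round(gini(counts), 4),
          round(classification_error(counts), 4))
```

For case (a) this gives entropy ≈ 0.9710, Gini = 0.48, error = 0.4; for case (b), entropy ≈ 0.8813, Gini = 0.42, error = 0.3.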
Second, assume we plan to split the data for cases (a) and (b) in the following ways. Could you calculate the information gain for each split? Based on these values, which split would you recommend, and why?
a. Child 1: C0: 1, C1: 9; Child 2: C0: 7, C1: 3
b. Child 1: C0: 3, C1: 3; Child 2: C0: 3, C1: 11
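For part two, information gain is the parent node's entropy minus the size-weighted average of the child entropies. A minimal sketch using the splits above (the `info_gain` helper name is my own):

```python
import math

def entropy(counts):
    """Shannon entropy (base 2) of the class counts at a node."""
    n = sum(counts)
    return -sum((c / n) * math.log2(c / n) for c in counts if c > 0)

def info_gain(parent, children):
    """Parent entropy minus the size-weighted mean of child entropies."""
    n = sum(parent)
    weighted = sum(sum(child) / n * entropy(child) for child in children)
    return entropy(parent) - weighted

# Split (a): parent C0:8, C1:12 split into (1, 9) and (7, 3)
print(round(info_gain([8, 12], [[1, 9], [7, 3]]), 4))   # ~0.2958
# Split (b): parent C0:6, C1:14 split into (3, 3) and (3, 11)
print(round(info_gain([6, 14], [[3, 3], [3, 11]]), 4))  # ~0.0566
```

On these numbers, split (a) reduces entropy far more than split (b) (≈0.296 vs ≈0.057), which is the comparison the question is asking for when deciding which split is preferable.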
