Question:

a. Please compute the (i) entropy, (ii) Gini index, and (iii) classification error for cases (1) and (2). Assume each case describes the data at one node of a decision tree. Note that C0 and C1 are the two classes, and the number after each colon is the number of samples in that class. A computational sketch follows the two cases.
(1) C0: 8 and C1: 12
(2) C0: 6 and C1: 14
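
For reference, the standard definitions are: entropy H = -Σ p_i log2(p_i), Gini index = 1 - Σ p_i², and classification error = 1 - max_i p_i, where p_i is the fraction of samples in class i at the node. Below is a minimal Python sketch that evaluates all three measures for both cases; the function name and output layout are illustrative, not taken from the lecture.

```python
import math

def impurities(counts):
    """Return (entropy, Gini, classification error) for a list of class counts."""
    total = sum(counts)
    probs = [c / total for c in counts]
    entropy = -sum(p * math.log2(p) for p in probs if p > 0)
    gini = 1 - sum(p * p for p in probs)
    error = 1 - max(probs)
    return entropy, gini, error

# Case (1): C0 = 8, C1 = 12  ->  p = (0.4, 0.6)
# entropy ~ 0.971, Gini = 0.48, error = 0.40
print(impurities([8, 12]))

# Case (2): C0 = 6, C1 = 14  ->  p = (0.3, 0.7)
# entropy ~ 0.881, Gini = 0.42, error = 0.30
print(impurities([6, 14]))
```

Note that all three measures agree on the ranking here: case (2) is purer (more skewed toward C1) than case (1), so it scores lower on every measure.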
b. Assume we plan to split the data of cases (1) and (2) in the following ways, each split producing two child nodes. Could you calculate the information gain of each split? Which split do you suggest based on these values, and why? A computational sketch follows the two splits.
Hint: the definition of information gain can be found on page 45 of Lecture-6.
(1) Child 1 (C0: 1, C1: 9) and Child 2 (C0: 7, C1: 3)
(2) Child 1 (C0: 3, C1: 3) and Child 2 (C0: 3, C1: 11)
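
Information gain is taken here in its usual entropy-based form: the parent node's entropy minus the size-weighted average entropy of the child nodes. This is assumed to match the definition on page 45 of Lecture-6, but verify against the slides. A minimal sketch under that assumption:

```python
import math

def entropy(counts):
    """Entropy (in bits) of a node given its class counts."""
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c > 0)

def info_gain(parent, children):
    """Information gain = parent entropy minus size-weighted child entropy."""
    n = sum(parent)
    weighted = sum(sum(ch) / n * entropy(ch) for ch in children)
    return entropy(parent) - weighted

# Split (1): parent (8, 12) -> children (1, 9) and (7, 3)
# gain ~ 0.971 - (0.5 * 0.469 + 0.5 * 0.881) ~ 0.296 bits
print(info_gain([8, 12], [[1, 9], [7, 3]]))

# Split (2): parent (6, 14) -> children (3, 3) and (3, 11)
# gain ~ 0.881 - (0.3 * 1.000 + 0.7 * 0.750) ~ 0.057 bits
print(info_gain([6, 14], [[3, 3], [3, 11]]))
```

With these numbers, split (1) gains about 0.296 bits while split (2) gains only about 0.057 bits, so the information-gain criterion clearly favors split (1), which produces a nearly pure child node (C0: 1, C1: 9).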
