Question:
(a) Please compute the (i) entropy, (ii) Gini index, and (iii) classification error for cases (1) and (2) below. Assume each case gives the data at a node of a decision tree. Note that C1 and C2 are two classes, and the number after the colon indicates the number of samples in each class.
(1) C1: … and C2: …
(2) C1: … and C2: …
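The sample counts after each colon did not survive here, but part (a) only needs the standard impurity measures. For a node whose class proportions are $p_1$ and $p_2$ (with $p_1 + p_2 = 1$), the usual definitions are

$$
H = -p_1 \log_2 p_1 - p_2 \log_2 p_2,
\qquad
\text{Gini} = 1 - p_1^2 - p_2^2,
\qquad
\text{Error} = 1 - \max(p_1, p_2).
$$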
(b) Assume we plan to split the data for cases (1) and (2) in the following ways. Could you calculate the information gain for each split? Based on these results, would you suggest a split, and why?
Hint: the definition of information gain can be found on page … of Lecture ….
(1) Child node 1: C1: … and C2: …; Child node 2: C1: … and C2: …
(2) Child node 1: C1: … and C2: …; Child node 2: C1: … and C2: …
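With $I(\cdot)$ one of the impurity measures above (entropy, by convention, for information gain), splitting a parent node of $N$ samples into children of sizes $N_1$ and $N_2$ gives

$$
\text{Gain} = I(\text{parent}) - \frac{N_1}{N}\, I(\text{child}_1) - \frac{N_2}{N}\, I(\text{child}_2).
$$

Because the actual counts are missing from the post, here is a minimal Python sketch of these computations; the `parent` and `children` counts at the bottom are made-up placeholders, not the numbers from the original exercise.

```python
import math

def entropy(counts):
    """Entropy of a node, given per-class sample counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def gini(counts):
    """Gini index of a node, given per-class sample counts."""
    total = sum(counts)
    return 1 - sum((c / total) ** 2 for c in counts)

def classification_error(counts):
    """Classification error of a node, given per-class sample counts."""
    return 1 - max(counts) / sum(counts)

def information_gain(parent, children, impurity=entropy):
    """Parent impurity minus the size-weighted impurity of the children."""
    n = sum(parent)
    weighted = sum(sum(child) / n * impurity(child) for child in children)
    return impurity(parent) - weighted

# Placeholder counts (the real ones are missing from the question):
parent = [6, 4]                    # C1: 6, C2: 4
children = [[4, 1], [2, 3]]        # two child nodes after the split
print(entropy(parent))             # ~0.971 bits
print(gini(parent))                # 0.48
print(classification_error(parent))        # 0.4
print(information_gain(parent, children))  # ~0.125 bits
```

For the "do you suggest a split" part: a split is only worth making if its gain is positive, and among candidate splits the one with the largest gain is preferred.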
