Question: Decision Trees and Information Gain (15 marks = 3 + 12)

Let's assume we have the following data regarding the attributes of some food items (i.e., what is their smell, what is their length, and what is their color), and finally the labels indicating whether they were considered eatable or not (i.e., the target output):

| Smell  | Length | Color  | Eatable? |
|--------|--------|--------|----------|
| Normal | Short  | Orange | No       |
| Poor   | Short  | Purple | Yes      |
| Poor   | Long   | Purple | No       |
| Normal | Long   | Orange | Yes      |
| Normal | Short  | Orange | Yes      |
| Normal | Short  | Orange | Yes      |
| Normal | Short  | Orange | Yes      |
| Normal | Short  | Purple | No       |
| Normal | Long   | Orange | No       |
| Normal | Long   | Orange | Yes      |
| Normal | Long   | Orange | No       |
| Normal | Long   | Orange | No       |
| Normal | Long   | Orange | No       |
| Poor   | Short  | Orange | Yes      |
| Poor   | Long   | Orange | Yes      |
| Normal | Short  | Orange | Yes      |

i) What is the value of the entropy at the root of the decision tree?

ii) On the basis of Information Gain, choose the best attribute for classifying only at the first level of the decision tree (at the root; no need to build the whole tree).
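The sketch below is a minimal, illustrative Python computation of the quantities the question asks for: the root entropy H(S) = -Σ p_i log2 p_i over the class labels, and the information gain IG(S, A) = H(S) - Σ_v (|S_v|/|S|) H(S_v) for each attribute. It assumes the row alignment reconstructed in the table above (recovered from the garbled source), so the dataset encoding should be checked against the original table before trusting the printed numbers.

```python
from collections import Counter
from math import log2

# Rows as (Smell, Length, Color, Eatable?) tuples.
# Assumption: this alignment follows the reconstructed table above.
rows = [
    ("Normal", "Short", "Orange", "No"),
    ("Poor",   "Short", "Purple", "Yes"),
    ("Poor",   "Long",  "Purple", "No"),
    ("Normal", "Long",  "Orange", "Yes"),
    ("Normal", "Short", "Orange", "Yes"),
    ("Normal", "Short", "Orange", "Yes"),
    ("Normal", "Short", "Orange", "Yes"),
    ("Normal", "Short", "Purple", "No"),
    ("Normal", "Long",  "Orange", "No"),
    ("Normal", "Long",  "Orange", "Yes"),
    ("Normal", "Long",  "Orange", "No"),
    ("Normal", "Long",  "Orange", "No"),
    ("Normal", "Long",  "Orange", "No"),
    ("Poor",   "Short", "Orange", "Yes"),
    ("Poor",   "Long",  "Orange", "Yes"),
    ("Normal", "Short", "Orange", "Yes"),
]
ATTRS = ["Smell", "Length", "Color"]


def entropy(labels):
    """Shannon entropy H(S) = -sum_i p_i * log2(p_i) over the class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())


def information_gain(rows, attr_index):
    """IG(S, A) = H(S) - sum_v |S_v|/|S| * H(S_v), splitting on attribute A."""
    labels = [r[-1] for r in rows]
    base = entropy(labels)
    n = len(rows)
    remainder = 0.0
    for value in set(r[attr_index] for r in rows):
        subset = [r[-1] for r in rows if r[attr_index] == value]
        remainder += len(subset) / n * entropy(subset)
    return base - remainder


if __name__ == "__main__":
    print(f"Root entropy: {entropy([r[-1] for r in rows]):.3f}")
    for i, name in enumerate(ATTRS):
        print(f"IG({name}) = {information_gain(rows, i):.3f}")
```

With 9 "Yes" and 7 "No" labels under the assumed alignment, the root entropy evaluates to -(9/16)log2(9/16) - (7/16)log2(7/16) ≈ 0.989 bits; the attribute with the largest printed information gain would be chosen for the first split.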
