Question: 3) Decision Trees and Information Gain (15 marks = 3 + 12)

Assume we have the following data about some food items (i.e., what is their weight, what is their length, and what is their color). The final column indicates whether each item was considered eatable (the target output):

| Weight   | Length | Color  | Eatable? |
|----------|--------|--------|----------|
| Normal   | Short  | Orange | Yes      |
| Normal   | Short  | Orange | No       |
| Abnormal | Short  | Red    | Yes      |
| Abnormal | Long   | Red    | No       |
| Normal   | Long   | Orange | Yes      |
| Normal   | Short  | Orange | Yes      |
| Normal   | Short  | Orange | Yes      |
| Normal   | Short  | Orange | Yes      |
| Normal   | Short  | Red    | No       |
| Normal   | Long   | Orange | No       |
| Normal   | Long   | Orange | Yes      |
| Normal   | Long   | Orange | No       |
| Normal   | Long   | Orange | No       |
| Normal   | Long   | Orange | No       |
| Abnormal | Short  | Orange | Yes      |
| Abnormal | Long   | Orange | Yes      |

i) What is the value of the entropy at the root of the decision tree?

ii) On the basis of Information Gain, choose the best attribute for classifying at the first level of the decision tree only (at the root; there is no need to build the whole tree).
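Both parts can be checked numerically. The sketch below (the helper names `entropy` and `info_gain` are my own, not from the question) transcribes the 16 rows of the table and computes the root entropy and the information gain of each attribute:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def info_gain(attr, target):
    """Information gain of splitting `target` on the values of `attr`."""
    n = len(target)
    remainder = 0.0
    for v in set(attr):
        subset = [t for a, t in zip(attr, target) if a == v]
        remainder += len(subset) / n * entropy(subset)
    return entropy(target) - remainder

# Dataset transcribed row by row from the table above (16 examples).
weight = ["Normal", "Normal", "Abnormal", "Abnormal", "Normal", "Normal",
          "Normal", "Normal", "Normal", "Normal", "Normal", "Normal",
          "Normal", "Normal", "Abnormal", "Abnormal"]
length = ["Short", "Short", "Short", "Long", "Long", "Short", "Short",
          "Short", "Short", "Long", "Long", "Long", "Long", "Long",
          "Short", "Long"]
color = ["Orange", "Orange", "Red", "Red", "Orange", "Orange", "Orange",
         "Orange", "Red", "Orange", "Orange", "Orange", "Orange", "Orange",
         "Orange", "Orange"]
eatable = ["Yes", "No", "Yes", "No", "Yes", "Yes", "Yes", "Yes",
           "No", "No", "Yes", "No", "No", "No", "Yes", "Yes"]

print(f"Root entropy: {entropy(eatable):.4f}")
for name, attr in [("Weight", weight), ("Length", length), ("Color", color)]:
    print(f"IG({name}) = {info_gain(attr, eatable):.4f}")
```

With 9 "Yes" and 7 "No" labels, the root entropy is H(9/16, 7/16) ≈ 0.9887 bits, and Length gives the largest information gain (≈ 0.1058, versus ≈ 0.0359 for Weight and ≈ 0.0355 for Color), so Length would be chosen at the root.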
