Quick but detailed solution with the tree please
Question 2B - Decision Tree Learning [25%]
You have been given 5 training examples representing various cars and their attributes. The target attribute is 'Safety Rating', which can take only two values: 'Good' or 'Poor'. It is to be predicted from the other attributes of the car.
Here are the 5 training samples from the dataset:
Car Type  | Engine Size | Airbags | Anti-lock Brakes | Safety Rating
Sedan     | Medium      | Yes     | Yes              | Good
SUV       | Large       | Yes     | Yes              | Good
Hatchback | Small       | No      | No               | Poor
SUV       | Medium      | Yes     | Yes              | Good
Sedan     | Large       | Yes     | Yes              | Poor
a) Calculate the entropy of the target attribute. Recall that
$$\mathrm{Entropy}(S) \stackrel{\mathrm{def}}{=} \sum_{i=1}^{c} p_i \log_2\frac{1}{p_i} = \sum_{i=1}^{c} -p_i \log_2 p_i$$
where $p_i$ is the proportion of $S$ belonging to class $i$. Also note that $\log_2(x) = \frac{\log_{10}(x)}{\log_{10}(2)}$.
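For reference, the class counts in the table above are 3 'Good' and 2 'Poor' out of 5 examples, so substituting $p_{\mathrm{Good}} = 3/5$ and $p_{\mathrm{Poor}} = 2/5$ into this definition gives

$$\mathrm{Entropy}(S) = -\tfrac{3}{5}\log_2\tfrac{3}{5} - \tfrac{2}{5}\log_2\tfrac{2}{5} \approx 0.442 + 0.529 \approx 0.971$$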
b) Show the value of the information gain for each candidate attribute at each step in the construction of the tree.
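A minimal Python sketch of how parts (a) and (b) can be checked is shown below. The attribute names and rows come straight from the table above; the function names (entropy, information_gain) and the dict-based encoding are illustrative choices, not part of the assignment.

from collections import Counter
from math import log2

# The five training examples from the table above.
DATA = [
    {"Car Type": "Sedan",     "Engine Size": "Medium", "Airbags": "Yes", "Anti-lock Brakes": "Yes", "Safety Rating": "Good"},
    {"Car Type": "SUV",       "Engine Size": "Large",  "Airbags": "Yes", "Anti-lock Brakes": "Yes", "Safety Rating": "Good"},
    {"Car Type": "Hatchback", "Engine Size": "Small",  "Airbags": "No",  "Anti-lock Brakes": "No",  "Safety Rating": "Poor"},
    {"Car Type": "SUV",       "Engine Size": "Medium", "Airbags": "Yes", "Anti-lock Brakes": "Yes", "Safety Rating": "Good"},
    {"Car Type": "Sedan",     "Engine Size": "Large",  "Airbags": "Yes", "Anti-lock Brakes": "Yes", "Safety Rating": "Poor"},
]
TARGET = "Safety Rating"

def entropy(examples):
    # Entropy of the target attribute over a set of examples.
    counts = Counter(ex[TARGET] for ex in examples)
    total = len(examples)
    return -sum((n / total) * log2(n / total) for n in counts.values())

def information_gain(examples, attribute):
    # Gain(S, A) = Entropy(S) - sum over values v of (|S_v| / |S|) * Entropy(S_v)
    total = len(examples)
    remainder = 0.0
    for value in {ex[attribute] for ex in examples}:
        subset = [ex for ex in examples if ex[attribute] == value]
        remainder += (len(subset) / total) * entropy(subset)
    return entropy(examples) - remainder

print(f"Entropy(S) = {entropy(DATA):.3f}")
for attr in ["Car Type", "Engine Size", "Airbags", "Anti-lock Brakes"]:
    print(f"Gain(S, {attr}) = {information_gain(DATA, attr):.3f}")

Run on these five rows, this should report Entropy(S) of about 0.971 and gains of roughly 0.571 for Car Type and Engine Size and 0.322 for Airbags and Anti-lock Brakes, so the first two tie as candidates for the root split.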
c) Construct the decision tree that would be learned from the above examples by the ID3 algorithm.
In your report, you need to include all the details of the tree construction process, following steps 1-3.
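For part (c), a compact recursive ID3 sketch along the same lines, assuming the DATA, TARGET, entropy and information_gain definitions from the snippet above are in scope. The nested-dict tree representation and the first-wins tie-break are assumptions for illustration, not something the assignment prescribes.

def id3(examples, attributes):
    # Returns either a class label (leaf) or {attribute: {value: subtree}}.
    labels = {ex[TARGET] for ex in examples}
    if len(labels) == 1:              # all examples agree -> pure leaf
        return labels.pop()
    if not attributes:                # no attributes left -> majority-class leaf
        return Counter(ex[TARGET] for ex in examples).most_common(1)[0][0]
    # Split on the attribute with the highest information gain (first wins ties).
    best = max(attributes, key=lambda a: information_gain(examples, a))
    tree = {best: {}}
    for value in {ex[best] for ex in examples}:
        subset = [ex for ex in examples if ex[best] == value]
        remaining = [a for a in attributes if a != best]
        tree[best][value] = id3(subset, remaining)
    return tree

print(id3(DATA, ["Car Type", "Engine Size", "Airbags", "Anti-lock Brakes"]))

With that tie-break the root split falls on Car Type (Engine Size would be an equally valid root, since the two gains tie): the SUV and Hatchback branches are already pure ('Good' and 'Poor' respectively), and the Sedan branch is split once more on Engine Size, with Medium -> 'Good' and Large -> 'Poor'.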