Question 3: Decision Tree Classifier [45%]

Consider Data Set 2 in Table 2 for a classification problem, where all attributes are nominal.
a) [24%] Use data records 1-12 as the training set to build a decision tree for predicting the class attribute. Use entropy as the impurity measure and an information-gain threshold of 0.4 as the pre-pruning criterion (do not split a node if no attribute achieves a gain of at least 0.4). Show your steps.
b) [9%] Use data records 1-12 as the training set to complete the decision tree in Figure 1 by assigning a class label to each of the leaf nodes.
c) [12%] Use data records 13-18 as the test set.
i. Construct the confusion matrix for the completed decision tree from part b).
ii. Calculate the corresponding accuracy.
iii. Calculate the precision and recall with respect to class "B".
| Set          | Record | LW   | LD   | RW   | RD   | Class |
|--------------|--------|------|------|------|------|-------|
| Training set | 1      | Low  | Low  | Low  | Low  | R     |
|              | 2      | Low  | Low  | Low  | High | B     |
|              | 3      | Low  | Low  | High | Low  | R     |
|              | 4      | Low  | Low  | High | Low  | B     |
|              | 5      | Low  | High | Low  | Low  | B     |
|              | 6      | Low  | High | Low  | High | L     |
|              | 7      | Low  | High | High | Low  | R     |
|              | 8      | High | Low  | Low  | High | L     |
|              | 9      | High | Low  | Low  | High | B     |
|              | 10     | High | Low  | High | Low  | R     |
|              | 11     | High | High | Low  | High | L     |
|              | 12     | High | High | High | High | L     |
| Test set     | 13     | Low  | Low  | Low  | High | B     |
|              | 14     | Low  | High | Low  | Low  | L     |
|              | 15     | Low  | High | High | High | R     |
|              | 16     | High | Low  | Low  | Low  | B     |
|              | 17     | High | Low  | High | High | R     |
|              | 18     | High | High | High | Low  | L     |
Table 2 Dataset for classification
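For part a), the split decisions hinge on entropy and information gain over the training records. As a sanity check on the hand-worked steps, the computation can be sketched in Python (the `entropy` and `info_gain` helper names are this sketch's own; the data and column names LW/LD/RW/RD come from Table 2):

```python
from collections import Counter
from math import log2

# Training records 1-12 from Table 2, as (LW, LD, RW, RD, Class).
train = [
    ("Low", "Low", "Low", "Low", "R"),  ("Low", "Low", "Low", "High", "B"),
    ("Low", "Low", "High", "Low", "R"), ("Low", "Low", "High", "Low", "B"),
    ("Low", "High", "Low", "Low", "B"), ("Low", "High", "Low", "High", "L"),
    ("Low", "High", "High", "Low", "R"), ("High", "Low", "Low", "High", "L"),
    ("High", "Low", "Low", "High", "B"), ("High", "Low", "High", "Low", "R"),
    ("High", "High", "Low", "High", "L"), ("High", "High", "High", "High", "L"),
]
ATTRS = ["LW", "LD", "RW", "RD"]

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def info_gain(records, attr_idx):
    """Entropy of the parent minus the weighted entropy of the children."""
    labels = [r[-1] for r in records]
    gain = entropy(labels)
    for value in set(r[attr_idx] for r in records):
        subset = [r[-1] for r in records if r[attr_idx] == value]
        gain -= len(subset) / len(records) * entropy(subset)
    return gain

for i, name in enumerate(ATTRS):
    print(f"Gain({name}) = {info_gain(train, i):.4f}")
```

With 4 R, 4 B, and 4 L records, the root entropy is log2(3) ≈ 1.585; RD gives the largest gain (exactly 2/3 ≈ 0.667), which exceeds the 0.4 pre-pruning threshold, so RD is a valid first split. The same helpers can then be re-run on each child subset.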
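For part c), the confusion matrix and the derived metrics can likewise be checked with a short sketch. The true labels below are records 13-18 from Table 2; the predicted labels are only a hypothetical placeholder, since the actual predictions depend on the tree completed in part b) (Figure 1 is not reproduced here):

```python
from collections import Counter

def confusion_matrix(y_true, y_pred, labels):
    # Rows = actual class, columns = predicted class.
    counts = Counter(zip(y_true, y_pred))
    return [[counts[(a, p)] for p in labels] for a in labels]

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def precision_recall(y_true, y_pred, cls):
    # Precision = TP / predicted-positive; recall = TP / actual-positive.
    tp = sum(t == p == cls for t, p in zip(y_true, y_pred))
    precision = tp / sum(p == cls for p in y_pred)
    recall = tp / sum(t == cls for t in y_true)
    return precision, recall

# True classes of test records 13-18 (Table 2).
y_true = ["B", "L", "R", "B", "R", "L"]
# Placeholder predictions -- substitute the labels your part b) tree produces.
y_pred = ["B", "L", "B", "B", "R", "L"]

print(confusion_matrix(y_true, y_pred, ["B", "L", "R"]))
print(accuracy(y_true, y_pred))
print(precision_recall(y_true, y_pred, "B"))
```

Once the leaf labels from part b) are plugged into `y_pred`, these three calls give the matrix, accuracy, and the class-"B" precision and recall asked for in i-iii.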