Question:

GLCM Horizontal Variance | GLCM Vertical Variance | ASM (or Energy) for Vertical GLCM | Fruit Quality (Target)

Training Data:
High | High | High | High
High | Low  | Low  | Medium
Low  | Low  | Low  | Low
Low  | High | Low  | Low
High | Low  | High | Medium
High | Low  | High | High
Low  | Low  | Low  | Medium
Low  | Low  | Low  | Medium

Test Data:
Low  | High | Low  | Low
High | High | High | High
High | High | High | Medium
Low  | High | Low  | Medium
A. If the decision tree is constructed using the ID3 classifier with entropy-based information gain as the criterion for attribute selection, identify the attribute used at the root. Show all the calculations involved. Draw the resultant tree after the first iteration.
B. Consider the decision trees given below in the table. Which of the trees is more suitable for the given data? Use the precision of class "High" and the recall of class "Medium" as the decision criteria, and explain with the necessary computations.
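For Part A, the following is a minimal sketch (not the worked exam solution) of the entropy and information-gain calculations ID3 performs when choosing the root, applied to the eight training rows above. The attribute and value names are taken from the table; the helper names (`entropy`, `training`) are introduced here for illustration only.

```python
import math
from collections import Counter

# Eight training rows: (GLCM Horizontal Variance, GLCM Vertical Variance,
#                       ASM for Vertical GLCM, Fruit Quality target)
training = [
    ("High", "High", "High", "High"),
    ("High", "Low",  "Low",  "Medium"),
    ("Low",  "Low",  "Low",  "Low"),
    ("Low",  "High", "Low",  "Low"),
    ("High", "Low",  "High", "Medium"),
    ("High", "Low",  "High", "High"),
    ("Low",  "Low",  "Low",  "Medium"),
    ("Low",  "Low",  "Low",  "Medium"),
]
attributes = ["GLCM Horizontal Variance",
              "GLCM Vertical Variance",
              "ASM (or Energy) for Vertical GLCM"]

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

target = [row[-1] for row in training]
base_entropy = entropy(target)
print(f"Entropy(S) = {base_entropy:.4f} bits")

# Gain(S, A) = H(S) - sum over values v of |S_v|/|S| * H(S_v)
for i, name in enumerate(attributes):
    remainder = 0.0
    for value in {row[i] for row in training}:
        subset = [row[-1] for row in training if row[i] == value]
        remainder += len(subset) / len(training) * entropy(subset)
    print(f"Gain(S, {name}) = {base_entropy - remainder:.4f}")

# ID3 picks the attribute with the largest information gain as the root.
```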
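For Part B, the candidate trees are not reproduced here, but the decision criteria reduce to the standard per-class definitions below, where TP, FP, and FN are counted on the four test rows for the class named in the subscript:

\[
\text{Precision}_{\text{High}} = \frac{TP_{\text{High}}}{TP_{\text{High}} + FP_{\text{High}}},
\qquad
\text{Recall}_{\text{Medium}} = \frac{TP_{\text{Medium}}}{TP_{\text{Medium}} + FN_{\text{Medium}}}
\]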

