Question:

Training Data
GLCM Horizontal Variance | GLCM Vertical Variance | ASM or Energy for Vertical GLCM | Fruit Quality (Target)
High | High | High | High
High | Low  | Low  | Medium
Low  | Low  | Low  | Low
Low  | High | Low  | Low
High | Low  | High | Medium
High | Low  | High | High
Low  | Low  | Low  | Medium
Low  | Low  | Low  | Medium
Low  | High | Low  | Low

Test Data
GLCM Horizontal Variance | GLCM Vertical Variance | ASM or Energy for Vertical GLCM | Fruit Quality (Target)
High | High | High | High
High | High | High | Medium
Low  | High | Low  | Medium
A. If the decision tree is constructed using the ID3 algorithm with entropy-based information gain as the criterion for attribute selection, identify the attribute used at the root. Show all the calculations involved. Draw the resultant tree after the first iteration.
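The root-attribute selection in part A can be checked programmatically. Below is a minimal Python sketch that computes the entropy-based information gain of each attribute over the nine training rows as transcribed in the table above; the attribute names and the row encodings are assumptions taken from this reconstruction, so verify them against the original layout before relying on the printed numbers.

```python
from collections import Counter
from math import log2

# Training samples transcribed from the table above:
# (GLCM Horizontal Variance, GLCM Vertical Variance,
#  ASM or Energy for Vertical GLCM, Fruit Quality target)
train = [
    ("High", "High", "High", "High"),
    ("High", "Low",  "Low",  "Medium"),
    ("Low",  "Low",  "Low",  "Low"),
    ("Low",  "High", "Low",  "Low"),
    ("High", "Low",  "High", "Medium"),
    ("High", "Low",  "High", "High"),
    ("Low",  "Low",  "Low",  "Medium"),
    ("Low",  "Low",  "Low",  "Medium"),
    ("Low",  "High", "Low",  "Low"),
]
attributes = ["Horizontal Variance", "Vertical Variance", "ASM/Energy"]

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr_index):
    """Target entropy minus the weighted entropy after splitting on one attribute."""
    labels = [r[-1] for r in rows]
    remainder = 0.0
    for value in set(r[attr_index] for r in rows):
        subset = [r[-1] for r in rows if r[attr_index] == value]
        remainder += len(subset) / len(rows) * entropy(subset)
    return entropy(labels) - remainder

for i, name in enumerate(attributes):
    print(f"Gain({name}) = {information_gain(train, i):.4f}")

root = max(range(len(attributes)), key=lambda i: information_gain(train, i))
print("Root attribute:", attributes[root])
```

ID3 places the attribute with the largest gain at the root, so the printed values mirror the hand calculation the question asks for.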
B. Consider the decision trees given below in the table. Which of the trees is more suitable for the given data? Use the precision of class "High" and the recall of class "Medium" as the decision criteria, and explain with the necessary computations.
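Part B's criteria can likewise be scripted. The sketch below computes the precision of class "High" and the recall of class "Medium" from each tree's predictions on the three test samples; the `predictions` list is a hypothetical placeholder, since the candidate trees themselves are not reproduced in this extract.

```python
# Test-set Fruit Quality labels, taken from the Test Data rows above.
actual = ["High", "Medium", "Medium"]
# Hypothetical output of one candidate tree on the same three samples
# (replace with the predictions read off each tree in the question).
predictions = ["High", "High", "Medium"]

def precision(actual, predicted, cls):
    """Fraction of samples predicted as `cls` that truly belong to `cls`."""
    hits = [a for a, p in zip(actual, predicted) if p == cls]
    return sum(a == cls for a in hits) / len(hits) if hits else 0.0

def recall(actual, predicted, cls):
    """Fraction of true `cls` samples that were predicted as `cls`."""
    trues = [p for a, p in zip(actual, predicted) if a == cls]
    return sum(p == cls for p in trues) / len(trues) if trues else 0.0

print("Precision(High) =", precision(actual, predictions, "High"))
print("Recall(Medium)  =", recall(actual, predictions, "Medium"))
```

Running this once per candidate tree gives the two numbers to compare; the tree with the better precision on "High" and recall on "Medium" is the more suitable one under the stated criteria.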
