1. [3 pt.] Consider the following data set of ten observations for classification of a binary categorical variable \( Y \) that has two classes (Yes and No), using two features. \( X_{1} \) takes three possible values: A, B, and C. \( X_{2} \) takes four possible values: 1st, 2nd, 3rd, and 4th.
Which feature (\( X_{1} \) or \( X_{2} \)) is more informative about \( Y \)? Answer by calculating the following:
a) Entropy of each feature (\( X_{1}\) and \( X_{2}\))
b) Conditional entropy of \( Y \) given each feature
c) Information gain of each feature
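The three quantities above can be computed directly from their definitions: \( H(X) = -\sum_x p(x)\log_2 p(x) \), \( H(Y \mid X) = \sum_x p(x)\, H(Y \mid X = x) \), and \( IG(Y; X) = H(Y) - H(Y \mid X) \). Since the original ten-row table is not reproduced here, the sketch below uses a made-up data set purely to illustrate the calculation; substitute the actual observations to answer the question.

```python
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy (base 2) of a list of categorical values."""
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in Counter(values).values())

def conditional_entropy(feature, labels):
    """H(Y | X): entropy of Y within each value of X, weighted by P(X = x)."""
    n = len(labels)
    groups = {}
    for x, y in zip(feature, labels):
        groups.setdefault(x, []).append(y)
    return sum((len(ys) / n) * entropy(ys) for ys in groups.values())

def information_gain(feature, labels):
    """IG(Y; X) = H(Y) - H(Y | X)."""
    return entropy(labels) - conditional_entropy(feature, labels)

# Hypothetical ten observations (NOT the original table, which is missing here):
X1 = ["A", "A", "B", "B", "B", "C", "C", "C", "C", "A"]
X2 = ["1st", "2nd", "3rd", "4th", "1st", "2nd", "3rd", "4th", "1st", "2nd"]
Y  = ["Yes", "Yes", "No", "No", "Yes", "No", "No", "Yes", "No", "Yes"]

print("a) H(X1) =", entropy(X1), " H(X2) =", entropy(X2))
print("b) H(Y|X1) =", conditional_entropy(X1, Y),
      " H(Y|X2) =", conditional_entropy(X2, Y))
print("c) IG(X1) =", information_gain(X1, Y),
      " IG(X2) =", information_gain(X2, Y))
```

The feature with the larger information gain is the one more relevant to \( Y \): it is the feature whose values reduce the uncertainty in \( Y \) the most.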