Question: Information Theory, Problem 1

We are provided with a set of training examples for the unknown target function (X1, X2) → Y. Each row of the table indicates the values observed and how many times that set of values was observed. For example, (+, T, T) was observed 3 times, while (−, T, T) was never observed. (The table of counts itself is not reproduced here.)

1. Compute the sample entropy H(Y) for this training data. Hint: use base-2 logarithms.
2. What is the mutual information between X1 and Y, I(X1; Y), estimated from this sample?
3. What is the mutual information between X2 and Y, I(X2; Y), estimated from this sample?
4. Draw the decision tree that would be learned from this training data. Hint: think about which feature, X1 or X2, would be chosen first — the one with the highest mutual information with Y.
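The quantities asked for can be computed mechanically from a table of counts: the sample entropy is H(Y) = −Σ p(y) log₂ p(y), and the mutual information is I(X; Y) = H(Y) − H(Y | X). Below is a minimal Python sketch of both computations over weighted rows. Since the problem's actual count table is not reproduced above, the `rows` data here is purely hypothetical, chosen so X1 perfectly predicts Y and X2 is uninformative.

```python
from collections import Counter
from math import log2

def entropy(weighted_labels):
    """Sample entropy H(Y) in bits, from (label, count) pairs."""
    counts = Counter()
    for label, n in weighted_labels:
        counts[label] += n
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total)
                for n in counts.values() if n > 0)

def mutual_information(rows, feature_index):
    """I(X; Y) = H(Y) - H(Y|X), from (x1, x2, y, count) rows.

    feature_index selects the feature: 0 for X1, 1 for X2.
    """
    total = sum(r[3] for r in rows)
    h_y = entropy([(r[2], r[3]) for r in rows])
    h_y_given_x = 0.0
    for v in {r[feature_index] for r in rows}:
        # Conditional entropy of Y restricted to rows where X = v,
        # weighted by the empirical probability of that value.
        subset = [(r[2], r[3]) for r in rows if r[feature_index] == v]
        weight = sum(n for _, n in subset) / total
        h_y_given_x += weight * entropy(subset)
    return h_y - h_y_given_x

# Hypothetical counts for illustration only -- NOT the problem's table.
# Here X1 determines Y exactly, while X2 carries no information.
rows = [
    ("T", "T", "+", 3),
    ("T", "F", "+", 3),
    ("F", "T", "-", 3),
    ("F", "F", "-", 3),
]

print(entropy([(r[2], r[3]) for r in rows]))  # H(Y)      -> 1.0
print(mutual_information(rows, 0))            # I(X1; Y)  -> 1.0
print(mutual_information(rows, 1))            # I(X2; Y)  -> 0.0
```

With real data, the decision-tree hint in part 4 amounts to comparing `mutual_information(rows, 0)` against `mutual_information(rows, 1)` and splitting on the larger.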
