Question:

Consider the training examples shown in Table 4.2 for a binary classification problem.
(a) What is the entropy of this collection of training examples with respect to the positive class?
(b) What are the information gains of a1 and a2 relative to these training examples?
(c) For a3, which is a continuous attribute, compute the information gain for every possible split.
(d) What is the best split (among a1, a2, and a3) according to the information gain?
(e) What is the best split (between a1 and a2) according to the classification error rate?
(f) What is the best split (between a1 and a2) according to the Gini index?
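
For reference, parts (a), (b), (e), and (f) rely on the standard impurity measures from Chapter 4 of the book. For a node t whose class proportions are p(i | t), these are

Entropy(t) = -\sum_{i} p(i \mid t) \log_2 p(i \mid t)
Gini(t) = 1 - \sum_{i} p(i \mid t)^2
Error(t) = 1 - \max_{i} p(i \mid t)

and the information gain of a split that partitions the N records at the parent node into children v_1, ..., v_k is

\Delta_{info} = Entropy(parent) - \sum_{j=1}^{k} \frac{N(v_j)}{N} Entropy(v_j).

The gains asked for in (e) and (f) are computed the same way with Entropy replaced by Error and Gini, respectively. For the continuous attribute a3 in part (c), the candidate split points are the thresholds between consecutive sorted values of a3.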

Step by Step Answer:
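The numeric answers depend on the class counts in Table 4.2, which are not reproduced on this page. As a minimal sketch, the Python below computes entropy, Gini, classification error, and the resulting gain for a candidate split from generic class-count lists; the counts used at the bottom are placeholders, not the values from Table 4.2. Applying the same functions to the counts induced by a1, a2, and each threshold on a3 gives parts (a) through (f).

from math import log2

def entropy(counts):
    """Entropy of a node given class counts, e.g. [positives, negatives]."""
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c > 0)

def gini(counts):
    """Gini index of a node given class counts."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

def error(counts):
    """Classification error rate of a node given class counts."""
    return 1.0 - max(counts) / sum(counts)

def split_impurity(children, measure):
    """Weighted impurity of a split; children is a list of class-count lists."""
    n = sum(sum(c) for c in children)
    return sum((sum(c) / n) * measure(c) for c in children)

def gain(parent, children, measure=entropy):
    """Impurity reduction of a split (information gain when measure=entropy)."""
    return measure(parent) - split_impurity(children, measure)

# Placeholder example (NOT the data of Table 4.2): a parent node with
# 4 positive and 6 negative examples, split into [3+, 1-] and [1+, 5-].
parent = [4, 6]
children = [[3, 1], [1, 5]]
print("Entropy(parent)    =", round(entropy(parent), 4))
print("Information gain   =", round(gain(parent, children, entropy), 4))
print("Gini-based gain    =", round(gain(parent, children, gini), 4))
print("Error-based gain   =", round(gain(parent, children, error), 4))

For part (c), gain would be evaluated once per candidate threshold on a3, with children holding the class counts of the records falling below and above that threshold.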

Related Book:

Introduction to Data Mining

ISBN: 978-0321321367

1st Edition

Authors: Pang-Ning Tan, Michael Steinbach, Vipin Kumar
