Question:

In the recursive construction of decision trees, it sometimes happens that a mixed set of positive and negative examples remains at a leaf node, even after all the attributes have been used. Suppose that we have p positive examples and n negative examples.

a. Show that the solution used by DECISION-TREE-LEARNING, which picks the majority classification, minimizes the absolute error over the set of examples at the leaf.

b. Show that the class probability p/(p + n) minimizes the sum of squared errors.


Step by Step Answer:
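
A sketch of the argument, under the assumption (introduced here) that the leaf outputs a single value \hat{y} \in [0, 1], with target 1 for each of the p positive examples and target 0 for each of the n negative examples.

a. The absolute error is

E_1(\hat{y}) = \sum_i |y_i - \hat{y}| = p(1 - \hat{y}) + n\hat{y},

which is linear in \hat{y} with slope n - p. It is therefore minimized at \hat{y} = 1 when p > n and at \hat{y} = 0 when n > p (with a tie when p = n), i.e., by the majority classification that DECISION-TREE-LEARNING returns, giving error min(p, n).

b. The sum of squared errors is

E_2(\hat{y}) = p(1 - \hat{y})^2 + n\hat{y}^2.

Setting E_2'(\hat{y}) = -2p(1 - \hat{y}) + 2n\hat{y} = 0 gives \hat{y} = p/(p + n), and E_2''(\hat{y}) = 2(p + n) > 0 confirms that this stationary point is a minimum.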

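As a quick numerical illustration of both claims (a sketch only; the counts p = 7, n = 3 and the grid of candidate predictions are assumptions made for this example), a short Python check finds the minimizers of each error measure by grid search:

def absolute_error(y_hat, p, n):
    # p examples with target 1, n examples with target 0
    return p * abs(1 - y_hat) + n * abs(0 - y_hat)

def squared_error(y_hat, p, n):
    return p * (1 - y_hat) ** 2 + n * (0 - y_hat) ** 2

p, n = 7, 3                                   # assumed example counts
candidates = [i / 1000 for i in range(1001)]  # candidate predictions in [0, 1]

print(min(candidates, key=lambda y: absolute_error(y, p, n)))  # 1.0 -> the majority class
print(min(candidates, key=lambda y: squared_error(y, p, n)))   # 0.7 -> p / (p + n)
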
Related Book:

Artificial Intelligence: A Modern Approach, 2nd Edition
Authors: Stuart J. Russell and Peter Norvig
ISBN: 978-0137903955
