Question:

In the recursive construction of decision trees, it sometimes happens that a mixed set of positive and negative examples remains at a leaf node, even after all the attributes have been used. Suppose that we have p positive examples and n negative examples.
a. Show that the solution used by DECISION-TREE-LEARNING, which picks the majority classification, minimizes the absolute error over the set of examples at the leaf.
b. Show that the class probability p/(p + n) minimizes the sum of squared errors over the set of examples at the leaf.
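
A minimal sketch of the standard argument (this is not the solution posted on the page; it assumes the leaf returns a single constant prediction c in [0, 1], with true labels 1 for positive examples and 0 for negative examples):

a. Absolute error: E1(c) = p|1 - c| + n|0 - c| = p(1 - c) + nc = p + c(n - p) for c in [0, 1].
   E1 is linear in c, so it is minimized at an endpoint: c = 1 when p > n, c = 0 when n > p (either endpoint when p = n).
   That is exactly the majority classification.

b. Sum of squared errors: E2(c) = p(1 - c)^2 + nc^2.
   Setting dE2/dc = -2p(1 - c) + 2nc = 0 gives c = p/(p + n); since d^2E2/dc^2 = 2(p + n) > 0, this is the minimum.
   So the class probability p/(p + n) minimizes the sum of squared errors.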
