# Question

In the recursive construction of decision trees, it sometimes happens that a mixed set of positive and negative examples remains at a leaf node, even after all the attributes have been used. Suppose that we have p positive examples and n negative examples.

a. Show that the solution used by DECISION-TREE-LEARNING, which picks the majority classification, minimizes the absolute error over the set of examples at the leaf.
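To build intuition for part (a), here is a minimal numeric sanity check (not a proof), assuming targets are coded 0/1 and the leaf outputs a single constant class. Predicting class 1 misclassifies the n negatives, predicting class 0 misclassifies the p positives, so the majority class attains the smaller of the two absolute errors:

```python
def absolute_error(prediction, p, n):
    # Predicting 1 misclassifies the n negatives; predicting 0, the p positives.
    return n if prediction == 1 else p

p, n = 7, 3  # example leaf counts (hypothetical)
majority = 1 if p >= n else 0
errors = {c: absolute_error(c, p, n) for c in (0, 1)}
assert errors[majority] == min(errors.values())
print(errors[majority])  # absolute error of the majority prediction: 3
```

Swapping in any other counts (e.g. p = 2, n = 9) gives the same conclusion: the majority classification always achieves min(p, n), the smallest possible absolute error for a constant prediction.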

b. Show that the class probability p/(p + n) minimizes the sum of squared errors.
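For part (b), the sum of squared errors for a constant prediction h is E(h) = p(1 − h)² + n·h², and setting E′(h) = −2p(1 − h) + 2nh = 0 gives h = p/(p + n). A small numeric check of this claim (a sketch, with hypothetical leaf counts, comparing against a grid of alternative predictions):

```python
def sse(h, p, n):
    # Sum of squared errors: p positives with target 1, n negatives with target 0.
    return p * (1 - h) ** 2 + n * h ** 2

p, n = 7, 3  # example leaf counts (hypothetical)
h_star = p / (p + n)  # claimed minimizer, here 0.7

# h_star should beat every candidate prediction on a fine grid over [0, 1].
grid = [i / 1000 for i in range(1001)]
assert all(sse(h_star, p, n) <= sse(h, p, n) + 1e-12 for h in grid)
```

This is consistent with the general fact that the mean of a set of 0/1 targets minimizes squared error, just as the majority (median) minimizes absolute error in part (a).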

