Question:

4. The training error of a decision tree with ONE leaf node is 10/40, where 40 is the total number of training samples. What is the pessimistic error of this decision tree? After splitting the leaf node, there are four leaf nodes and the training error becomes 9/40. What is the pessimistic error of the new decision tree? Should you prune the sub-tree after the splitting (yes/no)?
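A minimal sketch of the standard pessimistic (generalized) error calculation is shown below. It assumes the common textbook complexity penalty of 0.5 per leaf node; that penalty value, and the helper name pessimistic_error, are assumptions, since the question does not state them.

# Sketch of the pessimistic (generalized) error estimate for a decision tree,
# assuming a penalty of 0.5 per leaf node (not stated in the question).

def pessimistic_error(train_errors: int, num_leaves: int, num_samples: int,
                      penalty: float = 0.5) -> float:
    """(training errors + penalty * number of leaves) / number of training samples."""
    return (train_errors + penalty * num_leaves) / num_samples

# Before the split: one leaf, 10 training errors out of 40 samples.
before = pessimistic_error(10, 1, 40)   # (10 + 0.5 * 1) / 40 = 0.2625

# After the split: four leaves, 9 training errors out of 40 samples.
after = pessimistic_error(9, 4, 40)     # (9 + 0.5 * 4) / 40 = 0.275

print(f"before split: {before:.4f}, after split: {after:.4f}")

Under this assumed penalty, the pessimistic error rises from 0.2625 to 0.275 after the split, so post-pruning would keep the single leaf, i.e., the sub-tree would be pruned.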
