Question: Decision Tree algorithm (4 points).
a. Find the entropy \(\mathrm{H}(S)\) (1 point) and the information gain (1 point) for each attribute to determine the best attribute to split on for the following dataset. The dataset has 14 samples with 4 attributes and 1 class label (Class).
\begin{tabular}{|l|l|l|l|l|}
\hline Industry & Job Type & Income & \begin{tabular}{l}
Previous \\
Customer
\end{tabular} & Class \\
\hline Aerospace & Engineering & High & No & NO \\
\hline Aerospace & Engineering & High & Yes & NO \\
\hline Auto & Engineering & High & No & YES \\
\hline Electronics & Marketing & High & No & YES \\
\hline Urban & Marketing & Low & No & YES \\
\hline Urban & Marketing & Low & Yes & NO \\
\hline Auto & Marketing & Low & Yes & YES \\
\hline Aerospace & Sales & High & No & NO \\
\hline Aerospace & Marketing & Low & No & YES \\
\hline Electronics & Sales & Low & No & NO \\
\hline Aerospace & Sales & Low & Yes & YES \\
\hline Electronics & Sales & High & Yes & NO \\
\hline Auto & Engineering & Low & No & YES \\
\hline Electronics & Sales & High & Yes & NO \\
\hline
\end{tabular}
b. Draw the complete decision tree, showing the entropy, information gain, and outcome (Yes/No) for each node (6 points).
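For reference, below is a minimal Python sketch (an illustrative aid, not the graded solution) of how the class entropy \(\mathrm{H}(S) = -\sum_c p_c \log_2 p_c\) and the per-attribute gain \(\mathrm{Gain}(S, A) = \mathrm{H}(S) - \sum_v \frac{|S_v|}{|S|}\,\mathrm{H}(S_v)\) asked for in part (a), and a greedy ID3-style tree for part (b), could be computed from the table above. The data layout and helper names (`entropy`, `information_gain`, `build_tree`) are assumptions made for this sketch.

```python
from collections import Counter
from math import log2

# Rows follow the table above: (Industry, Job Type, Income, Previous Customer, Class)
DATA = [
    ("Aerospace",   "Engineering", "High", "No",  "NO"),
    ("Aerospace",   "Engineering", "High", "Yes", "NO"),
    ("Auto",        "Engineering", "High", "No",  "YES"),
    ("Electronics", "Marketing",   "High", "No",  "YES"),
    ("Urban",       "Marketing",   "Low",  "No",  "YES"),
    ("Urban",       "Marketing",   "Low",  "Yes", "NO"),
    ("Auto",        "Marketing",   "Low",  "Yes", "YES"),
    ("Aerospace",   "Sales",       "High", "No",  "NO"),
    ("Aerospace",   "Marketing",   "Low",  "No",  "YES"),
    ("Electronics", "Sales",       "Low",  "No",  "NO"),
    ("Aerospace",   "Sales",       "Low",  "Yes", "YES"),
    ("Electronics", "Sales",       "High", "Yes", "NO"),
    ("Auto",        "Engineering", "Low",  "No",  "YES"),
    ("Electronics", "Sales",       "High", "Yes", "NO"),
]
ATTRIBUTES = ["Industry", "Job Type", "Income", "Previous Customer"]


def entropy(rows):
    """Shannon entropy H(S) of the Class column (last field of each row)."""
    counts = Counter(row[-1] for row in rows)
    n = len(rows)
    return -sum((c / n) * log2(c / n) for c in counts.values())


def information_gain(rows, attr):
    """Gain(S, A) = H(S) - sum over values v of (|S_v|/|S|) * H(S_v)."""
    n = len(rows)
    remainder = 0.0
    for value in {row[attr] for row in rows}:
        subset = [row for row in rows if row[attr] == value]
        remainder += (len(subset) / n) * entropy(subset)
    return entropy(rows) - remainder


def build_tree(rows, attrs):
    """Recursive ID3-style split: pick the highest-gain attribute at each node."""
    labels = {row[-1] for row in rows}
    if len(labels) == 1:          # pure node -> leaf with that outcome
        return labels.pop()
    if not attrs:                 # no attributes left -> majority vote
        return Counter(row[-1] for row in rows).most_common(1)[0][0]
    best = max(attrs, key=lambda a: information_gain(rows, a))
    node = {"attribute": ATTRIBUTES[best],
            "H": round(entropy(rows), 4),
            "gain": round(information_gain(rows, best), 4),
            "branches": {}}
    for value in sorted({row[best] for row in rows}):
        subset = [row for row in rows if row[best] == value]
        node["branches"][value] = build_tree(subset, [a for a in attrs if a != best])
    return node


if __name__ == "__main__":
    print(f"H(S) = {entropy(DATA):.4f}")
    for i, name in enumerate(ATTRIBUTES):
        print(f"Gain(S, {name}) = {information_gain(DATA, i):.4f}")
    import pprint
    pprint.pprint(build_tree(DATA, list(range(len(ATTRIBUTES)))))
```

Running the script prints \(\mathrm{H}(S)\), the four gain values, and a nested-dictionary view of the tree: the attribute with the largest gain becomes the root split, and the same entropy/gain computation is repeated on each subset to fill in the lower nodes asked for in part (b).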