Question: We can evaluate our model even further by running the command table(round(predict(model, train.data, type="response")), train.data$chf), which will provide us with a confusion matrix. How do we interpret this matrix and use it to compute the classification accuracy?

We can evaluate our model even further by running the command table(round(predict(model, train.data, type="response")), train.data$chf), which will provide us with the confusion matrix shown in Figure 5 below. A confusion matrix cross-tabulates the predicted classifications against the actual classes of the records in the data. In Figure 5, the main diagonal (top left to bottom right) gives the number of correctly classified instances, 211 + 47 = 258. The off-diagonal (top right to bottom left) gives the number of incorrectly classified instances, 62 + 32 = 94. Together the two totals account for all 352 instances in the training set. With this information we can obtain the classification accuracy of the model: the diagonal sum divided by the total number of instances, 258/352 ≈ 0.733, or about 73%.
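The arithmetic above can be sketched directly in R. This is a minimal illustration, not the original analysis: the counts are the hypothetical values read off Figure 5, and the matrix layout assumed here follows table(predicted, actual), i.e. rows are rounded predictions and columns are the actual chf values.

```r
# Confusion-matrix counts as reported in Figure 5 (assumed layout:
# rows = rounded predictions, columns = actual chf class).
cm <- matrix(c(211, 32,    # predicted 0: actual 0, actual 1
               62, 47),    # predicted 1: actual 0, actual 1
             nrow = 2, byrow = TRUE,
             dimnames = list(predicted = c("0", "1"),
                             actual    = c("0", "1")))

correct   <- sum(diag(cm))         # 211 + 47 = 258 correct classifications
incorrect <- sum(cm) - correct     # 62 + 32 = 94 misclassifications
accuracy  <- correct / sum(cm)     # 258 / 352, about 0.733

print(cm)
cat("accuracy:", round(accuracy, 4), "\n")
```

On the real data the same quantities come from the table() call itself: sum(diag(tab)) / sum(tab), where tab is the confusion matrix returned by table(round(predict(model, train.data, type="response")), train.data$chf).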
