Question: Consider this scenario: the loss function during a training process keeps decreasing for the training set, but it doesn't decrease at all for the testing set. Any guess why?
A. Overfitting
B. Underfitting
C. The training set is not a good representative of the whole dataset
D. The selected algorithm is not working properly
Step by Step Solution

There are 3 steps involved:

Step 1: The training loss keeps decreasing, so the optimization is working and the model fits the training data better and better. This rules out underfitting (B) and a malfunctioning algorithm (D).

Step 2: The test loss does not decrease at the same time, which means those improvements do not generalize. The model is increasingly memorizing patterns, including noise, that are specific to the training set.

Step 3: A widening gap between training and test performance is the classic signature of overfitting, so the correct answer is A, Overfitting. A non-representative training set (C) can also hurt test performance, but the pattern described here, training loss falling while test loss stalls, is the textbook symptom of overfitting.
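As a minimal sketch (not part of the original question) of how this symptom shows up in practice, the toy NumPy example below fits polynomials of increasing degree to noisy data; the dataset, noise level, degrees, and train/test split are all made up for illustration.

```python
# Toy illustration (assumed setup, not from the question): as model capacity
# grows, training error keeps shrinking while held-out error stops improving
# or gets worse -- the overfitting pattern described in the answer.
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a simple underlying function, split into train and test sets.
x = rng.uniform(-1.0, 1.0, size=60)
y = np.sin(np.pi * x) + rng.normal(scale=0.3, size=x.shape)
x_train, y_train = x[:20], y[:20]
x_test, y_test = x[20:], y[20:]

for degree in (1, 3, 9, 15):
    # Higher-degree polynomials have enough capacity to memorize training noise.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree={degree:2d}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")

# Typical pattern: train MSE falls monotonically with degree, while test MSE
# flattens out or rises, mirroring "training loss decreases, test loss doesn't".
```

The same diagnosis carries over to neural-network training: plotting training and validation loss per epoch and watching them diverge is the standard way to spot overfitting early.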
