Question:
Why does increasing the regularization parameter (lambda) sometimes increase and sometimes decrease the training error, while it always increases the test error?
While computing the training and test error for different lambda values in the regularization equation, I observed that the training error kept decreasing as lambda increased until, past a certain value, it started increasing again. The test error, on the other hand, kept increasing as lambda increased. I want a conceptual explanation for these results.
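The experiment described above can be reproduced with a minimal sketch, assuming ridge (L2) regularization on synthetic linear data; the data, dimensions, and noise level here are illustrative assumptions, not the asker's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear data: y = X w + noise (illustrative, not the asker's data)
n_train, n_test, d = 30, 200, 20
w_true = rng.normal(size=d)
X_train = rng.normal(size=(n_train, d))
X_test = rng.normal(size=(n_test, d))
y_train = X_train @ w_true + rng.normal(scale=2.0, size=n_train)
y_test = X_test @ w_true + rng.normal(scale=2.0, size=n_test)

def ridge_fit(X, y, lam):
    """Closed-form ridge solution: w = (X^T X + lam * I)^{-1} X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def mse(X, y, w):
    """Mean squared error of predictions X w against targets y."""
    return float(np.mean((X @ w - y) ** 2))

# Sweep lambda and report training vs. test error
for lam in [0.0, 0.1, 1.0, 10.0, 100.0]:
    w = ridge_fit(X_train, y_train, lam)
    print(f"lambda={lam:7.1f}  train MSE={mse(X_train, y_train, w):8.3f}  "
          f"test MSE={mse(X_test, y_test, w):8.3f}")
```

On data like this, the sweep makes the conceptual point visible: the training error of the ridge fit can only go up as lambda grows (the penalty restricts how well the model can fit the training set), while the test error typically traces a U-shape, so whether it rises or falls depends on where lambda sits relative to the optimum.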