Question: Suppose f1 is a model that optimally fits the data (X, y), and f2 is another model that optimally fits the data (X2, y), where X2 consists of the quadratic features of X. Then the loss function value obtained by f2 is always at most equal to that for f1 (i.e., it can never be larger). Try to come up with a solid mathematical argument that justifies this claim.

CONTEXT: This is based on a question where I had to apply a Logistic Regression machine learning algorithm to a dataset and to its quadratic and cubic feature transformations, and the loss function value decreases as the degree d increases. However, for a large number of iterations this doesn't happen.

Step by Step Solution
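One standard way to justify the claim is a nesting argument: any model on the original features can be embedded in the expanded feature space by zero-padding its weights, so the best model over the larger feature set can only do as well or better. A minimal sketch in LaTeX, assuming the quadratic feature map keeps the original features and both models minimize the same loss (the symbols φ, w, L1, L2 below are my notation, not from the question):

```latex
% Sketch of the nesting argument (notation is assumed, not from the question).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $\phi(x)$ denote the quadratic feature map, assumed to contain the
original coordinates of $x$, so $X_2 = \phi(X)$.  For any weight vector
$w \in \mathbb{R}^{p}$ on the original features, zero-pad it to
$\tilde{w} = (w, 0) \in \mathbb{R}^{p+q}$ on the expanded features.  Then
$\tilde{w}^{\top}\phi(x_i) = w^{\top}x_i$ for every sample, so both models
make identical predictions and incur identical loss:
\[
  L_2(\tilde{w})
  = \frac{1}{n}\sum_{i=1}^{n} \ell\bigl(\tilde{w}^{\top}\phi(x_i),\, y_i\bigr)
  = \frac{1}{n}\sum_{i=1}^{n} \ell\bigl(w^{\top}x_i,\, y_i\bigr)
  = L_1(w).
\]
Applying this to the minimizer $w_1^{*}$ that defines $f_1$,
\[
  \min_{v} L_2(v) \;\le\; L_2\bigl((w_1^{*}, 0)\bigr)
  = L_1(w_1^{*}) = \min_{w} L_1(w),
\]
i.e.\ the optimal loss of $f_2$ never exceeds that of $f_1$.
\end{document}
```

The same argument applies to the cubic case, since each degree-d feature set is contained in the degree-(d+1) set.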

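The inequality can also be checked numerically, which connects to the CONTEXT above. Below is a sketch using scikit-learn (the synthetic dataset, C=1e6, and max_iter values are my illustrative choices, not part of the question); with near-zero regularization and enough optimizer iterations, the training log-loss should be non-increasing in the polynomial degree:

```python
# Numerical check of the nesting claim (illustrative sketch; the dataset
# and hyperparameters below are assumptions, not from the question).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss
from sklearn.preprocessing import PolynomialFeatures

# Small synthetic binary-classification dataset.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

for d in (1, 2, 3):
    # Degree-d polynomial expansion; degree 1 is just the original X,
    # and each degree-d feature set is nested inside the degree-(d+1) set.
    Xd = PolynomialFeatures(degree=d, include_bias=False).fit_transform(X)
    # Very large C ~= almost no regularization, so the fit targets the
    # plain log-loss that the nesting argument is about.
    clf = LogisticRegression(C=1e6, max_iter=10_000).fit(Xd, y)
    train_loss = log_loss(y, clf.predict_proba(Xd))
    print(f"degree {d}: training log-loss = {train_loss:.6f}")
```

If the observed loss stops decreasing (or even increases) as reported in the CONTEXT, the usual suspects are regularization (scikit-learn's LogisticRegression applies an L2 penalty by default, which breaks the zero-padding argument) or an optimizer that has not converged, since the nesting argument only applies to exact minimizers of the unregularized loss.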
