Question:

Background
So you have seen an overfitting problem in the previous task. Now it is time to fix it. Recall that our first regularization treatment is to add a penalty term to the loss function to keep the coefficients from growing too large. The model with a squared regularization term is called ridge regression, where "ridge" refers to the shape of the loss in high-dimensional space.

Requirement
1. Check the ridge regression model (https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.Ridge.html) and make sure you know how it runs.
2. Build the ridge model with different degrees of regularization (alpha = 1e-3, 1e-2, 1e-1) and plot the model output.

Task 5. Bayesian Curve-fitting

Background
Now let us use the Bayesian method to capture the uncertainty that exists in the data. This uncertainty comes from the random Gaussian noise we introduced in the data-generation function; such noise-driven uncertainty is called aleatoric uncertainty.

Requirement
1. Check the Bayesian regression documentation (https://scikit-learn.org/stable/modules/linear_model.html#bayesian-regression).
2. Fit the model with the same polynomial data and plot the model prediction, similar to Figure 1.17. Now you can see that your prediction captures uncertainty as well.
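A minimal sketch of the ridge requirement is below. It assumes the earlier task generated data as sin(2πx) plus Gaussian noise and used a degree-9 polynomial basis; the sample size, noise scale, and degree are assumptions, so adjust them to match your own data-generation code.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge

# Assumed data generator: sin(2*pi*x) with additive Gaussian noise,
# 10 samples, noise scale 0.25 -- adjust to match your earlier task.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.25, size=x.shape)
x_plot = np.linspace(0, 1, 200)

plt.scatter(x, y, color="black", label="training data")
for alpha in (1e-3, 1e-2, 1e-1):
    # Degree-9 polynomial basis followed by L2-penalized (ridge) regression.
    model = make_pipeline(PolynomialFeatures(degree=9), Ridge(alpha=alpha))
    model.fit(x[:, None], y)
    plt.plot(x_plot, model.predict(x_plot[:, None]), label=f"alpha = {alpha:g}")
plt.plot(x_plot, np.sin(2 * np.pi * x_plot), linestyle="--", label="true function")
plt.legend()
plt.show()
```

Smaller alpha values leave the fit close to unregularized least squares, while larger values shrink the coefficients and smooth the curve, which is the behaviour the task asks you to compare.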

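For Task 5, a minimal sketch using sklearn.linear_model.BayesianRidge (one of the Bayesian regression models covered in the linked documentation) on the same assumed data and degree-9 polynomial features is below; passing return_std=True to predict gives a predictive standard deviation that can be drawn as an uncertainty band around the mean, similar to Figure 1.17.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import BayesianRidge

# Same assumed data generator and degree-9 polynomial basis as above.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.25, size=x.shape)

poly = PolynomialFeatures(degree=9)
X = poly.fit_transform(x[:, None])

model = BayesianRidge()
model.fit(X, y)

x_plot = np.linspace(0, 1, 200)
X_plot = poly.transform(x_plot[:, None])
# return_std=True yields the predictive standard deviation, combining the
# estimated noise level with the uncertainty in the fitted coefficients.
mean, std = model.predict(X_plot, return_std=True)

plt.scatter(x, y, color="black", label="training data")
plt.plot(x_plot, mean, label="predictive mean")
plt.fill_between(x_plot, mean - std, mean + std, alpha=0.3, label="± 1 std")
plt.plot(x_plot, np.sin(2 * np.pi * x_plot), linestyle="--", label="true function")
plt.legend()
plt.show()
```

The shaded band reflects both the estimated noise level and the remaining uncertainty in the fitted coefficients, which is how the plot "captures uncertainty" in the sense the task describes.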
