Question: Regularization is added to gradient descent by incurring a cost from the solved weights during each iteration. Before you can implement gradient descent, you need a way to track your performance, i.e., a function that computes $J(\theta)$. This cost was computed by a function called computeCost.m in hwk2. Copy computeCost.m from hwk2 to an m-file called computeCostReg.m. Update the cost function in computeCostReg.m from:

$$J(\theta) = \frac{1}{2n} \left[ \sum_{i=1}^{n} \left( h(\theta, x^{(i)}) - y^{(i)} \right)^2 \right]$$

to:

$$J(\theta) = \frac{1}{2n} \left[ \sum_{i=1}^{n} \left( h(\theta, x^{(i)}) - y^{(i)} \right)^2 + \lambda \sum_{j=1}^{D} \theta_j^2 \right]$$

Note that we do not penalize the $\theta_0$ weight. Test your code with the following MATLAB segment. Do not continue until your answer is 2.1740e+09. Then show your computeCostReg.m.

clear; close all;

% Load Data
data = load('..\hwk2\ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);

% Scale features and set them to zero mean with std = 1
[Xnorm, mu, sigma] = featureNormalize(X);  % reuse this function from hwk2

% Add intercept term to X
Xdata = [ones(length(X), 1) Xnorm];

% Init theta and lambda
theta = ((Xdata' * Xdata) \ Xdata') * y;  % well..this is the optimal solution
lambda = 1;

% Run Compute Cost
disp(computeCostReg(Xdata, y, theta, lambda))

Ans: (show computeCostReg.m)
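A minimal sketch of what computeCostReg.m could look like, implementing the regularized cost above with the four-argument signature used in the test segment; the helper variable names n, h, sqErr, and penalty are illustrative, not from the assignment:

function J = computeCostReg(X, y, theta, lambda)
% COMPUTECOSTREG Regularized cost for linear regression.
%   Computes J(theta) = (1/(2n)) * [sum of squared errors + lambda * sum(theta_j^2)],
%   where the regularization sum over theta_j skips the intercept weight theta_0.

n = length(y);                               % number of training examples
h = X * theta;                               % hypothesis h(theta, x) for every example
sqErr = sum((h - y) .^ 2);                   % sum of squared errors
penalty = lambda * sum(theta(2:end) .^ 2);   % theta(1) holds theta_0, so it is not penalized
J = (1 / (2 * n)) * (sqErr + penalty);

end

Run the test segment above with this file on the MATLAB path; with the normal-equation theta and lambda = 1, it should display the expected 2.1740e+09, provided featureNormalize matches the hwk2 implementation.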
