Question: Manually run the linear regression process by computing the cost function and the gradient descent update for the weights for one iteration.

Data set: [(1,1), (4,2)]

Initial weights: w_0 = 5, w_1 = -5

[Further explanation:

This means that in the first data point x = 1 and y = 1, and in the second data point x = 4 and y = 2. So, you have two points in the plane.

Linear regression is designed to find a line of best fit (a line that is as close as possible to all the points simultaneously).

If you compare with the notebook from the linear regression lecture, you have n = 1 (just one feature x) and m = 2 (just two data points).

The line that you'll find is y = w_0 + w_1 x

]

Q1. Compute the cost at the current step. (Acceptable error: 0.1)
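The lecture notebook's exact cost convention isn't shown here; assuming the common half-mean-squared-error convention J = (1/2m) * sum((h(x) - y)^2) with hypothesis h(x) = w_0 + w_1*x, the cost at the initial weights can be checked with a short sketch:

```python
# Hypothesis: h(x) = w0 + w1*x.
# Cost assumed to be half-MSE: J = 1/(2m) * sum((h(x) - y)^2).
# (If your notebook uses plain MSE, the result is doubled: 145.)
data = [(1, 1), (4, 2)]
w0, w1 = 5, -5

m = len(data)
cost = sum((w0 + w1 * x - y) ** 2 for x, y in data) / (2 * m)
# Errors are (5 - 5) - 1 = -1 and (5 - 20) - 2 = -17,
# so cost = (1 + 289) / 4 = 72.5
print(cost)  # 72.5
```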

Q2. Apply gradient descent once to compute the updated value of w_0.

Data set: [(1,1), (4,2)]

Initial weights: w_0 = 5, w_1 = -5

Learning rate alpha = 0.1

Recall that

w_i = w_i - alpha*(dcost/dw_i)
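Under the half-MSE convention assumed above, the partial derivative with respect to w_0 is (1/m) * sum(h(x) - y), so the w_0 update for one step can be sketched as:

```python
# One gradient descent step for w0, assuming J = 1/(2m) * sum((h(x)-y)^2),
# so that dJ/dw0 = 1/m * sum(h(x) - y).
data = [(1, 1), (4, 2)]
w0, w1 = 5, -5
alpha = 0.1
m = len(data)

grad_w0 = sum((w0 + w1 * x - y) for x, y in data) / m  # (-1 + -17)/2 = -9
w0_new = w0 - alpha * grad_w0  # 5 - 0.1*(-9) = 5.9
print(w0_new)  # 5.9
```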

Q3. Apply the gradient descent method once to compute the updated w_1.

Data set: [(1,1), (4,2)]

Initial weights: w_0 = 5, w_1 = -5

Learning rate alpha = 0.1
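The w_1 update works the same way, except the slope term picks up a factor of x: under the assumed half-MSE convention, dJ/dw_1 = (1/m) * sum((h(x) - y) * x). A minimal sketch:

```python
# One gradient descent step for w1, assuming J = 1/(2m) * sum((h(x)-y)^2),
# so that dJ/dw1 = 1/m * sum((h(x) - y) * x).
data = [(1, 1), (4, 2)]
w0, w1 = 5, -5
alpha = 0.1
m = len(data)

grad_w1 = sum((w0 + w1 * x - y) * x for x, y in data) / m
# = (-1*1 + -17*4) / 2 = -69/2 = -34.5
w1_new = w1 - alpha * grad_w1  # -5 - 0.1*(-34.5) = -1.55
print(w1_new)  # -1.55
```

Note that w_0 and w_1 should be updated simultaneously from the same old weights, not sequentially.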

Q4. What will be the stable value of w_0 (to two decimal places) if you let the algorithm run for many iterations (say 10,000 or more)?

Data set: [(1,1), (4,2)]

Initial weights: w_0 = 5, w_1 = -5
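With only two data points, the best-fit line passes through both of them exactly, so gradient descent should converge to that line. The long-run behavior can be checked by simply iterating the update (this sketch again assumes the half-MSE cost and alpha = 0.1):

```python
# Run gradient descent until (practical) convergence and inspect the weights.
# With two points, the minimizer is the line through (1,1) and (4,2):
# slope 1/3, intercept 2/3.
data = [(1, 1), (4, 2)]
w0, w1 = 5.0, -5.0
alpha = 0.1
m = len(data)

for _ in range(10000):
    g0 = sum((w0 + w1 * x - y) for x, y in data) / m
    g1 = sum((w0 + w1 * x - y) * x for x, y in data) / m
    w0, w1 = w0 - alpha * g0, w1 - alpha * g1  # simultaneous update

print(round(w0, 2), round(w1, 2))  # 0.67 0.33
```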
