Question:

Exercise 3
Now write a function that calculates the slope coefficient for the linear regression model. Using calculus, you can show that the following value minimizes SSR(y, x, \beta_0, \beta_1) with respect to \beta_1:

\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (y_i - \bar{y})(x_i - \bar{x})}{\sum_{i=1}^{n} (x_i - \bar{x})^2},

which is called the ordinary least squares (OLS) estimator. Write a function that performs this calculation called ols_slope(y, x).

Exercise 4
Now write a function ols_intercept(y, x, beta_1_hat) that calculates the intercept coefficient for the linear regression model. Given the slope coefficient, the intercept can be calculated with

\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}.

Exercise 5
Write a function ssr(y, x, beta_0, beta_1) that calculates the sum of squared residuals for the linear regression model:

SSR(y, x, \beta_0, \beta_1) = \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2.

You can use the function ssr_loops(y, x, beta_0, beta_1) from Assignment 4 (including the solutions) as a template.

Exercise 6
Now find the values of beta_0 and beta_1 that minimize ssr(y, x, beta_0, beta_1) for given x and y. Write a function min_ssr(y, x, beta_0_min, beta_0_max, beta_1_min, beta_1_max, step) as follows:

a) Find these values by evaluating ssr(y, x, beta_0, beta_1) over every combination of (\beta_0, \beta_1) taken from two lists.
b) Create lists beta_0_list and beta_1_list from the ranges \beta_0 = \beta_{0,\min}, \ldots, \beta_{0,\max} and \beta_1 = \beta_{1,\min}, \ldots, \beta_{1,\max}, where neighboring values of \beta_0 or \beta_1 are separated by a distance of step.
c) Start with min_SSR = 999999. Loop over the index numbers i and j corresponding to the lists beta_0_list and beta_1_list.
d) For each pair of i and j, extract the values beta_0_list[i] and beta_1_list[j].
e) For each pair of i and j, evaluate ssr(y, x, beta_0, beta_1). If it is lower than min_SSR, record the new i_min = i and j_min = j and update min_SSR to this new value.
f) After the loops, return [beta_0_list[i_min], beta_1_list[j_min]].
g) Verify that the result matches the values from Exercises 3 and 4 (up to accuracy step).
Step by Step Solution
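The exercise statements pin down the formulas but not the implementation details, so the following are minimal sketches rather than the official assignment solutions. For Exercise 3, a plain-Python version of ols_slope(y, x) (assuming y and x are equal-length lists of numbers, with no NumPy) might look like this:

def ols_slope(y, x):
    """OLS slope: sum((y_i - y_bar)(x_i - x_bar)) / sum((x_i - x_bar)^2)."""
    n = len(y)
    y_bar = sum(y) / n
    x_bar = sum(x) / n
    # Numerator: cross-deviations of y and x; denominator: squared deviations of x.
    num = sum((y[i] - y_bar) * (x[i] - x_bar) for i in range(n))
    den = sum((x[i] - x_bar) ** 2 for i in range(n))
    return num / den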
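For Exercise 4, the intercept follows directly from the slope estimate; a matching sketch (again assuming plain lists) is:

def ols_intercept(y, x, beta_1_hat):
    """OLS intercept: beta_0_hat = y_bar - beta_1_hat * x_bar."""
    y_bar = sum(y) / len(y)
    x_bar = sum(x) / len(x)
    return y_bar - beta_1_hat * x_bar

Calling ols_slope(y, x) first and passing the result to ols_intercept(y, x, beta_1_hat) recovers both coefficients.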
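For Exercise 5, the ssr_loops function from Assignment 4 is not reproduced here, but a loop-based ssr(y, x, beta_0, beta_1) consistent with the formula in the prompt could be:

def ssr(y, x, beta_0, beta_1):
    """Sum of squared residuals for the line beta_0 + beta_1 * x."""
    total = 0.0
    for i in range(len(y)):
        residual = y[i] - beta_0 - beta_1 * x[i]
        total += residual ** 2
    return total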

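For Exercise 6, one way to follow steps a)-g) is the grid search below, reusing the ssr sketch above. The prompt does not say how the candidate lists are built from step, so the rounding used to count grid points is an assumption; the 999999 starting value is taken from step c), although float('inf') would be a safer sentinel if the SSR could exceed it.

def min_ssr(y, x, beta_0_min, beta_0_max, beta_1_min, beta_1_max, step):
    """Grid search for the (beta_0, beta_1) pair minimizing ssr(y, x, ...)."""
    # Candidate values spaced by `step`; the point count is rounded to limit float drift (assumption).
    n_0 = int(round((beta_0_max - beta_0_min) / step)) + 1
    n_1 = int(round((beta_1_max - beta_1_min) / step)) + 1
    beta_0_list = [beta_0_min + k * step for k in range(n_0)]
    beta_1_list = [beta_1_min + k * step for k in range(n_1)]

    min_SSR = 999999  # large starting value, as in step c)
    i_min, j_min = 0, 0
    for i in range(len(beta_0_list)):
        for j in range(len(beta_1_list)):
            # Step d): extract the candidate pair, then step e): evaluate and compare.
            beta_0 = beta_0_list[i]
            beta_1 = beta_1_list[j]
            current = ssr(y, x, beta_0, beta_1)
            if current < min_SSR:
                min_SSR = current
                i_min, j_min = i, j
    # Step f): return the best pair found on the grid.
    return [beta_0_list[i_min], beta_1_list[j_min]]

For step g), the returned pair should agree with ols_slope(y, x) and ols_intercept(y, x, ols_slope(y, x)) to within step, provided the true minimizers lie inside the chosen grid.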