Question: (7 points) In Lecture 12, we viewed both the simple linear regression model and the multiple linear regression model through the lens of linear algebra. The key geometric insight was that if we train a model on some design matrix $X$ and true response vector $Y$, our predicted response $\hat{Y} = X\hat{\theta}$ is the vector in $\text{span}(X)$ that is closest to $Y$.
In the simple linear regression case, our optimal vector is $\hat{\theta} = [\hat{\theta}_0,\ \hat{\theta}_1]^\top$, and our design matrix is

$$X = \begin{bmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix} = \begin{bmatrix} \mathbb{1}_n & X_{:,1} \end{bmatrix}$$

This means we can write our predicted response vector as

$$\hat{Y} = X\hat{\theta} = \hat{\theta}_0 \mathbb{1}_n + \hat{\theta}_1 X_{:,1}$$

In this problem, $\mathbb{1}_n$ is the $n$-vector of all $1$s, and $X_{:,1}$ refers to the $n$-length vector $[x_1\ x_2\ \cdots\ x_n]^\top$. Note, $X_{:,1}$ is a feature, not an observation.
For this problem, assume we are working with the simple linear regression model,
though the properties we establish here hold for any linear regression model that contains
an intercept term.
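As a minimal numerical sketch of the setup above (the data values and parameter vector here are illustrative, not from the problem), the product $X\hat{\theta}$ is the same vector as the weighted column sum $\hat{\theta}_0 \mathbb{1}_n + \hat{\theta}_1 X_{:,1}$:

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 7.0])      # feature column X_{:,1} (made-up values)
ones = np.ones_like(x)                  # the all-ones column 1_n
X = np.column_stack([ones, x])          # design matrix [1_n, X_{:,1}]
theta_hat = np.array([0.5, 2.0])        # an arbitrary parameter vector [theta_0, theta_1]

# Matrix-vector product vs. weighted sum of columns: both give Y_hat.
Y_hat = X @ theta_hat
assert np.allclose(Y_hat, theta_hat[0] * ones + theta_hat[1] * x)
```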
(a) Explain why $\sum_{i=1}^{n} e_i = 0$ using a geometric property. Hint: $e = Y - \hat{Y}$, and $e = [e_1\ e_2\ \cdots\ e_n]^\top$. Think about how orthogonality applies here.
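The property in part (a) can be checked numerically (a sketch with synthetic data, not the problem's intended derivation): because the least-squares residual $e$ is orthogonal to every column of $X$, including $\mathbb{1}_n$, we have $\mathbb{1}_n^\top e = \sum_i e_i = 0$ up to floating-point error.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=20)
Y = 3.0 + 2.0 * x + rng.normal(size=20)   # synthetic data; coefficients are arbitrary

X = np.column_stack([np.ones_like(x), x])          # design matrix [1_n, X_{:,1}]
theta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)  # least-squares parameter estimate
e = Y - X @ theta_hat                              # residual vector e = Y - Y_hat

print(e.sum())  # ~0, up to floating-point error
```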