Can I get help with parts (a) and (c), please? Thanks in advance.

1. [Derivation/Conceptual] Consider the simple linear regression model $Y_i \sim N(\beta_0 + \beta_1 x_i,\; \sigma^2)$, for $i = 1, 2, \ldots, n$, where all the $Y_i$'s are independent r.v.'s and the $x_i$'s are known constants. Recall the fitted values are defined as $\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$, where $\hat{\beta}_0$ and $\hat{\beta}_1$ are the least squares estimates of the parameters based on the observed data $(x_1, y_1), \ldots, (x_n, y_n)$.

(a) Show that $\sum_{i=1}^{n} (\hat{y}_i - \bar{y})(y_i - \hat{y}_i) = 0$.
[ Hint: One approach is to substitute the fitted values $\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$, then $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$, and simplify. ]

(b) Using the result in (a), show that the following so-called "sum of squares decomposition" holds:
$$\underbrace{\sum_{i=1}^{n} (y_i - \bar{y})^2}_{\text{SS(Total)}} \;=\; \underbrace{\sum_{i=1}^{n} (\hat{y}_i - \bar{y})^2}_{\text{SS(Reg)}} \;+\; \underbrace{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}_{\text{SS(Res)}}$$
Note that SS(Res) is the sum of squared residuals as defined in class, and we shall define SS(Total) and SS(Reg) as shown in this equation (we'll discuss the significance of this decomposition later).
[ Hint: Add and subtract $\hat{y}_i$ in the SS(Total) term. ]

(c) The "sample variance of the responses $y_1, \ldots, y_n$" and the "estimated error variance ($\hat{\sigma}^2$)" are two different quantities. Explain, in a couple of sentences, how these two quantities are related to terms seen in the sum of squares decomposition in (b).
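The identities in (a) and (b) can be checked numerically before attempting the algebraic derivation. The sketch below uses simulated data (the data, true parameters, and noise level are all made up for illustration); it fits the least squares line with the closed-form formulas from the problem and then evaluates the cross term and the three sums of squares. It also computes the two quantities from part (c): the sample variance of the responses, which is SS(Total)/(n−1), and the estimated error variance, which is SS(Res)/(n−2) under this model.

```python
import numpy as np

# Simulated data for illustration only (not from the course).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 30)
y = 2.0 + 1.5 * x + rng.normal(scale=1.0, size=x.size)
n = x.size

# Closed-form least squares estimates for simple linear regression.
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
yhat = b0 + b1 * x  # fitted values

# Part (a): the cross term should be zero (up to floating-point error).
cross = np.sum((yhat - y.mean()) * (y - yhat))
print("cross term:", cross)

# Part (b): SS(Total) = SS(Reg) + SS(Res).
ss_total = np.sum((y - y.mean()) ** 2)
ss_reg = np.sum((yhat - y.mean()) ** 2)
ss_res = np.sum((y - yhat) ** 2)
print("SS(Total) - [SS(Reg) + SS(Res)]:", ss_total - (ss_reg + ss_res))

# Part (c): the two variance-like quantities, built from the decomposition.
sample_var_y = ss_total / (n - 1)   # sample variance of the responses
sigma2_hat = ss_res / (n - 2)       # estimated error variance
print("sample variance of y:", sample_var_y)
print("estimated error variance:", sigma2_hat)
```

Note this only verifies the identities on one dataset; the exercise asks for an algebraic proof, for which the hints (substituting $\hat{y}_i$ and $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$) are the intended route.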
