Question:

1. [Derivation/Conceptual] Consider the simple linear regression model Y_i ~ N(β₀ + β₁x_i, σ²), for i = 1, 2, ..., n, where all the Y_i's are independent random variables and the x_i's are known constants. Recall that the fitted values are defined as ŷ_i = β̂₀ + β̂₁x_i, where β̂₀ and β̂₁ are the least squares estimates of the parameters based on the observed data (x₁, y₁), ..., (x_n, y_n).

(a) Show that Σ_{i=1}^{n} (ŷ_i − ȳ)(y_i − ŷ_i) = 0. [Hint: One approach is to substitute the fitted values ŷ_i = β̂₀ + β̂₁x_i, then β̂₀ = ȳ − β̂₁x̄, and simplify.]

(b) Using the result in (a), show that the following so-called "sum of squares decomposition" holds:

    Σ_{i=1}^{n} (y_i − ȳ)²  =  Σ_{i=1}^{n} (ŷ_i − ȳ)²  +  Σ_{i=1}^{n} (y_i − ŷ_i)²
        SS(Total)                  SS(Reg)                    SS(Res)

Note that SS(Res) is the sum of squared residuals as defined in class, and we shall define SS(Total) and SS(Reg) as shown in this equation (we'll discuss the significance of this decomposition later). [Hint: Add and subtract ŷ_i in the SS(Total) term.]

(c) The "sample variance of the responses y₁, ..., y_n" and the "estimated error variance (σ̂²)" are two different quantities. Explain, in a couple of sentences, how these two quantities are related to the terms in the sum of squares decomposition in (b).
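While parts (a) and (b) call for an algebraic proof, the identities can also be checked numerically. The sketch below (illustrative only; the data are simulated, not from the problem) computes the least squares estimates by hand and confirms that the cross term from (a) vanishes and that SS(Total) = SS(Reg) + SS(Res):

```python
import numpy as np

# Simulated data for illustration (arbitrary true parameters).
rng = np.random.default_rng(0)
n = 20
x = np.linspace(0.0, 10.0, n)
y = 2.0 + 1.5 * x + rng.normal(0.0, 1.0, n)

# Least squares estimates: b1 = Sxy / Sxx, b0 = ybar - b1 * xbar.
xbar, ybar = x.mean(), y.mean()
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
b0 = ybar - b1 * xbar
yhat = b0 + b1 * x  # fitted values

# Part (a): the cross term should be zero up to floating-point error.
cross = np.sum((yhat - ybar) * (y - yhat))

# Part (b): sum of squares decomposition.
ss_total = np.sum((y - ybar) ** 2)
ss_reg = np.sum((yhat - ybar) ** 2)
ss_res = np.sum((y - yhat) ** 2)

print(abs(cross) < 1e-8)                       # cross term ≈ 0
print(abs(ss_total - (ss_reg + ss_res)) < 1e-8)  # decomposition holds
```

For (c), note that the sample variance of the responses is SS(Total)/(n − 1), while the usual estimated error variance is σ̂² = SS(Res)/(n − 2), so the check above also relates the two quantities through the decomposition.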
