4. [17 pts] Consider a weighted multiple linear regression model with n observations and p predictors, Y = Xβ + ε, where Y = (Y_1, ..., Y_n)ᵀ is the n × 1 vector of responses; X is the n × (p + 1) matrix of predictors, including a column of 1's for the intercept; and ε = (ε_1, ..., ε_n)ᵀ is the n × 1 vector of statistical errors. Here the ε_i's are independent normal random variables with mean 0 and variance σ²/w_i. The weights {w_i : i = 1, ..., n} are known. Let Ŷ = (Ŷ_1, ..., Ŷ_n)ᵀ denote the fitted values from the WLS estimation and e = Y − Ŷ denote the corresponding residuals. (Note: please clearly define new notations if needed, and show all the derivation steps for the following questions.)

(a) [4 pts] Derive the specific form of Var(e | X).

(b) [4 pts] Derive the specific form of Cov(e, Ŷ | X).

(c) [4 pts] Conditional on X, write the likelihood function of the observed responses, and derive the specific form of the maximum likelihood estimator of σ². Hint: recall that the probability density function of a normal random variable Z with mean a and variance b² is p(z) = (1/√(2πb²)) exp(−(z − a)²/(2b²)).

(d) [5 pts] For the considered WLS model, specify how the ridge regression estimator of β is defined (suppose the intercept is not penalized). Derive the specific form of this ridge regression estimator (suppose both the responses and predictors are standardized).
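For parts (a) and (b), the standard WLS results (with W = diag(w_1, ..., w_n) and hat matrix H = X(XᵀWX)⁻¹XᵀW) are Var(e | X) = σ²(W⁻¹ − X(XᵀWX)⁻¹Xᵀ) and Cov(e, Ŷ | X) = 0. The sketch below is not part of the original question; it is a Monte Carlo check of those two closed forms under illustrative simulated data (all numbers chosen arbitrarily).

```python
import numpy as np

# Simulate the WLS model Y = X beta + eps with Var(eps_i | X) = sigma^2 / w_i,
# and compare the closed-form Var(e | X) and Cov(e, Yhat | X) = 0 against
# Monte Carlo estimates. All constants here are illustrative assumptions.
rng = np.random.default_rng(0)
n, p, sigma2 = 8, 2, 1.5
X = np.column_stack([np.ones(n), rng.standard_normal((n, p))])  # intercept col
w = rng.uniform(0.5, 2.0, n)             # known weights
W = np.diag(w)
beta = rng.standard_normal(p + 1)

XtWX_inv = np.linalg.inv(X.T @ W @ X)
H = X @ XtWX_inv @ X.T @ W               # WLS hat matrix: Yhat = H Y

# Closed-form answer to (a): Var(e | X) = sigma^2 (W^{-1} - X (X'WX)^{-1} X')
var_e_theory = sigma2 * (np.diag(1.0 / w) - X @ XtWX_inv @ X.T)

# Monte Carlo: many response vectors drawn for the same fixed X
reps = 200_000
eps = rng.standard_normal((reps, n)) * np.sqrt(sigma2 / w)
Y = X @ beta + eps                       # each row is one simulated Y
e = Y @ (np.eye(n) - H).T                # residuals e = (I - H) Y
Yhat = Y @ H.T
var_e_mc = np.cov(e, rowvar=False)
cov_e_yhat_mc = (e - e.mean(0)).T @ (Yhat - Yhat.mean(0)) / (reps - 1)

print(np.abs(var_e_mc - var_e_theory).max())   # small simulation error
print(np.abs(cov_e_yhat_mc).max())             # near 0: e and Yhat uncorrelated
```

The second check reflects the key algebraic step in (b): HW⁻¹Hᵀ = X(XᵀWX)⁻¹Xᵀ = W⁻¹Hᵀ, so the cross-covariance cancels exactly.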

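For parts (c) and (d), maximizing the weighted normal likelihood gives the MLE σ̂² = (1/n) Σ w_i e_i² = (1/n) eᵀWe, and with standardized responses and predictors (so no intercept is needed) the weighted ridge estimator is β̂_ridge(λ) = (XᵀWX + λI)⁻¹XᵀWY. The snippet below is a hedged numerical sketch of these two formulas; λ and the simulated data are arbitrary illustrative choices, not from the question.

```python
import numpy as np

# Sketch (illustrative data): the MLE of sigma^2 from part (c) and the
# weighted ridge estimator from part (d), with standardized X and Y.
rng = np.random.default_rng(1)
n, p, lam = 200, 3, 2.0
Xs = rng.standard_normal((n, p))
Xs = (Xs - Xs.mean(0)) / Xs.std(0)       # standardized predictors
w = rng.uniform(0.5, 2.0, n)
W = np.diag(w)
Y = Xs @ np.array([1.0, -0.5, 0.25]) + rng.standard_normal(n) / np.sqrt(w)
Y = (Y - Y.mean()) / Y.std()             # standardized response

# Part (c): WLS fit, then sigma2_mle = (1/n) e'We
beta_wls = np.linalg.solve(Xs.T @ W @ Xs, Xs.T @ W @ Y)
e = Y - Xs @ beta_wls
sigma2_mle = (w * e**2).sum() / n

# Part (d): ridge estimator minimizing (Y - Xb)'W(Y - Xb) + lam * ||b||^2
beta_ridge = np.linalg.solve(Xs.T @ W @ Xs + lam * np.eye(p), Xs.T @ W @ Y)

print(sigma2_mle)                        # positive scalar
print(np.linalg.norm(beta_ridge) < np.linalg.norm(beta_wls))  # shrinkage
```

As a sanity check on the closed form, β̂_ridge sets the gradient of the penalized weighted criterion to zero: −XᵀW(Y − Xβ) + λβ = 0, and its norm is strictly smaller than the WLS solution's whenever λ > 0.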