Question: Need help on these questions, please.

[17 pts] Consider a weighted multiple linear regression model with n observations and p predictors, Y = Xβ + ε, where Y = (Y_1, ..., Y_n) is the n × 1 vector of responses; X is the n × (p + 1) matrix of predictors, including a column of 1's for the intercept; and ε = (ε_1, ..., ε_n) is the n × 1 vector of statistical errors. Here the ε_i's are independent normal random variables with mean 0 and variance σ²/w_i, and the weights {w_i : i = 1, ..., n} are known. Let Ŷ = (Ŷ_1, ..., Ŷ_n) denote the fitted values from the WLS estimation and ê = Y − Ŷ the corresponding residuals. (Note: please clearly define any new notation, and show all derivation steps for the following questions.)

(a) [4 pts] Derive the specific form of Var(ê | X).

(b) [4 pts] Derive the specific form of Cov(ê, Ŷ | X).

(c) [4 pts] Conditional on the X's, write the likelihood function of the observed responses, and derive the specific form of the maximum likelihood estimator of σ². Hint: recall that the probability density function of a normal random variable Z with mean a and variance b² is

    p(z) = 1/√(2πb²) · exp( −(z − a)² / (2b²) )

(d) [5 pts] For the considered WLS model, specify how the ridge regression estimator of β is defined (suppose the intercept is not penalized). Derive the specific form of this ridge regression estimator (suppose both the responses and predictors are standardized).
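Not the requested derivations, but a minimal numerical sketch that checks the quantities in (a)–(c), assuming the standard WLS estimator β̂ = (XᵀWX)⁻¹XᵀWY with W = diag(w_1, ..., w_n) and hat matrix H = X(XᵀWX)⁻¹XᵀW. The simulated data (n = 50, p = 3, the uniform weights, the true σ²) are illustrative choices only.

```python
import numpy as np

# Sketch for parts (a)-(c), assuming Var(Y | X) = sigma^2 * W^{-1} with
# W = diag(w_1, ..., w_n), and the WLS fit Y_hat = H Y, H = X (X'WX)^{-1} X'W.
rng = np.random.default_rng(0)
n, p = 50, 3
sigma2 = 2.0

X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])  # design with intercept column
w = rng.uniform(0.5, 2.0, size=n)                           # known weights (illustrative)
W = np.diag(w)

XtWX_inv = np.linalg.inv(X.T @ W @ X)
H = X @ XtWX_inv @ X.T @ W                                  # WLS hat matrix
I = np.eye(n)

# (a) Var(e_hat | X) = (I - H) sigma^2 W^{-1} (I - H)'
#     which simplifies to sigma^2 * (W^{-1} - X (X'WX)^{-1} X')
var_resid = (I - H) @ (sigma2 * np.linalg.inv(W)) @ (I - H).T
var_resid_closed = sigma2 * (np.linalg.inv(W) - X @ XtWX_inv @ X.T)
print(np.allclose(var_resid, var_resid_closed))             # True

# (b) Cov(e_hat, Y_hat | X) = (I - H) sigma^2 W^{-1} H' = 0
cov_resid_fit = (I - H) @ (sigma2 * np.linalg.inv(W)) @ H.T
print(np.allclose(cov_resid_fit, 0))                        # True

# (c) MLE of sigma^2 from the weighted normal likelihood:
#     sigma2_hat = (1/n) * sum_i w_i * e_hat_i^2
beta_true = rng.normal(size=p + 1)
Y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2 / w))
beta_hat = XtWX_inv @ X.T @ W @ Y
e_hat = Y - X @ beta_hat
sigma2_mle = np.sum(w * e_hat**2) / n
print(sigma2_mle)
```

The checks in (a) and (b) only use the covariance algebra of linear transformations of Y, so they hold for any positive weights; the simulation in (c) is just one draw to show how the weighted residual sum of squares enters the estimator.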

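For part (d), one common formulation (an assumption here, not necessarily the exact convention intended by the course) penalizes the weighted residual sum of squares with an L2 term on the slope coefficients only; when both the response and the predictors are standardized, the intercept drops out and the closed form is β̂_ridge = (XᵀWX + λI)⁻¹XᵀWY, with X now containing only the p standardized predictor columns. A short sketch under that assumption:

```python
import numpy as np

# Weighted ridge estimator, assuming standardization removes the intercept, so
# the criterion is (Y - Xb)' W (Y - Xb) + lam * ||b||^2 over the p slopes.
def weighted_ridge(X, Y, w, lam):
    """Closed-form minimizer (X'WX + lam*I)^{-1} X'WY; X has no intercept column."""
    W = np.diag(w)
    p = X.shape[1]
    return np.linalg.solve(X.T @ W @ X + lam * np.eye(p), X.T @ W @ Y)

# Illustrative usage with standardized data (all values arbitrary):
rng = np.random.default_rng(1)
n, p = 40, 4
X = rng.normal(size=(n, p))
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize predictors
Y = X @ rng.normal(size=p) + rng.normal(size=n)
Y = (Y - Y.mean()) / Y.std()               # standardize response
w = rng.uniform(0.5, 2.0, size=n)

print(weighted_ridge(X, Y, w, lam=1.0))
```

Note that whether the standardization itself should be weighted (so that the weighted intercept is exactly zero) depends on the convention the course uses; the sketch above uses plain unweighted standardization.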