Question: Suppose (X, Y) is a random vector. Consider a linear prediction model Y = a + bX + \varepsilon, where \varepsilon = Y - a - bX is the prediction error and a, b are constant coefficients. We minimize the expected squared prediction error with respect to the parameters a and b, i.e. we solve

\[
\min_{a,\,b} \; E\!\left[(Y - a - bX)^2\right].
\]

(a) Write down the first order conditions (FOC) for this optimization problem. Hint: the first order conditions are the two partial derivatives of $E[(Y - a - bX)^2]$ with respect to a and b, set equal to zero.

(b) Obtain the optimal regression coefficients $a^*$ and $b^*$ by solving the system of first order conditions. Hint: the optimal coefficients must be functions of the expectations, variances, and covariance of X and Y.

(c) Assume that (X, Y) is a bivariate normal vector. Compare $E(Y \mid X)$ with $a^* + b^*X$ obtained in the previous question.

Step by Step Solution

There are three steps involved, one for each of parts (a), (b), and (c).
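A minimal sketch of the three steps, using only the definitions given in the question and standard least-squares algebra (here $a^*$, $b^*$ denote the optimal coefficients):

Step 1 (First order conditions). Differentiating $Q(a,b) = E[(Y - a - bX)^2]$ and setting both partial derivatives to zero:
\[
\frac{\partial Q}{\partial a} = -2\,E[Y - a - bX] = 0,
\qquad
\frac{\partial Q}{\partial b} = -2\,E\!\left[X\,(Y - a - bX)\right] = 0.
\]

Step 2 (Solving the system). The first condition gives $a^* = E[Y] - b^*E[X]$. Substituting into the second condition and using $\operatorname{Cov}(X,Y) = E[XY] - E[X]E[Y]$ and $\operatorname{Var}(X) = E[X^2] - (E[X])^2$ yields
\[
b^* = \frac{\operatorname{Cov}(X,Y)}{\operatorname{Var}(X)},
\qquad
a^* = E[Y] - \frac{\operatorname{Cov}(X,Y)}{\operatorname{Var}(X)}\,E[X].
\]

Step 3 (Bivariate normal case). If $(X,Y)$ is bivariate normal, the conditional expectation of $Y$ given $X$ is linear in $X$:
\[
E(Y \mid X) = E[Y] + \frac{\operatorname{Cov}(X,Y)}{\operatorname{Var}(X)}\,\bigl(X - E[X]\bigr) = a^* + b^*X,
\]
so under bivariate normality the best linear predictor coincides with the conditional mean.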

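As a hypothetical illustration (not part of the original question), the closed-form coefficients can be checked numerically against an ordinary least-squares fit on simulated bivariate normal data; the mean vector and covariance matrix below are arbitrary example values:

import numpy as np

# Simulate a bivariate normal sample (example parameters, chosen for illustration).
rng = np.random.default_rng(0)
mean = [1.0, 2.0]
cov = [[2.0, 1.2],
       [1.2, 3.0]]
x, y = rng.multivariate_normal(mean, cov, size=200_000).T

# Closed-form coefficients: b* = Cov(X, Y) / Var(X), a* = E[Y] - b* E[X].
b_star = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
a_star = y.mean() - b_star * x.mean()

# Ordinary least squares fit for comparison (polyfit returns slope, then intercept).
b_ols, a_ols = np.polyfit(x, y, deg=1)

print(a_star, b_star)  # approx. (1.4, 0.6) for these example parameters
print(a_ols, b_ols)    # should agree with the line above up to sampling noise

With these example parameters, $b^* = 1.2/2.0 = 0.6$ and $a^* = 2.0 - 0.6 \cdot 1.0 = 1.4$, and the simulated OLS fit should reproduce both values up to sampling noise.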