Best Predictions
Let X, Y be two random variables which follow some joint distribution. Suppose we want to predict Y after observing X and, to keep things simple, we want to use a linear function of X, which can be denoted by a + bX. Let â, b̂ be such that they minimize E(Y − a − bX)² over all a, b ∈ ℝ. The prediction function â + b̂X is called the best linear predictor of Y based on X.
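
As a concrete illustration of this definition, here is a minimal NumPy sketch (the particular joint distribution of X and Y is an arbitrary choice for the example, not part of the question). It minimizes the sample analogue of E(Y − a − bX)² by ordinary least squares and compares the fitted coefficients with the moment-based formulas of the kind part (a) below asks for.

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary joint distribution, used only to illustrate the definition:
# X ~ N(1, 2^2) and Y = 3 + 0.5 * X + independent N(0, 1) noise.
n = 200_000
x = rng.normal(loc=1.0, scale=2.0, size=n)
y = 3.0 + 0.5 * x + rng.normal(size=n)

# Empirical best linear predictor: least squares minimizes the sample
# analogue of E(Y - a - bX)^2.
b_ls, a_ls = np.polyfit(x, y, deg=1)

# Moment-based formulas of the kind part (a) asks for:
b_mom = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
a_mom = y.mean() - b_mom * x.mean()

print(a_ls, b_ls)    # close to (3.0, 0.5)
print(a_mom, b_mom)  # agrees with the least-squares fit
```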
(a) Give an expression for â, b̂ in terms of the means, variances and covariance of X and Y.
(b) Is it true that E(Y − E(Y | X))² ≤ E(Y − a − bX)² for every a, b? Why or why not?
Now suppose X, Y are bivariate normal with 0 means, unit variances and correlation ρ.
(c) Find the conditional expectation E(Y | X). (1 point) Hint: find the conditional p.d.f. of Y given X = x using the joint p.d.f. of a bivariate normal.
(d) What is E(Y − â − b̂X)² − E(Y − E(Y | X))² in this case?
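
For the bivariate normal parts (c) and (d), a small simulation can be used to check the answers numerically. The sketch below (ρ = 0.6 is an arbitrary choice) uses the fact that with zero means and unit variances the best linear predictor reduces to ρX, and estimates the conditional mean of Y by binning on X to show that it is linear in X as well; since the two predictors coincide, the difference in part (d) comes out to 0.

```python
import numpy as np

rng = np.random.default_rng(1)

rho = 0.6  # arbitrary correlation for this sketch
n = 500_000
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n).T

# With zero means and unit variances, b_hat = Cov(X, Y) = rho and a_hat = 0,
# so the best linear predictor is rho * X; its MSE should be about 1 - rho^2.
print(np.mean((y - rho * x) ** 2), 1 - rho**2)

# Binned conditional means of Y given X: for the bivariate normal they track
# rho * x, i.e. E(Y | X) is itself linear, so the difference in (d) is 0.
bins = np.linspace(-2.0, 2.0, 9)
centers = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(x, bins)
cond_means = np.array([y[idx == k].mean() for k in range(1, len(bins))])
print(np.round(rho * centers, 3))
print(np.round(cond_means, 3))
```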
