Question: Quality illustration and workings only.

Exercise 2.15 Consider the intercept-only model $Y = \alpha + e$ with $\alpha$ the best linear predictor. Show that $\alpha = E[Y]$.

Exercise 2.16 Let $X$ and $Y$ have the joint density $f(x, y) = \tfrac{3}{2}(x^2 + y^2)$ on $0 \le x \le 1$, $0 \le y \le 1$. Compute the coefficients of the best linear predictor $Y = \alpha + \beta X + e$. Compute the conditional expectation $m(x) = E[Y \mid X = x]$. Are the best linear predictor and conditional expectation different?

Exercise 2.17 Let $X$ be a random variable with $\mu = E[X]$ and $\sigma^2 = \operatorname{var}[X]$. Define
$$g(x, \mu, \sigma^2) = \begin{pmatrix} x - \mu \\ (x - \mu)^2 - \sigma^2 \end{pmatrix}.$$
Show that $E[g(X, m, s)] = 0$ if and only if $m = \mu$ and $s = \sigma^2$.

Exercise 2.18 Suppose that $X = (1, X_2, X_3)$ where $X_3 = \alpha_1 + \alpha_2 X_2$ is a linear function of $X_2$.
(a) Show that $Q_{XX} = E[XX']$ is not invertible.
(b) Use a linear transformation of $X$ to find an expression for the best linear predictor of $Y$ given $X$. (Be explicit, do not just use the generalized inverse formula.)

Exercise 2.19 Show (2.47)-(2.48), namely that for $d(\beta) = E\big[(m(X) - X'\beta)^2\big]$,
$$\beta = \operatorname*{argmin}_{b \in \mathbb{R}^k} d(b) = \big(E[XX']\big)^{-1} E[X m(X)] = \big(E[XX']\big)^{-1} E[XY].$$
Hint: To show $E[X m(X)] = E[XY]$, use the law of iterated expectations.

Exercise 2.20 Verify that (2.57) holds with $m(X)$ defined in (2.6) when $(Y, X)$ have a joint density $f(y, x)$.
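As a minimal sketch of the kind of argument Exercise 2.15 asks for, assuming the best linear predictor in the intercept-only model is defined as the minimizer of mean squared error, the first-order condition pins down the intercept:

```latex
% Sketch: the intercept-only best linear predictor minimizes E[(Y - a)^2].
\[
\begin{aligned}
\alpha &= \operatorname*{argmin}_{a \in \mathbb{R}} \; E\big[(Y - a)^2\big], \\
0 &= \frac{d}{da}\, E\big[(Y - a)^2\big] \Big|_{a = \alpha}
   = -2\, E[Y - \alpha]
  \quad\Longrightarrow\quad \alpha = E[Y].
\end{aligned}
\]
```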

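For Exercise 2.16, hand-derived coefficients can be cross-checked numerically. The sketch below is an illustrative check, not the expert's solution: it assumes the joint density is $f(x, y) = \tfrac{3}{2}(x^2 + y^2)$ on the unit square (the constant $\tfrac{3}{2}$ is the value that makes the density integrate to one) and estimates the best-linear-predictor coefficients by midpoint-rule integration.

```python
import numpy as np

# Assumed joint density on [0, 1] x [0, 1]; the 3/2 constant is the value
# that makes the density integrate to one (an assumption, see above).
def f(x, y):
    return 1.5 * (x ** 2 + y ** 2)

# Midpoint-rule grid for two-dimensional numerical integration.
n = 1000
u = (np.arange(n) + 0.5) / n
X, Y = np.meshgrid(u, u, indexing="ij")
cell = (1.0 / n) ** 2            # area of each grid cell
p = f(X, Y) * cell               # probability mass assigned to each cell
print("total mass:", p.sum())    # sanity check: should be close to 1

# Moments needed for the best linear predictor Y = alpha + beta * X + e.
EX, EY = (X * p).sum(), (Y * p).sum()
EXX, EXY = (X * X * p).sum(), (X * Y * p).sum()
beta = (EXY - EX * EY) / (EXX - EX ** 2)
alpha = EY - beta * EX
print("alpha:", alpha, "beta:", beta)
```

The same grid can also be used to tabulate $m(x) = E[Y \mid X = x]$ column by column and confirm that it is nonlinear in $x$, which is what the last part of the exercise is getting at.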