Question:
A sample \(\left(X_{i}, Y_{i}\right), i=1, \ldots, n\), is collected from a population with \(E(Y \mid X)=\beta_{0}+\beta_{1} X\) and used to compute the least squares estimators \(\hat{\beta}_{0}\) and \(\hat{\beta}_{1}\). You are interested in predicting the value of \(Y^{\text{oos}}\) for a randomly chosen out-of-sample observation with \(X^{\text{oos}}=x^{\text{oos}}\).
a. Suppose the out-of-sample observation is from the same population as the in-sample observations \(\left(X_{i}, Y_{i}\right)\) and is chosen independently of the in-sample observations.
i. Explain why \(E\left(Y^{\text{oos}} \mid X^{\text{oos}}=x^{\text{oos}}\right)=\beta_{0}+\beta_{1} x^{\text{oos}}\).
ii. Let \(\hat{Y}^{\text{oos}}=\hat{\beta}_{0}+\hat{\beta}_{1} x^{\text{oos}}\). Show that
\[ E\left(\hat{Y}^{\text{oos}} \mid X^{\text{oos}}=x^{\text{oos}}\right)=\beta_{0}+\beta_{1} x^{\text{oos}} \]
iii. Let \(u^{\text{oos}}=Y^{\text{oos}}-\left(\beta_{0}+\beta_{1} X^{\text{oos}}\right)\) and \(\hat{u}^{\text{oos}}=Y^{\text{oos}}-\left(\hat{\beta}_{0}+\hat{\beta}_{1} X^{\text{oos}}\right)\). Show that \(\operatorname{var}\left(\hat{u}^{\text{oos}}\right)=\operatorname{var}\left(u^{\text{oos}}\right)+\operatorname{var}\left(\hat{\beta}_{0}+\hat{\beta}_{1} X^{\text{oos}}\right)\).
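The variance decomposition in part a.iii can be checked numerically. Below is a minimal Monte Carlo sketch, not part of the exercise: it conditions on a fixed \(x^{\text{oos}}\) and re-draws the in-sample data many times, so the variance of \(\hat{\beta}_{0}+\hat{\beta}_{1} x^{\text{oos}}\) reflects estimation uncertainty alone. All parameter values (beta0 = 1, beta1 = 2, n = 50, x_oos = 1.5) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, sigma_u = 1.0, 2.0, 1.0   # hypothetical population values
n, x_oos, R = 50, 1.5, 20000            # sample size, oos point, replications

u_hat_oos = np.empty(R)   # prediction errors  u_hat = Y_oos - (b0_hat + b1_hat * x_oos)
u_oos = np.empty(R)       # population errors  u    = Y_oos - (beta0 + beta1 * x_oos)
fitted_oos = np.empty(R)  # fitted values      b0_hat + b1_hat * x_oos

for r in range(R):
    # draw an in-sample data set and fit OLS by hand
    X = rng.normal(0.0, 1.0, n)
    Y = beta0 + beta1 * X + rng.normal(0.0, sigma_u, n)
    b1_hat = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
    b0_hat = Y.mean() - b1_hat * X.mean()

    # draw an independent out-of-sample observation at X = x_oos
    u = rng.normal(0.0, sigma_u)
    y_oos = beta0 + beta1 * x_oos + u

    fitted_oos[r] = b0_hat + b1_hat * x_oos
    u_hat_oos[r] = y_oos - fitted_oos[r]
    u_oos[r] = u

lhs = u_hat_oos.var()
rhs = u_oos.var() + fitted_oos.var()
print(lhs, rhs)  # the two sides agree up to Monte Carlo error
```

The key design point, matching the setup of part a, is that the out-of-sample error `u` is drawn independently of the sample used to compute `b0_hat` and `b1_hat`, which is what makes the cross term in the variance vanish.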
b. Suppose the out-of-sample observation is drawn from a different population than the in-sample population and that the joint distributions of \(X\) and \(Y\) differ for the two populations. Continue to let \(\beta_{0}\) and \(\beta_{1}\) be the coefficients of the population regression line for the in-sample population.
i. Does \(E\left(Y^{\text{oos}} \mid X^{\text{oos}}=x^{\text{oos}}\right)=\beta_{0}+\beta_{1} x^{\text{oos}}\)?
ii. Does \(E\left(\hat{Y}^{\text{oos}} \mid X^{\text{oos}}=x^{\text{oos}}\right)=\beta_{0}+\beta_{1} x^{\text{oos}}\)?
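To build intuition for part b, here is a small simulation sketch with invented coefficients: the in-sample population line is beta0 = 1, beta1 = 2, while the out-of-sample population has a different line, g0 = 0, g1 = -1. Since \(\hat{\beta}_{0}\) and \(\hat{\beta}_{1}\) are computed from the in-sample data alone, their distribution is unaffected by where the out-of-sample observation comes from.

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1 = 1.0, 2.0      # in-sample population regression line (hypothetical)
g0, g1 = 0.0, -1.0           # out-of-sample population line (hypothetical, different)
n, x_oos, R = 50, 1.5, 20000

y_hat = np.empty(R)
for r in range(R):
    # in-sample data come from the (beta0, beta1) population
    X = rng.normal(0.0, 1.0, n)
    Y = beta0 + beta1 * X + rng.normal(0.0, 1.0, n)
    b1_hat = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
    b0_hat = Y.mean() - b1_hat * X.mean()
    y_hat[r] = b0_hat + b1_hat * x_oos   # prediction at the oos point

# the prediction is centered at the IN-SAMPLE line beta0 + beta1 * x_oos,
# while the true out-of-sample conditional mean is g0 + g1 * x_oos
print(y_hat.mean())
print(beta0 + beta1 * x_oos)
print(g0 + g1 * x_oos)
```

The simulation illustrates the distinction the question is after: the conditional mean of \(Y^{\text{oos}}\) is governed by the out-of-sample population, but the conditional mean of \(\hat{Y}^{\text{oos}}\) is governed by the in-sample population from which the estimators were computed.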