Question: This problem considers the estimation of parameters in linear models. Consider the linear model

$$\begin{pmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{pmatrix}
= \begin{pmatrix} X_{11} & X_{12} & \cdots & X_{1p} \\ X_{21} & X_{22} & \cdots & X_{2p} \\ \vdots & \vdots & & \vdots \\ X_{n1} & X_{n2} & \cdots & X_{np} \end{pmatrix}
\begin{pmatrix} \beta_1 \\ \beta_2 \\ \vdots \\ \beta_p \end{pmatrix}
+ \begin{pmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{pmatrix},$$

which we may write more briefly as $Y = X\beta + \varepsilon$. The elements of the $X$ matrix are not random, $\beta$ is a vector of unknown parameters, and $\varepsilon$ is a vector of random errors.

(a) Suppose the elements of $\varepsilon$ have mean zero, are uncorrelated, and $\operatorname{Var}(\varepsilon_i) = \sigma^2$ for all $i = 1, \ldots, n$. Then the mean vector and covariance matrix for $Y$ are $E(Y) = ?$ and $\operatorname{Var}(Y) = ?$

(b) The ordinary least squares estimator for $\beta$ is $b = (X'X)^{-1}X'Y$. The mean vector and covariance matrix for $b$ are $E(b) = ?$ and $\operatorname{Var}(b) = ?$

(c) Suppose the elements of $\varepsilon$ are correlated and their variances are not all the same, i.e., $\operatorname{Var}(\varepsilon) = \Sigma \neq \sigma^2 I$, but $E(\varepsilon) = 0$. Then the mean vector and covariance matrix for the ordinary least squares estimator $b$ are $E(b) = ?$ and $\operatorname{Var}(b) = ?$

(d) A generalized least squares estimator for $\beta$ is $b' = (X'\Sigma^{-1}X)^{-1}X'\Sigma^{-1}Y$. Using the assumptions in part (c), the mean vector and covariance matrix for $b'$ are $E(b') = ?$ and $\operatorname{Var}(b') = ?$

(e) Both $b$ and $b'$ are unbiased estimators. Use the extended Cauchy-Schwarz inequality to show that $\operatorname{Var}(a'b) \geq \operatorname{Var}(a'b')$ for any vector of constants $a$.

Extended Cauchy-Schwarz inequality: Let $\mathbf{b}_{p \times 1}$ and $\mathbf{d}_{p \times 1}$ be any two vectors and let $\mathbf{B}_{p \times p}$ be a positive definite matrix. Then $(\mathbf{b}'\mathbf{d})^2 \leq (\mathbf{b}'\mathbf{B}\mathbf{b})(\mathbf{d}'\mathbf{B}^{-1}\mathbf{d})$, with equality if and only if $\mathbf{b} = c\mathbf{B}^{-1}\mathbf{d}$ (or $\mathbf{d} = c\mathbf{B}\mathbf{b}$) for some constant $c$.
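As a quick numerical illustration of the inequality in part (e) (not part of the original problem), the sketch below picks an arbitrary design matrix X, a positive definite error covariance Sigma, and a contrast vector a, then compares Var(a'b) and Var(a'b') using the covariance formulas requested in parts (c) and (d). The dimensions, random seed, and matrices are assumptions chosen only for this example.

import numpy as np

rng = np.random.default_rng(0)

n, p = 8, 3
X = rng.normal(size=(n, p))            # arbitrary non-random design matrix (illustrative choice)
A = rng.normal(size=(n, n))
Sigma = A @ A.T + n * np.eye(n)        # an arbitrary positive definite error covariance

Sigma_inv = np.linalg.inv(Sigma)
XtX_inv = np.linalg.inv(X.T @ X)

# Covariance of the OLS estimator b = (X'X)^{-1} X'Y when Var(eps) = Sigma (part (c))
var_b_ols = XtX_inv @ X.T @ Sigma @ X @ XtX_inv
# Covariance of the GLS estimator b' = (X'Sigma^{-1}X)^{-1} X'Sigma^{-1}Y (part (d))
var_b_gls = np.linalg.inv(X.T @ Sigma_inv @ X)

a = rng.normal(size=p)                 # an arbitrary vector of constants
print("Var(a'b)  (OLS):", a @ var_b_ols @ a)
print("Var(a'b') (GLS):", a @ var_b_gls @ a)
# The OLS quadratic form should never be smaller than the GLS one (small numerical tolerance)
assert a @ var_b_ols @ a >= a @ var_b_gls @ a - 1e-10

Running this prints a larger (or equal) value for the OLS quadratic form, which is exactly the relation the extended Cauchy-Schwarz argument establishes for every X, Sigma, and a.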

Step by Step Solution

(a) The mean vector and covariance matrix for $Y$ are given by $E(Y) = X\beta$ and $\operatorname{Var}(Y) = \operatorname{Var}(\varepsilon) = \sigma^2 I$. (b) The ordinary least squares estimator ...
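For orientation while the remaining steps are unavailable, here is a sketch of the standard algebra behind each part, assuming only linearity of expectation and the rule $\operatorname{Var}(AZ) = A\operatorname{Var}(Z)A'$ for a constant matrix $A$; it is an illustrative derivation, not the expert's unlocked solution.

\begin{align*}
\text{(a)}\quad & E(Y) = X\beta + E(\varepsilon) = X\beta, \qquad \operatorname{Var}(Y) = \operatorname{Var}(\varepsilon) = \sigma^2 I,\\
\text{(b)}\quad & E(b) = (X'X)^{-1}X'E(Y) = \beta, \qquad \operatorname{Var}(b) = (X'X)^{-1}X'(\sigma^2 I)X(X'X)^{-1} = \sigma^2 (X'X)^{-1},\\
\text{(c)}\quad & E(b) = \beta \ \text{(as in (b))}, \qquad \operatorname{Var}(b) = (X'X)^{-1}X'\Sigma X(X'X)^{-1},\\
\text{(d)}\quad & E(b') = (X'\Sigma^{-1}X)^{-1}X'\Sigma^{-1}X\beta = \beta, \qquad \operatorname{Var}(b') = (X'\Sigma^{-1}X)^{-1}X'\Sigma^{-1}\Sigma\,\Sigma^{-1}X(X'\Sigma^{-1}X)^{-1} = (X'\Sigma^{-1}X)^{-1}.
\end{align*}

For (e), take $\mathbf{b} = \Sigma^{1/2}X(X'X)^{-1}a$, $\mathbf{d} = \Sigma^{-1/2}X(X'\Sigma^{-1}X)^{-1}a$, and $\mathbf{B} = I$ in the extended Cauchy-Schwarz inequality. Then $\mathbf{b}'\mathbf{d} = a'(X'\Sigma^{-1}X)^{-1}a = \operatorname{Var}(a'b')$, $\mathbf{b}'\mathbf{b} = a'(X'X)^{-1}X'\Sigma X(X'X)^{-1}a = \operatorname{Var}(a'b)$, and $\mathbf{d}'\mathbf{d} = a'(X'\Sigma^{-1}X)^{-1}a = \operatorname{Var}(a'b')$, so the inequality reads $\operatorname{Var}(a'b')^2 \leq \operatorname{Var}(a'b)\operatorname{Var}(a'b')$. Dividing by $\operatorname{Var}(a'b')$, which is positive for $a \neq 0$ (the case $a = 0$ is trivial), gives $\operatorname{Var}(a'b') \leq \operatorname{Var}(a'b)$.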
