Question: Let the size \(n\) random sample \(\mathbf{Y}=\left[Y_{1}, Y_{2}, \ldots, Y_{n}\right]^{\prime}\) be such that \(\mathbf{Y}=\mathbf{x} \beta+\boldsymbol{\varepsilon}\), where \(\mathbf{x}\) is an \(n \times 1\) non-zero vector of "explanatory variable values", \(\beta\) is an unknown parameter value, and \(\boldsymbol{\varepsilon}\) is an \(n \times 1\) random vector for which \(\mathrm{E}(\boldsymbol{\varepsilon})=\mathbf{0}\) and \(\operatorname{cov}(\boldsymbol{\varepsilon})=\sigma^{2} \mathbf{I}\).
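A useful identity for the parts below (not stated in the question itself, but standard given the model above): for any fixed \(n \times 1\) vector \(\mathbf{c}\), the linear estimator \(\mathbf{c}^{\prime}\mathbf{Y}\) has

\[
\mathrm{E}\left[\mathbf{c}^{\prime}\mathbf{Y}\right]=\mathbf{c}^{\prime}\mathbf{x}\,\beta,
\qquad
\operatorname{var}\left(\mathbf{c}^{\prime}\mathbf{Y}\right)=\sigma^{2}\,\mathbf{c}^{\prime}\mathbf{c}.
\]

The estimator in a) corresponds to \(\mathbf{c}=\mathbf{x}/(\mathbf{x}^{\prime}\mathbf{x})\) and the one in d) to \(\mathbf{c}=\mathbf{x}/(\mathbf{x}^{\prime}\mathbf{x}+k)\).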
a. Is \(t(\mathbf{Y})=\sum_{i=1}^{n} x_{i} Y_{i} / \sum_{i=1}^{n} x_{i}^{2}\) the BLUE of \(\beta\)? Explain.
b. Define the mean and the variance of the estimator in a).
c. Under what conditions will the estimator in a) be a consistent estimator of \(\beta\)?
d. Consider an alternative estimator \(t^{*}(\mathbf{Y})=\left(\mathbf{x}^{\prime} \mathbf{x}+k\right)^{-1} \mathbf{x}^{\prime} \mathbf{Y}\) for \(\beta\) (this is an example of a so-called "ridge regression" estimator). Define the mean and variance of this estimator. Is this a linear estimator? Is it unbiased? Is it asymptotically unbiased?
e. Under what conditions will the estimator in d) be a consistent estimator of \(\beta\)?
f. If you wanted to use the estimator that has the smaller expected squared distance from \(\beta\), would you prefer one estimator over the other? Explain.
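The following is a minimal Monte Carlo sketch (not part of the original question) that illustrates the comparison parts a., b., d., and f. ask about. All numerical values (n, x, beta, sigma, k) are assumptions chosen for illustration; the script simply contrasts the sampling behaviour of \(t(\mathbf{Y})\) and \(t^{*}(\mathbf{Y})\) and is not a substitute for the analytic derivations the question requests.

```python
# Minimal Monte Carlo sketch contrasting the two estimators in parts a. and d.
# above.  Every numeric value here (n, x, beta, sigma, k) is an illustrative
# assumption, not a value given in the question.
import numpy as np

rng = np.random.default_rng(0)

n, beta, sigma, k = 50, 2.0, 1.5, 5.0        # assumed sample size and parameters
x = rng.uniform(0.5, 3.0, size=n)            # fixed non-zero explanatory values
xx = x @ x                                   # x'x

reps = 20_000
t_plain = np.empty(reps)                     # t(Y)  = x'Y / x'x
t_ridge = np.empty(reps)                     # t*(Y) = x'Y / (x'x + k)
for r in range(reps):
    eps = rng.normal(0.0, sigma, size=n)     # E(eps) = 0, cov(eps) = sigma^2 I
    y = x * beta + eps
    t_plain[r] = (x @ y) / xx
    t_ridge[r] = (x @ y) / (xx + k)

for name, t in (("t(Y)", t_plain), ("t*(Y)", t_ridge)):
    print(f"{name:6s} mean = {t.mean():.4f}  var = {t.var():.6f}  "
          f"MSE = {np.mean((t - beta) ** 2):.6f}")
```

With these assumed values, \(t(\mathbf{Y})\) should center on \(\beta\) with empirical variance near \(\sigma^{2}/\mathbf{x}^{\prime}\mathbf{x}\), while \(t^{*}(\mathbf{Y})\) is shrunk toward zero but has a smaller variance; whether its mean squared error ends up below that of \(t(\mathbf{Y})\) depends on \(\beta\), \(\sigma^{2}\), and \(k\), which is exactly the trade-off part f. asks about.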