
Question:

Consider the following unbalanced one-way analysis of variance model

\[y_{i t}=\mu_{i}+u_{i t} \quad i=1, \ldots, N \quad t=1,2, \ldots, T_{i}\]

where for simplicity's sake no explanatory variables are included. \(y_{i t}\) could be the output of firm \(i\) at time period \(t\) and \(\mu_{i}\) could be the managerial ability of firm \(i\), whereas \(u_{i t}\) is a remainder disturbance term. Assume that \(\mu_{i} \sim \operatorname{IIN}\left(0, \sigma_{\mu}^{2}\right)\) and \(u_{i t} \sim \operatorname{IIN}\left(0, \sigma_{u}^{2}\right)\) independent of each other. Let \(T\) be the maximum overlapping period over which a complete panel could be established (\(T \leqslant T_{i}\) for all \(i\)). In this case, the corresponding vector of balanced observations on \(y_{i t}\) is denoted by \(y_{b}\) and is of dimension \(N T\). One could estimate the variance components using this complete panel as follows:

\[\widehat{\sigma}_{u}^{2}=y_{b}^{\prime}\left(I_{N} \otimes E_{T}\right) y_{b} / N(T-1)\]

and

\[\widehat{\sigma}_{\mu}^{2}=\left[y_{b}^{\prime}\left(I_{N} \otimes \bar{J}_{T}\right) y_{b} / N T\right]-\left(\widehat{\sigma}_{u}^{2} / T\right)\]

where \(E_{T}=I_{T}-\bar{J}_{T}, \bar{J}_{T}=J_{T} / T\) and \(J_{T}\) is a matrix of ones of dimension \(T\). \(\widehat{\sigma}_{u}^{2}\) and \(\widehat{\sigma}_{\mu}^{2}\) are the best quadratic unbiased estimators (BQUE) of the variance components based on the complete panel. Alternatively, one could estimate the variance components from the entire unbalanced panel as follows:
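As a reasoning aid not spelled out in the original text, note that under the stated assumptions \(\operatorname{var}\left(y_{b}\right)=\Omega_{b}=\sigma_{\mu}^{2}\left(I_{N} \otimes J_{T}\right)+\sigma_{u}^{2} I_{N T}\), so the expectations of the two quadratic forms above follow from \(E\left(y_{b}^{\prime} A y_{b}\right)=\operatorname{tr}\left(A \Omega_{b}\right)\):

\[E\left[y_{b}^{\prime}\left(I_{N} \otimes E_{T}\right) y_{b}\right]=\sigma_{u}^{2} N(T-1) \quad \text { and } \quad E\left[y_{b}^{\prime}\left(I_{N} \otimes \bar{J}_{T}\right) y_{b}\right]=N\left(T \sigma_{\mu}^{2}+\sigma_{u}^{2}\right)\]

using \(E_{T} J_{T}=0\), \(\operatorname{tr}\left(E_{T}\right)=T-1\), \(\bar{J}_{T} J_{T}=J_{T}\) and \(\operatorname{tr}\left(\bar{J}_{T}\right)=1\); this is what delivers the unbiasedness of \(\widehat{\sigma}_{u}^{2}\) and \(\widehat{\sigma}_{\mu}^{2}\).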

\[\tilde{\sigma}_{u}^{2}=y^{\prime} \operatorname{diag}\left(E_{T_{i}}\right) y /(n-N)\]

where \(n=\sum_{i=1}^{N} T_{i}\) and \(E_{T_{i}}=I_{T_{i}}-\bar{J}_{T_{i}}\). Also, \(\sigma_{i}^{2}=\left(T_{i} \sigma_{\mu}^{2}+\sigma_{u}^{2}\right)\) can be estimated by \(\widetilde{\sigma}_{i}^{2}=y_{i}^{\prime} \bar{J}_{T_{i}} y_{i}\), where \(y_{i}\) denotes the vector of \(T_{i}\) observations on the \(i\) th individual. Therefore, there are \(N\) estimators of \(\sigma_{\mu}^{2}\) obtained from \(\left(\widetilde{\sigma}_{i}^{2}-\widetilde{\sigma}_{u}^{2}\right) / T_{i}\) for \(i=1, \ldots, N\). One simple way of combining them is to take the average

\[\tilde{\sigma}_{\mu}^{2}=\sum_{i=1}^{N}\left[\left(\tilde{\sigma}_{i}^{2}-\tilde{\sigma}_{u}^{2}\right) / T_{i}\right] / N=\left\{y^{\prime} \operatorname{diag}\left[\bar{J}_{T_{i}} / T_{i}\right] y-\sum_{i=1}^{N} \tilde{\sigma}_{u}^{2} / T_{i}\right\} / N\]
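To see why each term in this average targets \(\sigma_{\mu}^{2}\), a step the text leaves implicit, note that with no regressors \(\bar{y}_{i .}=\mu_{i}+\bar{u}_{i .}\) and \(y_{i}^{\prime} \bar{J}_{T_{i}} y_{i}=T_{i} \bar{y}_{i .}^{2}\), so

\[E\left(\tilde{\sigma}_{i}^{2}\right)=E\left(T_{i} \bar{y}_{i .}^{2}\right)=T_{i}\left(\sigma_{\mu}^{2}+\sigma_{u}^{2} / T_{i}\right)=T_{i} \sigma_{\mu}^{2}+\sigma_{u}^{2}=\sigma_{i}^{2}\]

while \(E\left[y^{\prime} \operatorname{diag}\left(E_{T_{i}}\right) y\right]=\sigma_{u}^{2} \sum_{i=1}^{N}\left(T_{i}-1\right)=\sigma_{u}^{2}(n-N)\), so each \(\left(\tilde{\sigma}_{i}^{2}-\tilde{\sigma}_{u}^{2}\right) / T_{i}\) has expectation \(\sigma_{\mu}^{2}\).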

(a) Show that \(\widetilde{\sigma}_{u}^{2}\) and \(\widetilde{\sigma}_{\mu}^{2}\) are unbiased estimators of \(\sigma_{u}^{2}\) and \(\sigma_{\mu}^{2}\), respectively.

(b) Show that \(\operatorname{var}\left(\widetilde{\sigma}_{u}^{2}\right) \leqslant \operatorname{var}\left(\widehat{\sigma}_{u}^{2}\right)\) and \(\operatorname{var}\left(\widetilde{\sigma}_{\mu}^{2}\right) \leqslant \operatorname{var}\left(\widehat{\sigma}_{\mu}^{2}\right)\). 
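For concreteness, the quadratic forms above reduce to ordinary within-group and between-group sums of squares. The following is a minimal numerical sketch, not part of the original exercise, of how both sets of estimators could be computed; the group sizes, variance components and seed are illustrative assumptions only.

```python
import numpy as np

# Minimal sketch (illustrative values only): simulate the one-way error
# component model and compute both sets of variance-component estimators.
rng = np.random.default_rng(0)
T_i = np.array([5, 7, 6, 9, 5])        # assumed unbalanced panel lengths T_i
N, T = len(T_i), T_i.min()             # T = longest common (balanced) span
sigma_mu, sigma_u = 1.0, 0.5           # assumed true variance components

mu = rng.normal(0.0, sigma_mu, size=N)                        # mu_i ~ IIN(0, sigma_mu^2)
y = [mu[i] + rng.normal(0.0, sigma_u, size=T_i[i]) for i in range(N)]

# Balanced-panel (BQUE) estimators from the first T observations of each i:
# y_b'(I_N x E_T)y_b is the within sum of squares; y_b'(I_N x Jbar_T)y_b = T * sum_i ybar_i^2.
yb = np.array([yi[:T] for yi in y])
sig_u_hat = ((yb - yb.mean(axis=1, keepdims=True)) ** 2).sum() / (N * (T - 1))
sig_mu_hat = (T * yb.mean(axis=1) ** 2).sum() / (N * T) - sig_u_hat / T

# Estimators from the full unbalanced panel.
n = T_i.sum()
sig_u_til = sum(((yi - yi.mean()) ** 2).sum() for yi in y) / (n - N)
sig_i_til = np.array([Ti * yi.mean() ** 2 for Ti, yi in zip(T_i, y)])  # y_i' Jbar_{T_i} y_i
sig_mu_til = ((sig_i_til - sig_u_til) / T_i).mean()

print(sig_u_hat, sig_mu_hat, sig_u_til, sig_mu_til)
```

Averaging such estimates over many simulated panels would illustrate the claims in (a) and (b): both pairs of estimators center on the true components, with the full-panel versions the less variable.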
