Question:

Let \(Y_{1}, \ldots, Y_{n}\) be independent \(N\left(\beta, \sigma^{2}\right)\) random variables with parameter \((\beta, \sigma)\) taking values in \(\mathbb{R} \times \mathbb{R}^{+}\), and let the components of \(T\) be conditionally independent Bernoulli variables \(T_{i} \sim \operatorname{Ber}\left(r_{i}\right)\) as described in the preceding exercise. Using results from the preceding exercise, deduce, as an approximation for large \(n\), that

\[
\begin{aligned}
E\left(Y_{i} \mid T\right) & =\beta+\sigma \gamma_{n} x_{i} \\
\operatorname{cov}\left(Y_{i}, Y_{j} \mid T\right) & =\sigma^{2}\left(1-\gamma_{n}^{2}\right) \delta_{i j}+\sigma^{2} \gamma_{n}^{2} / n
\end{aligned}
\]

where \(x_{i}=2\left(T_{i}-\bar{T}_{n}\right)\) are the components of the normalized treatment vector, and \(\gamma_{n}\) is a sequence in \((0,1)\) whose limit is

\[
\lim _{n \rightarrow \infty} \gamma_{n}=2 \int_{-\infty}^{\infty} x \Phi(x) \phi(x)\, d x=\frac{1}{\sqrt{\pi}} \approx 0.5642
\]
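The value of the limit can be checked directly: integration by parts (using \(x\phi(x) = -\phi'(x)\)) reduces the integral to \(\int \phi(x)^2\,dx = 1/(2\sqrt{\pi})\), so the limit is \(1/\sqrt{\pi}\). A minimal numerical sketch confirming this, using only the standard library (trapezoidal quadrature over a wide truncation interval, an assumption that suffices because the integrand decays like a Gaussian):

```python
import math

def phi(x):
    """Standard normal density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def Phi(x):
    """Standard normal distribution function, via erf."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Trapezoidal approximation of 2 * integral of x * Phi(x) * phi(x)
# over [-8, 8]; the tails beyond +/-8 are negligible.
a, b, n = -8.0, 8.0, 200_000
h = (b - a) / n
total = 0.5 * (a * Phi(a) * phi(a) + b * Phi(b) * phi(b))
for i in range(1, n):
    x = a + i * h
    total += x * Phi(x) * phi(x)
integral = 2 * h * total

print(round(integral, 4))                 # numerical value of the limit
print(round(1 / math.sqrt(math.pi), 4))   # closed form 1/sqrt(pi)
```

Both printed values agree to four decimal places with the constant \(0.5642\) quoted in the exercise.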
