
Question:

Let \(X\) and \(Y\) be random variables (not necessarily independent) and suppose we wish to estimate the expected difference \(\mu=\mathbb{E}[X-Y]=\mathbb{E} X-\mathbb{E} Y\).

(a) Show that if \(X\) and \(Y\) are positively correlated, the variance of \(X-Y\) is smaller than if \(X\) and \(Y\) are independent.
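Part (a) rests on the identity \(\operatorname{Var}(X-Y)=\operatorname{Var}X+\operatorname{Var}Y-2\operatorname{Cov}(X,Y)\), which can be checked numerically. The sketch below (an illustrative simulation, not the book's solution) uses correlated standard normals with \(\operatorname{Corr}(X,Y)=\rho=0.8\), for which \(\operatorname{Var}(X-Y)=2(1-\rho)=0.4\), versus variance \(2\) in the independent case.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
rho = 0.8  # positive correlation (illustrative choice)

# Correlated standard normals via a Cholesky-style construction
Z1, Z2 = rng.standard_normal((2, n))
X = Z1
Y_corr = rho * Z1 + np.sqrt(1 - rho**2) * Z2   # Corr(X, Y_corr) = rho
Y_ind = rng.standard_normal(n)                 # independent copy

var_corr = np.var(X - Y_corr)  # approx 2(1 - rho) = 0.4
var_ind = np.var(X - Y_ind)    # approx 2
print(var_corr, var_ind)
```

With \(n=10^6\) samples the two estimates separate clearly, matching the identity term by term.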

(b) Suppose now that \(X\) and \(Y\) have cdfs \(F\) and \(G\), respectively, and are simulated via the inverse-transform method: \(X=F^{-1}(U), Y=G^{-1}(V)\), with \(U, V \sim \mathscr{U}(0,1)\), not necessarily independent. Intuitively, one might expect that if \(U\) and \(V\) are positively correlated, the variance of \(X-Y\) would be smaller than if \(U\) and \(V\) are independent. Show that this is not always the case by providing a counter-example.
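One possible counterexample (a sketch of my own construction, not necessarily the book's intended one): take \(F=G\) to be the Bernoulli\((1/2)\) cdf, so \(F^{-1}(u)=\mathbb{1}\{u>1/2\}\), and let \(V=U\) with probability \(p\) and \(V=U+\tfrac12 \pmod 1\) otherwise. Then \(\operatorname{Cov}(U,V)=(3p-1)/24\) and \(\operatorname{Cov}(X,Y)=(2p-1)/4\), so for \(1/3<p<1/2\) the uniforms are positively correlated yet \(\operatorname{Var}(X-Y)=\tfrac12-2\operatorname{Cov}(X,Y)\) exceeds the independent-case value \(\tfrac12\). The simulation below checks this for \(p=0.4\):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6
p = 0.4  # mixing probability, chosen in (1/3, 1/2)

U = rng.uniform(size=n)
keep = rng.uniform(size=n) < p
V = np.where(keep, U, (U + 0.5) % 1.0)  # V is still Uniform(0,1)

# Quantile function of Bernoulli(1/2): F^{-1}(u) = 1{u > 1/2}
X = (U > 0.5).astype(float)
Y = (V > 0.5).astype(float)

cov_uv = np.cov(U, V)[0, 1]  # approx (3p-1)/24 = 1/120 > 0
var_crn = np.var(X - Y)      # approx 1/2 - (2p-1)/2 = 0.6
var_ind = 0.5                # Var X + Var Y under independence
print(cov_uv, var_crn)
```

Note the counterexample uses a discontinuous \(F\), which is exactly what part (c) rules out by assuming \(F\) and \(G\) continuous.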

(c) Continuing (b), assume now that \(F\) and \(G\) are continuous. Show that the variance of \(X-Y\) when taking common random numbers \(U=V\) is no larger than when \(U\) and \(V\) are independent. [Hint: Use the following lemma of Hoeffding [41]: If \((X, Y)\) has joint cdf \(H\) with marginal cdfs of \(X\) and \(Y\) being \(F\) and \(G\), respectively, then \[ \operatorname{Cov}(X, Y)=\int_{-\infty}^{\infty} \int_{-\infty}^{\infty}(H(x, y)-F(x) G(y)) \,\mathrm{d} x \,\mathrm{d} y, \]
provided \(\operatorname{Cov}(X, Y)\) exists.]
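The variance reduction in (c) can be illustrated numerically. The sketch below (an illustrative choice of continuous cdfs, not the book's solution) takes \(X\sim\mathsf{Exp}(1)\) and \(Y\sim\mathsf{Exp}(2)\) via the inverse transform; under common random numbers \(X-Y=-\tfrac12\log(1-U)\) has variance \(1/4\), versus \(1+1/4=5/4\) for independent draws.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10**6

U = rng.uniform(size=n)
V = rng.uniform(size=n)  # independent of U

# Continuous cdfs via inverse transform (illustrative choice):
# X ~ Exp(1) with F^{-1}(u) = -log(1-u), Y ~ Exp(2) with G^{-1}(u) = -log(1-u)/2
def Finv(u):
    return -np.log(1 - u)

def Ginv(u):
    return -np.log(1 - u) / 2.0

var_ind = np.var(Finv(U) - Ginv(V))  # independent draws: approx 5/4
var_crn = np.var(Finv(U) - Ginv(U))  # common random numbers U = V: approx 1/4
print(var_ind, var_crn)
```

Because both quantile functions are nondecreasing in \(u\), the comonotone coupling \(U=V\) maximizes \(\operatorname{Cov}(X,Y)\) (the content of Hoeffding's lemma applied to \(H(x,y)=\min(F(x),G(y))\)), which is what the exercise asks to prove in general.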



Related Book:

Data Science and Machine Learning: Mathematical and Statistical Methods

ISBN: 9781118710852

1st Edition

Authors: Dirk P. Kroese, Thomas Taimre, Radislav Vaisman, Zdravko Botev
