# Question

In the text, we noted that

$$E\left[\sum_{i=1}^{\infty} X_i\right] = \sum_{i=1}^{\infty} E[X_i]$$

when the $X_i$ are all nonnegative random variables. Since an integral is a limit of sums, one might expect that

$$E\left[\int_0^{\infty} X(t)\,dt\right] = \int_0^{\infty} E[X(t)]\,dt$$

whenever $X(t)$, $0 \le t < \infty$, are all nonnegative random variables; and this result is indeed true. Use it to give another proof of the result that, for a nonnegative random variable $X$,

$$E[X] = \int_0^{\infty} P\{X > t\}\,dt$$

Define, for each nonnegative $t$, the random variable $X(t)$ by

$$X(t) = \begin{cases} 1, & \text{if } t < X \\ 0, & \text{if } t \ge X \end{cases}$$

Now relate $\int_0^{\infty} X(t)\,dt$ to $X$.
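
A sketch of the argument the hint points to, assuming $X(t)$ is the indicator of the event $\{t < X\}$:

$$\int_0^{\infty} X(t)\,dt = \int_0^{X} 1\,dt = X,$$

and since $E[X(t)] = P\{t < X\} = P\{X > t\}$, interchanging expectation and integral gives

$$E[X] = E\left[\int_0^{\infty} X(t)\,dt\right] = \int_0^{\infty} E[X(t)]\,dt = \int_0^{\infty} P\{X > t\}\,dt.$$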

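The identity can also be checked numerically. The sketch below (an illustration, not part of the original exercise) draws samples from an Exponential(1) distribution, for which $E[X] = 1$, and compares the sample mean against a Riemann sum of the empirical tail probability $P\{X > t\}$:

```python
# Numerical check of E[X] = integral_0^inf P{X > t} dt for a nonnegative X.
# Illustration with X ~ Exponential(rate 1), so the true E[X] is 1.
import random
from bisect import bisect_right

random.seed(0)
n = 50_000
samples = sorted(random.expovariate(1.0) for _ in range(n))

# Left-hand side: the sample mean estimates E[X].
lhs = sum(samples) / n

# Right-hand side: Riemann-sum the empirical tail probability P{X > t}.
# The Exp(1) tail beyond t = 20 is negligible, so we truncate there.
dt = 0.01
rhs = 0.0
for k in range(int(20 / dt)):
    t = k * dt
    tail = (n - bisect_right(samples, t)) / n  # fraction of samples > t
    rhs += tail * dt

print(abs(lhs - rhs) < 0.02)  # the two sides agree to within the grid error
```

For the empirical distribution the two sides are in fact exactly equal (the integral of the empirical tail is the sample mean); the small discrepancy here comes only from the left-endpoint Riemann sum.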