# Question: Let X1, X2, . . . be independent random variables with common distribution function F

Let X1, X2, . . . be independent random variables with common distribution function F, and suppose they are independent of N, a geometric random variable with parameter p. Let M = max(X1, . . . , XN).

(a) Find P{M ≤ x} by conditioning on N.

(b) Find P{M ≤ x|N = 1}.

(c) Find P{M ≤ x|N > 1}.

(d) Use (b) and (c) to rederive the probability you found in (a).
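For part (a), conditioning on N gives P{M ≤ x} = Σ_{n≥1} F(x)^n p(1 − p)^{n−1} = pF(x)/(1 − (1 − p)F(x)). A short Monte Carlo simulation can sanity-check this closed form; the specific choices Xi ~ Uniform(0, 1) (so F(x) = x), p = 0.5, and x = 0.7 below are illustrative assumptions, not part of the problem.

```python
import numpy as np

# Monte Carlo check of part (a): conditioning on N gives
#   P{M <= x} = sum_{n>=1} F(x)^n * p*(1-p)^(n-1) = p*F(x) / (1 - (1-p)*F(x)).
# Illustrative assumptions (not from the problem statement):
# Xi ~ Uniform(0, 1), so F(x) = x, with p = 0.5 and x = 0.7.
rng = np.random.default_rng(0)
p, x = 0.5, 0.7
trials = 100_000

# N is geometric on {1, 2, ...} with P{N = n} = p*(1-p)^(n-1)
ns = rng.geometric(p, size=trials)

# M = max(X1, ..., XN): draw N uniforms per trial and take the maximum
ms = np.array([rng.random(n).max() for n in ns])

empirical = (ms <= x).mean()
closed_form = p * x / (1 - (1 - p) * x)
print(f"empirical {empirical:.3f}  vs  closed form {closed_form:.3f}")
```

With 100,000 trials the empirical frequency should agree with the closed form to roughly two decimal places.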

