Probability and Measure (Wiley Series in Probability and Mathematical Statistics), 3rd Edition, Patrick Billingsley - Solutions
22.15. Assume that $X_1, \ldots, X_n$ are independent and $s$, $t$, $\alpha$ are nonnegative. Let
$$L(s) = \max_{k \le n} P[\,|S_k| \ge s\,], \quad R(s) = \max_{k \le n} P[\,|S_n - S_k| \ge s\,], \quad M(s) = P\bigl[\max_{k \le n} |S_k| \ge s\bigr], \quad T(s) = P[\,|S_n| \ge s\,].$$
(a) Following the first part of the proof of (22.10), show that
(22.22) $M(s+t) \le T(t) + M(s+t)R(s)$.
(b) Take $s = 2\alpha$ and $t = \alpha$; use (22.22), together with the inequalities $T(s) \le L(s)$ and $R(2s) \le 2L(s)$, to prove Etemadi's inequality (22.10) in the form
(22.23) $M(3\alpha) \le B_E(\alpha) = 1 \wedge 3L(\alpha)$.
(c) Carry the rightmost term in (22.22) to the left side, take $s = t = \alpha$, and prove Ottaviani's inequality:
(22.24) $M(2\alpha) \le B_O(\alpha) = 1 \wedge \dfrac{T(\alpha)}{1 - R(\alpha)}$.
(d) Prove $B_E(\alpha) \le 3B_O(\alpha/2)$ and $B_O(\alpha) \le 3B_E(\alpha/6)$. This shows that the Etemadi and Ottaviani inequalities are of the same power for most purposes (as, for example, for the proofs of Theorem 22.7 and (37.9)). Etemadi's inequality seems the more natural of the two. Neither inequality can replace (9.39) in the proof of the law of the iterated logarithm.
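As a consistency check on the reconstruction of (22.22), here is a sketch of the algebra behind (c): moving $M(s+t)R(s)$ to the left side gives $M(s+t)\bigl(1 - R(s)\bigr) \le T(t)$, so when $R(\alpha) < 1$, setting $s = t = \alpha$ yields
$$M(2\alpha) \le \frac{T(\alpha)}{1 - R(\alpha)};$$
since $M(2\alpha) \le 1$ in any case, the bound may be capped at 1, which accounts for the $1 \wedge$ in $B_O(\alpha)$.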
22.14. 22.12 ↑ Burstin's theorem. Let $f$ be a Borel function on $[0, 1]$ with arbitrarily small periods: for each $\varepsilon$ there is a $p$ with $0 < p < \varepsilon$ such that $f(x + p) = f(x)$ whenever $0 \le x \le x + p \le 1$. Show that $f$ is constant almost everywhere.
(a) Show that it is enough to prove that $P(f^{-1}B)$ is 0 or 1 for every Borel set $B$, where $P$ is Lebesgue measure on the unit interval.
(b) Show that $f^{-1}B$ is independent of each interval $[0, x]$, and conclude that $P(f^{-1}B)$ is 0 or 1.
(c) Show by example that $f$ need not be constant.
22.13. Suppose that $\mathcal{A}$ is a semiring containing $\Omega$.
(a) Show that if $P(A \cap B) \le bP(B)$ for all $B \in \mathcal{A}$, and if $b < 1$ and $A \in \sigma(\mathcal{A})$, then $P(A) = 0$.
(b) Show that if $P(A \cap B) \le P(A)P(B)$ for all $B \in \mathcal{A}$, and if $A \in \sigma(\mathcal{A})$, then $P(A)$ is 0 or 1.
(c) Show that if $aP(B) \le P(A \cap B)$ for all $B \in \mathcal{A}$, and if $a > 0$ and $A \in \sigma(\mathcal{A})$, then $P(A) = 1$.
(d) Show that if $P(A)P(B) \le P(A \cap B)$ for all $B \in \mathcal{A}$, and if $A \in \sigma(\mathcal{A})$, then $P(A)$ is 0 or 1.
(e) Reconsider Problem 3.20.
22.12. Prove (what is essentially Kolmogorov's zero-one law) that if $A$ is independent of a $\pi$-system $\mathcal{P}$ and $A \in \sigma(\mathcal{P})$, then $P(A)$ is either 0 or 1.
22.11. Suppose that $X_0, X_1, \ldots$ are independent and each is uniformly distributed over $[0, 2\pi]$. Show that with probability 1 the series $\sum_n e^{iX_n} z^n$ has the unit circle as its natural boundary.
22.10. 22.1 ↑ (a) Show that for an independent sequence $\{X_n\}$ the radius of convergence of the random Taylor series $\sum_n X_n z^n$ is $r$ with probability 1 for some nonrandom $r$.
(b) Suppose that the $X_n$ have the same distribution and $P[X_1 \ne 0] > 0$. Show that $r$ is 1 or 0 according as $\log^+ |X_1|$ has finite mean or not.
22.9. 20.9 ↑ Let $Z_n$ be 1 or 0 according as at time $n$ there is or is not a record in the sense of Problem 20.9. Let $R_n = Z_1 + \cdots + Z_n$ be the number of records up to time $n$. Show that $R_n / \log n \to_p 1$.
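A sketch of one standard route, under the Problem 20.9 setup (i.i.d. observations with a continuous distribution, so that the $Z_k$ are independent with $P[Z_k = 1] = 1/k$):
$$E[R_n] = \sum_{k=1}^n \frac{1}{k} \sim \log n, \qquad \operatorname{Var}[R_n] = \sum_{k=1}^n \frac{1}{k}\Bigl(1 - \frac{1}{k}\Bigr) \le 1 + \log n,$$
so Chebyshev's inequality gives $P[\,|R_n - E[R_n]| \ge \varepsilon \log n\,] \le (1 + \log n)/(\varepsilon \log n)^2 \to 0$, and $R_n/\log n \to_p 1$ follows.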
22.8. Wald's equation. Let $X_1, X_2, \ldots$ be independent and identically distributed with finite mean, and put $S_n = X_1 + \cdots + X_n$. Suppose that $\tau$ is a stopping time: $\tau$ has positive integers as values and $[\tau = n] \in \sigma(X_1, \ldots, X_n)$; see Section 7 for examples. Suppose also that $E[\tau] < \infty$.
(a) Prove that
(22.21) $E[S_\tau] = E[X_1]\,E[\tau]$.
(b) Suppose that $X_n$ is $\pm 1$ with probabilities $p$ and $q$, $p \ne q$; let $\tau$ be the first $n$ for which $S_n$ is $-a$ or $b$ ($a$ and $b$ positive integers), and calculate $E[\tau]$. This gives the expected duration of the game in the gambler's ruin problem for unequal $p$ and $q$.
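A sketch of the usual argument for (a), assuming the interchange of sum and expectation (justified by first running the computation with $|X_n|$ in place of $X_n$ and using $E[\tau] < \infty$):
$$E[S_\tau] = \sum_{n=1}^{\infty} E\bigl[X_n \mathbf{1}_{[\tau \ge n]}\bigr] = \sum_{n=1}^{\infty} E[X_1]\,P[\tau \ge n] = E[X_1]\,E[\tau],$$
the middle equality because $[\tau \ge n] = [\tau \le n-1]^c$ lies in $\sigma(X_1, \ldots, X_{n-1})$ and is therefore independent of $X_n$. For (b), since $E[X_1] = p - q \ne 0$, (22.21) gives $E[\tau] = \bigl(b\,P[S_\tau = b] - a\,P[S_\tau = -a]\bigr)/(p - q)$, with the two hitting probabilities supplied by the gambler's ruin formula.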
22.7. Suppose that $X_1, X_2, \ldots$ are independent and identically distributed and $E[\,|X_1|\,] = \infty$. Use (21.9) to show that $\sum_n P[\,|X_n| \ge an\,] = \infty$ for each $a$, and conclude that $\sup_n n^{-1}|X_n| = \infty$ with probability 1. Now show that $\sup_n n^{-1}|S_n| = \infty$ with probability 1. Compare with the corollary to Theorem 22.1.
22.6. If $X_1, X_2, \ldots$ are independent and identically distributed, and if $P[X_1 \ge 0] = 1$ and $P[X_1 > 0] > 0$, then $\sum_n X_n = \infty$ with probability 1. Deduce this from Theorem 22.1 and its corollary, and also directly: find a positive $\varepsilon$ such that $X_n > \varepsilon$ infinitely often with probability 1.
22.5. 20.14 22.1 ↑ Suppose that $X_1, X_2, \ldots$ are independent, each with the Cauchy distribution (20.45) for a common value of $u$.
(a) Show that $n^{-1}\sum_{k=1}^n X_k$ does not converge with probability 1. Contrast with Theorem 22.1.
(b) Show that $P[\,n^{-1}\max_{k \le n} X_k \le x\,] \to e^{-u/(\pi x)}$ for $x > 0$. Relate to Theorem 14.3.
22.4. Show under the hypothesis of Theorem 22.6 that $\sum_n X_n$ has finite variance, and extend Theorem 22.4 to infinite sequences.
22.3. ↑ (a) Generalize the Borel-Cantelli lemmas: Suppose the $X_n$ are nonnegative. If $\sum_n E[X_n]$ converges, then $\sum_n X_n$ converges with probability 1.
(b) Construct independent, nonnegative $X_n$ such that $\sum_n X_n$ converges with probability 1 but $\sum_n E[X_n]$ diverges. For an extreme example, arrange that $P[X_n > 0 \text{ i.o.}] = 0$ but $E[X_n] = \infty$.
22.2. Assume $\{X_n\}$ independent, and define $X_n^{(c)}$ as in Theorem 22.8. Prove that for $\sum_n |X_n|$ to converge with probability 1 it is necessary that $\sum_n P[\,|X_n| > c\,]$ and $\sum_n E[\,|X_n^{(c)}|\,]$ converge for all positive $c$ and sufficient that they converge for some positive $c$. If the three series (22.13) converge but $\sum_n E[\,|X_n^{(c)}|\,]$ diverges, then $\sum_n X_n$ converges with probability 1 but not absolutely.
22.1. Suppose that $X_1, X_2, \ldots$ is an independent sequence and $Y$ is measurable $\sigma(X_n, X_{n+1}, \ldots)$ for each $n$. Show that there exists a constant $a$ such that $P[Y = a] = 1$.
21.21. Let $X_1, X_2, \ldots$ be identically distributed random variables with finite second moment. Show that $nP[\,|X_1| \ge \varepsilon\sqrt{n}\,] \to 0$ and $n^{-1/2}\max_{k \le n} |X_k| \to_p 0$.
21.20. 20.17 ↑ Show that the gamma density (20.47) has moment generating function $(1 - s/\alpha)^{-u}$ for $s < \alpha$ and moments $E[X^k] = u(u+1)\cdots(u+k-1)/\alpha^k$. Show that the chi-squared distribution with $n$ degrees of freedom has mean $n$ and variance $2n$.
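A sketch of the chi-squared computation under this reconstruction: chi-squared with $n$ degrees of freedom is the gamma case $\alpha = \frac12$, $u = n/2$, so $M(s) = (1 - 2s)^{-n/2}$ for $s < \frac12$. Then
$$M'(s) = n(1 - 2s)^{-n/2 - 1}, \qquad M''(s) = n(n+2)(1 - 2s)^{-n/2 - 2},$$
so the mean is $M'(0) = n$ and the variance is $M''(0) - M'(0)^2 = n(n+2) - n^2 = 2n$.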
21.19. For independent random variables having moment generating functions, show by (21.28) that the variances add.
21.18. Use (21.28) to find the generating function of (20.39).
21.17. 16.6 ↑ Show that a moment generating function $M(s)$ defined in $(-s_0, s_0)$, $s_0 > 0$, can be extended to a function analytic in the strip $\{z\colon -s_0 < \operatorname{Re} z < s_0\}$. If $M(s)$ is defined in $[0, s_0)$, $s_0 > 0$, show that it can be extended to a function continuous in $\{z\colon 0 \le \operatorname{Re} z < s_0\}$.
21.16. For the density $Ce^{-|x|^{1/2}}$, $-\infty < x < \infty$, show that moments of all orders exist but that the moment generating function exists only at $s = 0$.
21.15. 20.25 ↑ Write $d_0(X, Y) = E\bigl[\,|X - Y|/(1 + |X - Y|)\,\bigr]$. Show that this is a metric equivalent to the one in Problem 20.25.
21.14. ↑ The integrability of $X + Y$ does not imply that of $X$ and $Y$ separately. Show that it does if $X$ and $Y$ are independent.
21.13. Suppose that $X$ and $Y$ are independent and that $f(x, y)$ is nonnegative. Put $g(x) = E[f(x, Y)]$ and show that $E[g(X)] = E[f(X, Y)]$. Show more generally that $\int_{[X \in A]} g(X)\,dP = \int_{[X \in A]} f(X, Y)\,dP$. Extend to $f$ that may be negative.
21.12. Suppose that $X$ and $Y$ are independent, nonnegative random variables and that $E[X] = \infty$ and $E[Y] = 0$. What is the value common to $E[XY]$ and $E[X]E[Y]$? Use the conventions (15.2) for both the product of the random variables and the product of their expected values. What if $E[X] = \infty$ and $0 < E[Y] < \infty$?
21.11. ↑ Let $X$, $Y$, and $Z$ be independent random variables such that $X$ and $Y$ assume the values 0, 1, 2 with probability $\frac13$ each and $Z$ assumes the values 0 and 1 with probability $\frac12$ each. Let $X' = X$ and $Y' = X + Z \pmod 3$.
(a) Show that $X'$, $Y'$, and $X' + Y'$ have the same one-dimensional distributions as $X$, $Y$, and $X + Y$, respectively, even though $(X', Y')$ and $(X, Y)$ have different distributions.
(b) Show that $X'$ and $Y'$ are dependent but uncorrelated.
(c) Show that, despite dependence, the moment generating function of $X' + Y'$ is the product of the moment generating functions of $X'$ and $Y'$.
21.10. (a) Show that uncorrelated variables need not be independent.
(b) Show that $\operatorname{Var}\bigl[\sum_{k=1}^n X_k\bigr] = \sum_{j,k=1}^n \operatorname{Cov}[X_j, X_k] = \sum_{k=1}^n \operatorname{Var}[X_k] + 2\sum_{j<k} \operatorname{Cov}[X_j, X_k]$.
21.9. Suppose that $X$ and $Y$ are random variables with distribution functions $F$ and $G$.
(a) Show that if $F$ and $G$ have no common jumps, then $E[F(Y)] + E[G(X)] = 1$.
(b) If $F$ is continuous, then $E[F(X)] = \frac12$.
(c) Even if $F$ and $G$ have common jumps, if $X$ and $Y$ are taken to be independent, then $E[F(Y)] + E[G(X)] = 1 + P[X = Y]$.
(d) Even if $F$ has jumps, $E[F(X)] = \frac12 + \frac12 \sum_x P^2[X = x]$.
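A sketch of the idea behind (a): since $F(Y)$ is a function of $Y$ alone, $E[F(Y)] = \int F\,dG$, and likewise $E[G(X)] = \int G\,dF$, whatever the joint distribution. For an auxiliary independent pair $X'$, $Y'$ (names introduced here) with distribution functions $F$ and $G$,
$$\int F\,dG + \int G\,dF = P[X' \le Y'] + P[Y' \le X'] = 1 + P[X' = Y'],$$
and $P[X' = Y'] = \sum_x P[X' = x]\,P[Y' = x]$ vanishes exactly when $F$ and $G$ have no common jumps; this also accounts for the correction term in (c).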
21.8. (a) Suppose that $X$ and $Y$ have first moments, and prove
$$E[Y] - E[X] = \int_{-\infty}^{\infty} \bigl(P[X \le t < Y] - P[Y \le t < X]\bigr)\,dt.$$
(b) Let $(X, Y]$ be a nondegenerate random interval. Show that its expected length is the integral with respect to $t$ of the probability that it covers $t$.
21.7. Prove for integrable $X$ that $E[X] = \int_0^{\infty} P[X > t]\,dt - \int_{-\infty}^0 P[X < t]\,dt$.
21.6. Prove (21.9) by Fubini's theorem.
21.5. Prove the first Borel-Cantelli lemma by applying Theorem 16.6 to indicator random variables. Why is Theorem 16.6 not enough for the second Borel-Cantelli lemma?
21.4. 20.14 ↑ Show that the Cauchy distribution has no mean.
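A sketch of the computation, taking $u = 1$ in (20.45) for convenience (any $u > 0$ behaves the same way):
$$\int_{-\infty}^{\infty} |x|\,\frac{1}{\pi}\,\frac{1}{1 + x^2}\,dx = \frac{2}{\pi}\int_0^{\infty} \frac{x}{1 + x^2}\,dx = \frac{1}{\pi}\,\log(1 + x^2)\Big|_0^{\infty} = \infty,$$
so $E[\,|X|\,] = \infty$ and the mean does not exist.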
21.3. 20.9 ↑ Records. Consider the sequence of records in the sense of Problem 20.9. Show that the expected waiting time to the next record is infinite.
21.2. Show that, if $X$ has the standard normal distribution, then $E\bigl[\,|X|^{2n+1}\,\bigr] = 2^n n!\sqrt{2/\pi}$.
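A sketch of the integral, substituting $v = x^2/2$ (notation introduced here):
$$E\bigl[\,|X|^{2n+1}\,\bigr] = \frac{2}{\sqrt{2\pi}}\int_0^{\infty} x^{2n+1} e^{-x^2/2}\,dx = \sqrt{\frac{2}{\pi}}\int_0^{\infty} (2v)^n e^{-v}\,dv = 2^n n!\,\sqrt{\frac{2}{\pi}}.$$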
21.1. Prove
$$\int_{-\infty}^{\infty} e^{-tx^2/2}\,dx = \sqrt{2\pi}\,t^{-1/2}, \qquad t > 0,$$
differentiate $k$ times with respect to $t$ inside the integral (justify), and derive (21.7) again.
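For orientation, a sketch of the $k = 1$ step, assuming differentiation under the integral sign has been justified: differentiating both sides in $t$ gives
$$\int_{-\infty}^{\infty} \frac{x^2}{2}\,e^{-tx^2/2}\,dx = \frac{\sqrt{2\pi}}{2}\,t^{-3/2},$$
and at $t = 1$, after dividing by $\sqrt{2\pi}$, this is $E[X^2] = 1$ for the standard normal; iterating produces the even moments (21.7).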
20.27. ↑ Let $\theta$ and $\phi$ be the longitude and latitude of a random point on the surface of the unit sphere in $R^3$. Show that $\theta$ and $\phi$ are independent, $\theta$ is uniformly distributed over $[0, 2\pi)$, and $\phi$ is distributed over $[-\pi/2, +\pi/2]$ with density $\frac12 \cos\phi$.
20.26. Construct in $R^k$ a random variable $X$ that is uniformly distributed over the surface of the unit sphere in the sense that $|X| = 1$ and $UX$ has the same distribution as $X$ for orthogonal transformations $U$. Hint: Let $Z$ be uniformly distributed in the unit ball in $R^k$, define $\psi(x) = x/|x|$ ($\psi(0)$ …
20.25. 20.21 20.24 ↑ Let $d(X, Y)$ be the infimum of those positive $\varepsilon$ for which $P[\,|X - Y| \ge \varepsilon\,] \le \varepsilon$.
(a) Show that $d(X, Y) = 0$ if and only if $X = Y$ with probability 1. Identify random variables that are equal with probability 1, and show that $d$ is a metric on the resulting space.
(b) Show that $X_n \to_p X$ if and only if $d(X_n, X) \to 0$.
(c) Show that the space is complete.
(d) Show that in general there is no metric $d_0$ on this space such that $X_n \to X$ with probability 1 if and only if $d_0(X_n, X) \to 0$.
20.24. 2.19 ↑ (a) Show that in a discrete probability space convergence in probability is equivalent to convergence with probability 1.
(b) Show that discrete spaces are essentially the only ones where this equivalence holds: Suppose that $P$ has a nonatomic part in the sense that there is a set $A$ such that $P(A) > 0$ and the restriction of $P$ to $A$ is nonatomic. Construct random variables $X_n$ such that $X_n \to_p 0$ but $X_n$ does not converge to 0 with probability 1.
20.23. If $X_n \to 0$ with probability 1, then $n^{-1}\sum_{k=1}^n X_k \to 0$ with probability 1 by the standard theorem on Cesàro means [A30]. Show by example that this is not so if convergence with probability 1 is replaced by convergence in probability.
20.22. (a) Suppose that $X_1 \le X_2 \le \cdots$ and that $X_n \to_p X$. Show that $X_n \to X$ with probability 1.
(b) Show by example that in an infinite measure space functions can converge almost everywhere without converging in measure.
20.21. Suppose that the sequence $\{X_n\}$ is fundamental in probability in the sense that for $\varepsilon$ positive there exists an $N_\varepsilon$ such that $P[\,|X_m - X_n| > \varepsilon\,] < \varepsilon$ for $m, n > N_\varepsilon$.
(a) Prove there is a subsequence $\{X_{n_k}\}$ and a random variable $X$ such that $\lim_k X_{n_k} = X$ with probability 1. Hint: Choose increasing $n_k$ such that $P[\,|X_{n_{k+1}} - X_{n_k}| > 2^{-k}\,] < 2^{-k}$.
(b) Show that $X_n \to_p X$.
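A sketch of (a) as reconstructed: by the choice of $n_k$,
$$\sum_k P\bigl[\,|X_{n_{k+1}} - X_{n_k}| > 2^{-k}\,\bigr] < \sum_k 2^{-k} < \infty,$$
so by the first Borel-Cantelli lemma, with probability 1 all but finitely many increments satisfy $|X_{n_{k+1}} - X_{n_k}| \le 2^{-k}$; the series $\sum_k (X_{n_{k+1}} - X_{n_k})$ then converges absolutely, and $X_{n_k}$ converges to some limit $X$.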
20.20. (a) Suppose that $f\colon R^2 \to R^1$ is continuous. Show that $X_n \to_p X$ and $Y_n \to_p Y$ imply $f(X_n, Y_n) \to_p f(X, Y)$.
(b) Show that addition and multiplication preserve convergence in probability.
20.19. Let $A_{n,k}(\varepsilon) = [\,|Z_k - Z|$ …
20.18. ↑ Let $N, X_1, X_2, \ldots$ be independent, where $P[N = n] = q^{n-1}p$, $n \ge 1$, and each $X_n$ has the exponential density $f(x; \alpha, 1)$. Show that $X_1 + \cdots + X_N$ has density $f(x; \alpha p, 1)$.
20.17. ↑ The gamma distribution has density
(20.47) $f(x; \alpha, u) = \dfrac{\alpha^u}{\Gamma(u)}\,x^{u-1} e^{-\alpha x}$
over $(0, \infty)$ for positive parameters $\alpha$ and $u$. Check that (20.47) integrates to 1. Show that
(20.48) $f(\,\cdot\,; \alpha, u) * f(\,\cdot\,; \alpha, v) = f(\,\cdot\,; \alpha, u + v)$.
Note that (20.46) is $f(x; \frac12, n/2)$, and from (20.48) deduce again that (20.46) is the density of $X_1^2 + \cdots + X_n^2$. Note that the exponential density (20.10) is $f(x; \alpha, 1)$, and from (20.48) deduce (20.39) once again.
20.16. 18.18 ↑ Let $X_1, \ldots, X_n$ be independent, each having the standard normal distribution. Show that $\chi^2 \equiv X_1^2 + \cdots + X_n^2$ has density
(20.46) $\dfrac{1}{2^{n/2}\Gamma(n/2)}\,x^{(n/2)-1} e^{-x/2}$
over $(0, \infty)$. This is called the chi-squared distribution with $n$ degrees of freedom.
20.15. ↑ (a) Show that, if $X$ and $Y$ are independent and have the standard normal density, then $X/Y$ has the Cauchy density with $u = 1$.
(b) Show that, if $X$ has the uniform distribution over $(-\pi/2, \pi/2)$, then $\tan X$ has the Cauchy distribution with $u = 1$.
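A sketch of (b) by a change of variables: with $X$ uniform over $(-\pi/2, \pi/2)$ and $Y = \tan X$,
$$P[Y \le y] = P[X \le \arctan y] = \frac{\arctan y + \pi/2}{\pi}, \qquad \frac{d}{dy}\,P[Y \le y] = \frac{1}{\pi}\,\frac{1}{1 + y^2} = c_1(y),$$
the Cauchy density (20.45) with $u = 1$.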
20.14. The Cauchy distribution has density
(20.45) $c_u(x) = \dfrac{1}{\pi}\,\dfrac{u}{u^2 + x^2}$
for $u > 0$. (By (17.9), the density integrates to 1.)
(a) Show that $c_u * c_v = c_{u+v}$. Hint: Expand the convolution integrand in partial fractions.
(b) Show that, if $X_1, \ldots, X_n$ are independent and have density $c_u$, then $(X_1 + \cdots + X_n)/n$ has density $c_u$ as well.
20.13. Suppose that $\mu$ and $\nu$ consist of masses $\alpha_n$ and $\beta_n$ at $n$, $n = 0, 1, 2, \ldots$. Show that $\mu * \nu$ consists of a mass of $\sum_{k=0}^n \alpha_k \beta_{n-k}$ at $n$, $n = 0, 1, 2, \ldots$. Show that two Poisson distributions (the parameters may differ) convolve to a Poisson distribution.
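A sketch of the Poisson case, writing the parameters as $\lambda$ and $\rho$ (names introduced here): with $\alpha_k = e^{-\lambda}\lambda^k/k!$ and $\beta_k = e^{-\rho}\rho^k/k!$, the binomial theorem gives
$$\sum_{k=0}^n \alpha_k \beta_{n-k} = e^{-(\lambda+\rho)}\,\frac{1}{n!}\sum_{k=0}^n \binom{n}{k}\lambda^k \rho^{n-k} = e^{-(\lambda+\rho)}\,\frac{(\lambda + \rho)^n}{n!},$$
the Poisson mass with parameter $\lambda + \rho$.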
20.12. If $F(x - \varepsilon) < F(x + \varepsilon)$ for all positive $\varepsilon$, then $x$ is a point of increase of $F$ (see Problem 12.9). If $F(x-) < F(x)$, then $x$ is an atom of $F$.
(a) Show that, if $x$ and $y$ are points of increase of $F$ and $G$, then $x + y$ is a point of increase of $F * G$.
(b) Show that, if $x$ and $y$ are atoms of $F$ and $G$, then $x + y$ is an atom of $F * G$.
20.11. Suppose that $X$ and $Y$ are independent and have densities. Use (20.20) to find the joint density for $(X + Y, X)$ and then use (20.19) to find the density for $X + Y$. Check with (20.38).
20.10. Use Fubini's theorem to prove that convolution of finite measures is commutative and associative.