A Course In Probability Theory 3rd Edition Kai Lai Chung - Solutions
13. If E(X) = 0, E(X²) = σ², 0 < σ² < ∞, then … [HINT: Consider lim_{r↑1} (1 − r)^{1/2} Σ_{n=0}^∞ rⁿ P[ν_n = 0] as in the proof of Theorem 8.4.6, and use the following lemma: if p_n is a decreasing sequence of positive numbers such that Σ_{k=1}^n p_k ~ 2n^{1/2}, then p_n ~ n^{−1/2}.]
*12. If P{α_{(0,∞)} < ∞} < 1, then ν_n → ν and L_n → L, both limits being finite a.e. and having the generating functions … [HINT: Consider lim_{m→∞} Σ_{n=0}^m P{ν_m = n} rⁿ and use (24) of Sec. 8.5.]
11. If −∞ < E(X) < 0, then E(X) = … , where V = sup_j S_j. [HINT: If V_n = max_{1≤j≤n} S_j, then … ; let n → ∞. For the case E(X) = … , this is due to S. Port.]
*10. Define a sequence of r.v.'s {Y_n, n ∈ N⁰} as follows: Y_0 = 0, Y_{n+1} = (Y_n + X_{n+1})⁺, n ∈ N⁰, where {X_n, n ∈ N} is a stationary independent sequence. Prove that for each n, Y_n and M_n have the same distribution. [This approach is useful in queuing theory.]
*9. Prove that Σ_{n=1}^∞ (1/n) P[S_n > 0] < ∞ implies P{M < ∞} = 1. [HINT: … , λ > 0, exists and is finite, say = χ(λ). Now use proposition (E) in Sec. 8.4 and apply the convergence theorem for Laplace transforms (Theorem 6.6.3).]
8. Prove that the left member of the equation in Exercise 7 is equal to [1 − Σ_{n=1}^∞ P(α′ = n; S_n = 0)]^{−1}, where α′ = α_{[0,∞)}; hence prove its convergence.
7. Prove that Σ_{n=0}^∞ P{ν_n = 0; S_n = 0} = exp{Σ_{n=1}^∞ (1/n) P[S_n = 0]}. [HINT: One way to deduce this is to switch to Laplace transforms in (6) of Sec. 8.5 and let λ → ∞.]
6. If M < ∞ a.e., then it has an infinitely divisible distribution.
5. Prove that Σ_{n=0}^∞ rⁿ P[M_n = 0] = exp{Σ_{n=1}^∞ (rⁿ/n) P[S_n ≤ 0]}, and deduce that P[M = 0] = exp{−Σ_{n=1}^∞ (1/n) P[S_n > 0]}.
*4. Prove (13) of Sec. 8.5 by differentiating (7) there; justify the steps.
3. Find an expression for the Laplace transform of S_α. Is the corresponding formula for the Fourier transform valid?
*2. Under the conditions of Theorem 8.4.6, show that E{S_α} = −lim_{n→∞} ∫_{{α>n}} S_n dP.
1. Derive (3) of Sec. 8.4 by considering E{r^α e^{itS_α}} = Σ_{n=1}^∞ rⁿ [∫_{{α>n−1}} e^{itS_n} dP − ∫_{{α>n}} e^{itS_n} dP].
18. For an arbitrary random walk, if P{∀n ∈ N: S_n > 0} > 0, then Σ_n P{S_n ≤ 0, S_{n+1} > 0} < ∞. Hence if in addition P{∀n ∈ N: S_n ≤ 0} > 0, then Σ_n |P{S_n > 0} − P{S_{n+1} > 0}| < ∞; and consequently Σ_n ((−1)ⁿ/n) P{S_n > 0} converges. [HINT: For the first series, consider the last time that S_n < 0; …]
17. The basic argument in the proof of Theorem 8.3.2 was the "last time in (−ε, ε)". A harder but instructive argument using the "first time" may be given as follows. Show that for 1 ≤ m ≤ M: Σ_{n=m}^M g_n(ε) ≤ (Σ_{n=m}^M f_n^{(m)}) (Σ_{n=0}^M g_n(2ε)). … [This form of condition (iii′) is due to Hsu Pei; see also Chung and …]
16. Suppose E(X) = 0, 0 < E(X²) < ∞, and μ is of the integer lattice type; then P{S_{n²} = 0 i.o.} = 1.
15. Generalize Theorem 8.3.3 to R² as follows. If the central limit theorem applies in the form that S_n/√n converges in dist. to the unit normal, then the random walk is recurrent. [HINT: Use Exercises 13 and 14 and Exercise 4 of Sec. 4.3. This is sharper than Exercise 11. No proof of Exercise …]
14. Extend Lemma 2 in the proof of Theorem 8.3.3 as follows. Keep condition (i) but replace (ii) and (iii) by: (ii′) Σ_n u_n(m) ≤ c m² Σ_n u_n(1); (iii′) there exists d > 0 such that for every b > 1 and m > m(b): … Then (10) is true.
13. Generalize Lemma 1 in the proof of Theorem 8.3.3 to R^d. For d = 2 the constant 2m in the right member of (8) is to be replaced by 4m², and "|S_n| < ε" means S_n is in the open square with center at the origin and side length 2ε.
*12. Prove that no truly 3-dimensional random walk, namely one whose common distribution does not have its support in a plane, is recurrent. [HINT: There exists A > 0 such that ∫_{|x|≤A} (Σ_{i=1}^3 t_i x_i)² μ(dx) is a strictly positive quadratic form Q in (t₁, t₂, t₃). If Σ_{i=1}^3 |t_i| < A^{−1}, then …]
*11. Prove that in R², if the common distribution of the random vector (X, Y) has mean zero and finite second moment, namely E(X) = 0, …
10. Generalize Exercises 6 and 7 above to R^d.
*9. Prove that the random walk with f(t) = e^{−|t|^α}, 0 < α < 1, is not recurrent.
*8. Prove that the random walk with f(t) = e^{−|t|} (Cauchy distribution) is recurrent.
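A minimal sketch of why this holds, assuming the integral test stated in Exercise *6 further down this list: for 0 < r < 1 and |t| ≤ δ,
1 − r e^{−|t|} = (1 − r) + r(1 − e^{−|t|}) ≤ (1 − r) + |t|,
so ∫_{−δ}^δ dt/(1 − r e^{−|t|}) ≥ ∫_{−δ}^δ dt/((1 − r) + |t|) = 2 log(1 + δ/(1 − r)) → ∞ as r ↑ 1, which is exactly the divergence that test requires.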
*7. If there exists a δ > 0 such that ∫_{−δ}^δ sup_{0<r<1} Re[1/(1 − rf(t))] dt < ∞, then the random walk is not recurrent. [HINT: Use Exercise 3 of Sec. 6.2 to show that there exists a constant C(ε) such that P(|S_n| < ε) ≤ C(ε) ∫ ((1 − cos ε^{−1}x)/x²) μ_n(dx) ≤ (C(ε)/2) ∫_0^{1/ε} du ∫_{−u}^{u} … dt, where μ_n is the …]
*6. If there exists a δ > 0 such that (the integral below being real-valued) lim_{r↑1} ∫_{−δ}^δ dt/(1 − rf(t)) = ∞, then the random walk is recurrent.
5. For a recurrent random walk that is neither degenerate nor of the lattice type, the countable set of points {S_n(ω), n ∈ N} is everywhere dense in R¹ for a.e. ω. Hence prove the following result in Diophantine approximation: if γ is irrational, then given any real x and ε > 0 there exist …
*4. Assume that P{X₁ = 0} < 1. Prove that x is a recurrent value of the random walk if and only if Σ_{n=1}^∞ P{|S_n − x| < ε} = ∞ for every ε > 0.
3. Prove the Remark after Theorem 8.3.2.
*2. If a random walk in R^d is recurrent, then every possible value is a recurrent value.
1. Generalize Theorems 8.3.1 and 8.3.2 to R^d. (For d ≥ 3 the generalization is illusory; see Exercise 12 below.)
14. In an independent process where all X_n have a common bound, E{α} < ∞ implies E{S_α} < ∞ for each optional α [cf. Theorem 5.5.3].
13. State and prove the analogue of Theorem 8.2.4 with α replaced by α_{[0,∞)}. [The inclusion of 0 in the set of entrance causes a small difference.]
12. Prove the Corollary to Theorem 8.2.2.
*11. Let {X_n, n ∈ N} be a stationary independent process and {α_k, k ∈ N} a sequence of strictly increasing finite optional r.v.'s. Then {X_{α_k+1}, k ∈ N} is a stationary independent process with the same common distribution as the original process. [This is the gambling-system theorem first given by …]
10. Generalize Theorem 8.2.2 to the case where the domain of definition and finiteness of α is Δ with 0 < P(Δ) < 1. [This leads to a useful extension of the notion of independence. For a given Δ in ℱ with P(Δ) > 0, two events Λ and M, where M ⊂ Δ, are said to be independent relative to Δ …]
9. Find an example of two optional r.v.'s α and β such that α < β but β − α is not optional.
*8. Find an example of two optional r.v.'s α and β such that α ≤ β but ℱ′_α ⊉ ℱ′_β. However, if γ is optional relative to the post-α process and β = α + γ, then indeed ℱ′_α ⊃ ℱ′_β. As a particular case, ℱ′_{α_k} is decreasing (while ℱ_{α_k} is increasing) as k increases.
7. If α and β are any two optional r.v.'s, then X_{β+j}(τ^α ω) = X_{α(ω)+β(τ^α ω)+j}(ω); (τ^β ∘ τ^α)(ω) = τ^{β(τ^α ω)+α(ω)}(ω) ≠ τ^{β(ω)+α(ω)}(ω) in general.
*6. Prove the following relations: α_k = α ∘ τ^{α_{k−1}}; τ^{α_{k−1}} ∘ τ^α = τ^{α_k}; X_{α_{k−1}+j} ∘ τ^α = X_{α_k+j}.
5. ∀k ∈ N: α₁ + ⋯ + α_k is optional. [For the α in (11), this has been called the kth ladder variable.]
*4. If α is optional and β is optional relative to the post-α process, then α + β is optional (relative to the original process).
3. If α₁ and α₂ are both optional, then so are α₁ ∧ α₂, α₁ ∨ α₂, α₁ + α₂. If α is optional and Δ ∈ ℱ_α, then α_Δ defined below is also optional: α_Δ = α on Δ, α_Δ = +∞ on Ω∖Δ.
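A one-line check of the last assertion, assuming the usual conventions that α optional means {α = n} ∈ ℱ_n for every n and that Δ ∈ ℱ_α means Δ ∩ {α = n} ∈ ℱ_n for every n: for each finite n,
{α_Δ = n} = Δ ∩ {α = n} ∈ ℱ_n,
so α_Δ is optional; on Ω∖Δ it equals +∞, which imposes no further condition.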
*2. For each optional α we have α ∈ ℱ_α and X_α ∈ ℱ_α. If α and β are both optional and α ≤ β, then ℱ_α ⊂ ℱ_β.
*1. α is optional if and only if ∀n ∈ N: {α ≤ n} ∈ ℱ_n.
12. Let {X_n, n ≥ 1} be independent r.v.'s with P{X_n = 4^{−n}} = P{X_n = −4^{−n}} = ½. Then the remote field of {S_n, n ≥ 1}, where S_n = Σ_{j=1}^n X_j, is not trivial.
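A sketch of why the remote field is nontrivial here, assuming the statement as cleaned above: since Σ_{j≥2} |X_j| ≤ Σ_{j≥2} 4^{−j} = 1/12 < 1/4, the sign of S_m agrees with the sign of X_1 for every m ≥ 1. Hence
{X_1 = 1/4} = {S_m > 0} ∈ σ(S_m, S_{m+1}, …) for every m,
so this event lies in the remote field of {S_n} although its probability is ½.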
11. Show that the conclusion of Theorem 8.1.4 holds true for a sequence of independent r.v.'s, not necessarily stationary, but satisfying the following condition: for every j there exists a k > j such that X_k has the same distribution as X_j. [This remark is due to Susan Horn.]
*10. Consider the bi-infinite product space of all bi-infinite sequences of real numbers {ω_n, n ∈ N₀}, where N₀ is the set of all integers in its natural (algebraic) ordering. Define the shift as in the text with N₀ replacing N, and show that it is 1-to-1 on this space. Prove the analogue of (7).
9. Prove that an invariant event is remote and a remote event is permutable.
8. Find trivial examples of independent processes where the three numbers P(τ^{−1}Λ), P(Λ), P(τΛ) take the values 1, 0, 1; or 0, ½, 1.
*7. The set {Y_{2n} ∈ A i.o.}, where A ∈ ℬ¹, is remote but not necessarily invariant; the set {Σ_{j=1}^n Y_j ∈ A i.o.} is permutable but not necessarily remote. Find some other essentially different examples of these two kinds.
*6. If a_n > 0, lim_{n→∞} a_n exists (> 0, finite or infinite), and lim_{n→∞} (a_{n+1}/a_n) = 1, then the set of convergence of {a_n^{−1} Σ_{j=1}^n Y_j} is invariant. If a_n → +∞, the upper and lower limits of this sequence are invariant r.v.'s.
5. The set of convergence of an arbitrary sequence of r.v.'s {Y_n, n ∈ N}, or of the sequence of their partial sums Σ_{j=1}^n Y_j, are both permutable. Their limits are permutable r.v.'s with domain the set of convergence.
4. An r.v. is invariant [permutable] if and only if it belongs to the invariant [permutable] field.
3. If Λ is invariant then Λ = τΛ; the converse is false.
2. An r.v. belongs to the all-or-nothing field if and only if it is constant a.e.
1. Find an example of a remote field that is not the trivial one; to make it interesting, insist that the r.v.'s are not identical.
*12. Reconsider Exercise 17 of Sec. 6.4 and try to apply Theorem 7.6.3. [HINT: The latter is not immediately applicable, owing to the lack of uniform convergence. However, show first that if the series there converges for t ∈ A, where m(A) > 0, then it converges for all t. This follows from a result due to …]
11. Strengthening Theorem 6.5.5, show that two infinitely divisible ch.f.'s may coincide in a neighborhood of 0 without being identical.
*10. Some writers have given the proof of Theorem 7.6.6 by apparently considering each fixed t and using an analogue of (21) without the "sup_{|t|≤η}" there. Criticize this "quick proof". [HINT: Show that the two relations ∀t, ∀n: lim_{m→∞} u_{mn}(t) = u_n(t) and ∀t: lim_{n→∞} u_n(t) = u(t) …]
9. Let f(t) = 1 − t and f_k(t) = 1 − t + (−1)^k i t k^{−1}, k ≥ 1. Then f_k never vanishes and converges uniformly to f in [0, 2]. Let √f_k denote the distinguished square root of f_k in [0, 2]. Show that √f_k does not converge in any neighborhood of t = 1. Why is Theorem 7.6.3 not applicable? [This example is …]
*8. Give an example to show that in Theorem 7.6.3, if the uniformity of convergence of f_k to f is omitted, then λ_k need not converge to λ. [HINT: f_k(t) = exp{2πi(−1)^k kt(1 + kt)^{−1}}.]
7. Carry out the proof of Theorem 7.6.2 specifically for the "trivial" but instructive case f(t) = e^{ait}, where a is a fixed real number.
6. Show that the d.f. with density β^α Γ(α)^{−1} x^{α−1} e^{−βx}, α > 0, β > 0, in (0, ∞), and 0 otherwise, is infinitely divisible.
5. Show that f(t) = (1 − b)/(1 − be^{it}), 0 < b < 1, is an infinitely divisible ch.f. [HINT: Use canonical form.]
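A sketch of the canonical-form computation the hint points to, assuming the statement as cleaned above:
log[(1 − b)/(1 − b e^{it})] = −Σ_{k≥1} b^k/k + Σ_{k≥1} (b^k/k) e^{ikt} = Σ_{k≥1} (b^k/k)(e^{ikt} − 1),
so f is a compound Poisson ch.f. (jumps at the positive integers k with weights b^k/k); in particular exp{(1/n) Σ_{k≥1} (b^k/k)(e^{ikt} − 1)} is a ch.f. of the same type for every n, which gives the infinite divisibility.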
4. Give another proof that the right member of (17) is an infinitely divisible ch.f. by using Theorem 6.5.6.
*3. Let f be a ch.f. such that there exists a sequence of positive integers n_k going to infinity and a sequence of ch.f.'s φ_k satisfying f = (φ_k)^{n_k}; then f is infinitely divisible.
2. If f is an infinitely divisible ch.f. and λ its distinguished logarithm, r > 0, then the rth power of f is defined to be e^{rλ(t)}. Prove that for each r > 0 it is an infinitely divisible ch.f.
1. Is the convex combination of infinitely divisible ch.f.'s also infinitely divisible?
6. Prove that P{|S_n| > φ(1 − δ, s_n) i.o.} = 1, without use of (8), as follows. Let E_k = {ω: |S_{n_k}(ω)| ≤ …} and F_k = {ω: S_{n_{k+1}}(ω) − S_{n_k}(ω) > φ(1 − …, s_{n_{k+1}})}. Show that for sufficiently large k the event E_k ∩ F_k implies the complement of E_{k+1}; hence deduce … and show that the product → 0 as k → ∞.
*5. The law of the iterated logarithm may be used to supply certain counterexamples. For instance, if the X_n's are independent and X_n = ±n^{1/2}/log log n with probability ½ each, then S_n/n → 0 a.e., but Kolmogorov's sufficient condition (see case (i) after Theorem 5.4.1) Σ_n E(X_n²)/n² < ∞ is not satisfied.
4. Prove that in Exercise 9 of Sec. 7.3 we have P{S_n = 0 i.o.} = 1.
*3. Let {X_j, j ≥ 1} be a sequence of independent, identically distributed r.v.'s with mean 0 and variance 1, and S_n = Σ_{j=1}^n X_j. Then P{lim inf_{n→∞} |S_n(ω)|/√n = 0} = 1. [HINT: Consider S_{n_{k+1}} − S_{n_k} with n_k ~ k^k. A quick proof follows from Theorem 8.3.3 below.]
*2. Prove that whenever (2) holds, then the analogous relations with S_n replaced by max_{1≤m≤n} S_m or max_{1≤m≤n} |S_m| also hold.
1. Show that condition (3) is fulfilled if the X_j's have a common d.f. with a finite third moment.
4. Prove that for every x > 0: (x/(1 + x²)) e^{−x²/2} ≤ ∫_x^∞ e^{−y²/2} dy ≤ (1/x) e^{−x²/2}.
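A sketch verifying the bounds as reconstructed above (the standard Mills-ratio inequalities): for the upper bound, e^{−y²/2} ≤ (y/x) e^{−y²/2} for y ≥ x > 0, so
∫_x^∞ e^{−y²/2} dy ≤ (1/x) ∫_x^∞ y e^{−y²/2} dy = (1/x) e^{−x²/2};
for the lower bound, h(x) = ∫_x^∞ e^{−y²/2} dy − (x/(1 + x²)) e^{−x²/2} satisfies h(x) → 0 as x → ∞ and h′(x) = −2 e^{−x²/2}/(1 + x²)² < 0, so h(x) > 0 for every x > 0.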
*3. Prove that there exists a universal constant A₁ > 0 such that for any sequence of independent, identically distributed integer-valued r.v.'s {X_j} with mean 0 and variance 1, we have sup_x |F_n(x) − Φ(x)| ≥ A₁/n^{1/2}, where F_n is the d.f. of (Σ_{j=1}^n X_j)/√n. [HINT: Use Exercise 24 of …]
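One possible route, assuming the statement as reconstructed above: by Chebyshev's inequality P(|S_n| ≤ 2√n) ≥ 3/4, and this mass is carried by at most 4√n + 1 integers, so some integer k has
P(S_n = k) ≥ 3/(4(4√n + 1)) ≥ A/√n
for a universal A > 0. Then F_n has a jump of at least A/√n at k/√n while Φ is continuous, whence sup_x |F_n(x) − Φ(x)| ≥ A/(2√n).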
2. If f and g are ch.f.'s such that f(t) = g(t) for |t| < T, then ∫_{−∞}^∞ |F(x) − G(x)| dx ≤ … . This is due to Esseen (Acta Math. 77 (1944)).
1. If F and G are d.f.'s with finite first moments, then ∫_{−∞}^∞ |F(x) − G(x)| dx < ∞. [HINT: Use Exercise 18 of Sec. 3.2.]
9. Let {X_j, j ≥ 1} be independent r.v.'s with the symmetric Bernoullian distribution. Let N_n(ω) be the number of zeros in the first n terms of the sample sequence {S_j(ω), j ≥ 1}. Prove that N_n/√n converges in dist. to the same G as in Theorem 7.3.3. [HINT: Use the method of moments. Show that …]
8. Under the same hypothesis as in Theorem 7.3.3, prove that max_{1≤j≤n} …
7. Deduce from Exercise 6 that for a symmetric stable r.v. X with exponent α, 0 < α < 2 (see Sec. 6.5), there exists a constant c > 0 such that P{|X| > n^{1/α}} ≥ c/n. [This is due to Feller; use Exercise 3 of Sec. 6.5.]
6. If {X_n} are independent, identically distributed symmetric r.v.'s, then for every x ≥ 0, …
*5. There are a ballots marked A and b ballots marked B. Suppose that these a + b ballots are counted in random order. What is the probability that the number of ballots for A always leads in the counting?
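For reference, the classical Bertrand ballot answer, assuming a > b and that "always leads" means A is strictly ahead after every ballot counted:
P{A leads throughout the count} = (a − b)/(a + b),
which can be obtained by the reflection principle or the cycle lemma.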
*4. Give an example of a sequence of independent and identically distributed r.v.'s {X_n} with mean 0 and variance 1, and a sequence of positive integer-valued r.v.'s ν_n tending to ∞ a.e., such that S_{ν_n}/√ν_n does not converge in distribution. [HINT: The easiest way is to use Theorem 8.3.3 below.]
3. Let {X_j, ν_j, j ≥ 1} be independent r.v.'s such that the ν_j's are integer-valued, ν_j → ∞ a.e., and the central limit theorem applies to (S_n − a_n)/b_n, where S_n = Σ_{j=1}^n X_j and a_n, b_n are real constants with b_n → ∞. Then it also applies to (S_{ν_n} − a_{ν_n})/b_{ν_n}.
*2. Let {X_j, j ≥ 1} be a sequence of independent r.v.'s having the Bernoullian d.f. pδ_1 + (1 − p)δ_0, 0 < p < 1. An r-run of successes in the sample sequence {X_j(ω), j ≥ 1} is defined to be a sequence of r consecutive "ones" preceded and followed by "zeros". Let N_n be the number of r-runs in the …
1. Let {X_j, j ≥ 1} be a sequence of independent r.v.'s, and f a Borel measurable function of m variables. Then if ξ_k = f(X_{k+1}, …, X_{k+m}), the sequence {ξ_k, k ≥ 1} is (m − 1)-dependent.
*12. The following combinatorial problem is similar to that of the number of inversions. Let Ω and P be as in the example in the text. It is standard knowledge that each permutation can be uniquely decomposed into the product of cycles, as follows. Consider the permutation as a mapping π from …
11. Prove that ∫_{−∞}^∞ x² dF(x) < ∞ implies the condition (11), but not vice versa.
*10. It is important to realize that the failure of Lindeberg's condition means only the failure of either (i) or (ii) in Theorem 7.2.1 with the specified constants s_n. A central limit theorem may well hold with a different sequence of constants. Let X_j = ±j, 0, or 1, with probability …
9. Let X_j be defined as follows, for some α > 1: X_j = ±j^α with probability 1/(6j^{2(α−1)}) each, and X_j = 0 with probability 1 − 1/(3j^{2(α−1)}). Prove that Lindeberg's condition is satisfied if and only if α < 3/2.
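A sketch of the computation, assuming the reconstruction above (X_j = ±j^α with probability 1/(6j^{2(α−1)}) each and X_j = 0 otherwise): E(X_j²) = j²/3, so s_n² ~ n³/9 and s_n ~ n^{3/2}/3. The terms of Lindeberg's sum are nonzero only for indices j with j^α > η s_n, i.e. j of order at least n^{3/(2α)}. If α < 3/2 this threshold eventually exceeds n, so the sum vanishes for large n; if α ≥ 3/2 a fixed positive proportion of the indices j ≤ n contributes its full variance j²/3, and the sum stays bounded away from 0.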
8. For each j let X_j have the uniform distribution in [−j, j]. Show that Lindeberg's condition is satisfied and state the resulting central limit theorem.
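A short verification sketch under the statement above: σ_j² = j²/3, so
s_n² = Σ_{j=1}^n j²/3 = n(n + 1)(2n + 1)/18 ~ n³/9, s_n ~ n^{3/2}/3,
while |X_j| ≤ j ≤ n = o(s_n). For large n the sets {|X_j| > η s_n}, 1 ≤ j ≤ n, are therefore empty, so Lindeberg's sum is 0 and the condition holds. The resulting central limit theorem reads S_n/s_n → Φ in distribution, i.e. 3 S_n/n^{3/2} converges in dist. to the unit normal.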
*7. Find an example where Lindeberg's condition is satisfied but Liapounov's is not for any δ > 0. In Exercises 8 to 10 below, {X_j, j ≥ 1} is a sequence of independent r.v.'s.
6. Prove that if δ …
5. Derive Theorem 6.4.4 from Theorem 7.2.1.
*4. Prove the sufficiency part of Theorem 7.2.1 without using Theorem 7.1.2, but by elaborating the proof of the latter. [HINT: Use the expansion e^{itx} = 1 + itx + θ(tx)²/2 for |x| > η and e^{itx} = 1 + itx − (tx)²/2 + θ′|tx|³/6 for |x| ≤ η, where |θ| ≤ 1 and |θ′| ≤ 1.] As a matter of fact, Lindeberg's …
*3. Prove that in Theorem 7.2.1, (i) does not imply (1). [HINT: Consider r.v.'s with normal distributions.]
2. Prove that Lindeberg's condition (1) implies max_{1≤j≤n} a_{nj} → 0.
1. Restate Theorem 7.1.2 in terms of normed sums of a single sequence.