A Course In Probability Theory 3rd Edition Kai Lai Chung - Solutions
21. The span of an integer lattice distribution is the greatest common divisor of the set of all differences between points of jump.
20. Show by using (14) that |cos t| is not a ch.f. Thus the modulus of a ch.f. need not be a ch.f., although the squared modulus always is.
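The assertion about the squared modulus can be seen through symmetrization: |f|^2 is the ch.f. of X - X' with X' an independent copy of X. A small numerical sketch of that fact (my own illustration, assuming NumPy; the exponential example and all names are mine, not Chung's):

# |f|^2 as a ch.f.: for X ~ Exponential(1), f(t) = 1/(1 - it), so
# |f(t)|^2 = 1/(1 + t^2), which is the ch.f. of X - X' (a Laplace variable).
import numpy as np

rng = np.random.default_rng(0)
x  = rng.exponential(1.0, size=200_000)
xp = rng.exponential(1.0, size=200_000)
for t in (0.5, 1.0, 2.0):
    empirical = np.mean(np.exp(1j * t * (x - xp)))   # Monte Carlo E[e^{it(X-X')}]
    print(t, empirical.real, 1.0 / (1.0 + t**2))     # imaginary part is ~ 0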
*19. Reformulate Exercise 18 in terms of d.f.'s and deduce the following consequence. Let {F_n} be a sequence of d.f.'s, a_n, a_n' real constants, b_n > 0, b_n' > 0. If F_n(b_n x + a_n) → F(x) and F_n(b_n' x + a_n') → F(x), where F is a nondegenerate d.f., then b_n/b_n' → 1 and (a_n − a_n')/b_n → 0. [Two d.f.'s F and G such that G(x) = F(bx + a) for every x, where b > 0 and a is real, are said to be of the same type.]
*18. Let f and g be two nondegenerate ch.f.'s. Suppose that there exist real constants a_n and b_n > 0 such that for every t: … Then a_n → a, b_n → b, where a is finite, 0 < b < ∞, and g(t) = e^{ita/b} f(t/b). [HINT: Use Exercises 16 and 17.]
*17. Suppose c_n is real and that e^{i c_n t} converges to a limit for every t in a set of strictly positive Lebesgue measure. Then c_n converges to a finite limit. [HINT: Proceed as in Exercise 16, and integrate over t. Beware of any argument using "logarithms", as given in some textbooks, but see …]
*16. Suppose b_n > 0 and |f(b_n t)| converges everywhere to a ch.f. that is not identically 1; then b_n converges to a finite and strictly positive limit. [HINT: Show that it is impossible that a subsequence of b_n converges to 0 or to +∞, or that two subsequences converge to different finite limits.]
*15. If |f_n(t)| → 1 for every t as n → ∞, and F_n is the d.f. corresponding to f_n, then there exist constants a_n such that F_n(x + a_n) converges to the d.f. degenerate at 0. [HINT: Symmetrize and take a_n to be a median of F_n.]
14. If |f(t)| = 1, |f(t')| = 1 and t/t' is an irrational number, then f is degenerate. If for a sequence {t_k} of nonvanishing constants tending to 0 we have |f(t_k)| = 1, then f is degenerate.
13. The converse part of Theorem 6.4.1 is false for an odd k. Example: F is a discrete symmetric d.f. with mass C/(n^2 log n) at ±n for integers n ≥ 3, where C is the appropriate constant, and k = 1. [HINT: It is well known that the series Σ_n (sin nt)/(n log n) converges uniformly in t.]
*12. Let {X_j, j ≥ 1} be independent, identically distributed r.v.'s with mean 0 and variance 1. Prove that both Σ_{j=1}^n X_j / (Σ_{j=1}^n X_j^2)^{1/2} and √n Σ_{j=1}^n X_j / Σ_{j=1}^n X_j^2 converge in dist. to Φ.
11. Let X and Y be independent with the common d.f. F of mean 0 and variance 1. Suppose that (X + Y)/√2 also has the d.f. F. Then F ≡ Φ. [HINT: Imitate Theorem 6.4.5.]
10. Suppose F satisfies the condition that for every η > 0, ∫_{|x|>A} dF(x) = o(e^{−ηA}) as A → ∞. Then all moments of F are finite, and condition (6) in Theorem 6.4.5 is satisfied.
9. Suppose that e^{−c|t|^α}, where c > 0, 0 < α ≤ 2, is a ch.f. (Theorem 6.5.4 below). Let {X_j, j ≥ 1} be independent and identically distributed r.v.'s with a common ch.f. of the form 1 − β|t|^α + o(|t|^α) as t → 0. Determine the constants b and θ so that the ch.f. of S_n/(b n^θ) converges to e^{−|t|^α}.
8. If 0 < α < 1 and ∫ |x|^α dF(x) < ∞, then f(t) − 1 = o(|t|^α) as t → 0. For 1 ≤ α < 2 the same result is true under the additional assumption that ∫ x dF(x) = 0. [HINT: The case 1 ≤ α < 2 is harder. Consider the real and imaginary parts of f(t) − 1 separately and write the latter as ∫ sin tx dF(x) …]
7. Let f be the ch.f. of the d.f. F. Suppose that f(t) − 1 = O(|t|^α) as t → 0, where 0 < α ≤ 2; then ∫_{|x|>A} dF(x) = O(A^{−α}) as A → ∞. [HINT: Integrate ∫_{|x|>A} (1 − cos tx) dF(x) ≤ Ct^α over t in (0, 1/A).]
*6. Prove that in Theorem 6.4.4, S_n/(σ√n) does not converge in probability. [HINT: Consider S_n/(σ√n) and S_{2n}/(σ√(2n)).]
5. Let X_λ have the Poisson distribution with parameter λ. Prove that [X_λ − λ]/λ^{1/2} converges in dist. to Φ as λ → ∞.
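A quick Monte Carlo sketch of this convergence (my own illustration, not part of the text; it assumes NumPy and compares one tail probability of the standardized Poisson variable with Φ(1)):

# (X - lam)/sqrt(lam) for X ~ Poisson(lam) is approximately N(0,1) for large lam.
import math
import numpy as np

rng = np.random.default_rng(1)
for lam in (10, 100, 1000):
    x = rng.poisson(lam, size=200_000)
    z = (x - lam) / math.sqrt(lam)
    print(lam, np.mean(z <= 1.0), 0.5 * (1 + math.erf(1 / math.sqrt(2))))  # vs Phi(1)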
*4. Let X_n have the binomial distribution with parameter (n, p_n), and suppose that n p_n → λ > 0. Prove that X_n converges in dist. to the Poisson d.f. with parameter λ. (In the old days this was called the law of small numbers.)
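A short numerical sketch of the law of small numbers (illustration only; the parameter values are my own choices):

# Binomial(n, lam/n) probabilities approach Poisson(lam) probabilities as n grows.
import math

lam, n = 3.0, 1000
p = lam / n
for k in range(6):
    binom_pk = math.comb(n, k) * p**k * (1 - p)**(n - k)
    poisson_pk = math.exp(-lam) * lam**k / math.factorial(k)
    print(k, round(binom_pk, 6), round(poisson_pk, 6))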
3. Let P{X = k} = p_k, 1 ≤ k ≤ ℓ < ∞, Σ_{k=1}^ℓ p_k = 1. The sum S_n of n independent r.v.'s having the same distribution as X is said to have a multinomial distribution. Define it explicitly. Prove that [S_n − E(S_n)]/σ(S_n) converges in dist. to Φ.
*2. Let {X_n} be independent, identically distributed with mean 0 and variance σ^2, 0 < σ^2 ≤ ∞. Prove that lim_{n→∞} E(|S_n|)/√n = 2 lim_{n→∞} E(S_n^+)/√n = √(2/π)·σ. [If we assume only P{X_1 ≠ 0} > 0, E(|X_1|) < ∞ and E(X_1) = 0, then we have E(|S_n|) ≥ C√n for some constant C and all n; this is known as Hornich's inequality.] [HINT: In case σ^2 = ∞, if lim_n E(|S_n|)/√n …]
*1. If f is the ch.f. of X, and lim_{t→0} (f(t) − 1)/t^2 = −σ^2/2 > −∞, then E(X) = 0 and E(X^2) = σ^2. In particular, if f(t) = 1 + o(t^2) as t → 0, then f ≡ 1.
12. In the notation of Exercise 11, even if sup_{t∈R} |f_n(t) − g_n(t)| → 0, it does not follow that the Lévy distance (F_n, G_n) → 0; indeed it may → 1. [HINT: Let f be any ch.f. vanishing outside (−1, 1), f_j(t) = e^{−i n_j t} f(m_j t), g_j(t) = e^{i n_j t} f(m_j t), and F_j, G_j be the corresponding d.f.'s. Note that if m_j n_j^{−1} → …]
*11. Let F_n, G_n be the d.f.'s of μ_n, ν_n, and f_n, g_n their ch.f.'s. Even if sup_{x∈R} |F_n(x) − G_n(x)| → 0, it does not follow that (f_n, g_n) → 0; indeed it may happen that (f_n, g_n) = 1 for every n. [HINT: Take two step functions "out of phase".]
10. Using the strong law of large numbers, prove that the convolution of two Cantor d.f.'s is still singular. [HINT: Inspect the frequency of the digits in the sum of the corresponding random series; see Exercise 9 of Sec. 5.3.]
9. Rewrite the preceding formula as sin t / t = (∏_{k=1}^∞ cos(t/2^{2k−1})) (∏_{k=1}^∞ cos(t/2^{2k})). Prove that either factor on the right is the ch.f. of a singular distribution. Thus the convolution of two such may be absolutely continuous. [HINT: Use the same r.v.'s as for the Cantor distribution in Exercise …]
*8. Interpret the remarkable trigonometric identity sin t / t = ∏_{n=1}^∞ cos(t/2^n) in terms of ch.f.'s, and hence by addition of independent r.v.'s. (This is an example of Exercise 4 of Sec. 6.1.)
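A numerical sanity check of the identity (my own sketch; the truncation of the infinite product at 60 factors is an arbitrary choice):

# sin(t)/t versus the truncated product of cos(t / 2**n).
import math

t, N = 2.3, 60
product = 1.0
for n in range(1, N + 1):
    product *= math.cos(t / 2**n)
print(math.sin(t) / t, product)   # the two numbers agree to machine precision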
7. If F_n → F and G_n → G, then F_n * G_n → F * G. [A proof of this simple result without the use of ch.f.'s would be tedious.]
*6. If the sequence of ch.f.'s {f_n} converges uniformly in a neighborhood of the origin, then {f_n} is equicontinuous, and there exists a subsequence that converges to a ch.f. [HINT: Use the Ascoli-Arzela theorem.]
5. Let F be a discrete d.f. with points of jump {a_j, j ≥ 1} and sizes of jump {b_j, j ≥ 1}. Consider the approximating s.d.f.'s F_n with the same jumps but restricted to j ≤ n. Show that F_n → F vaguely.
4. Let F_n, G_n be d.f.'s with ch.f.'s f_n and g_n. If f_n − g_n → 0 a.e., then for each f ∈ C_K we have ∫ f dF_n − ∫ f dG_n → 0 (see Exercise 10 of Sec. 4.4). This does not imply the Lévy distance (F_n, G_n) → 0; find a counterexample. [HINT: Use Exercise 3 of Sec. 6.2 and proceed as in Theorem …]
3. Let F be a given absolutely continuous d.f. and let F_n be a sequence of step functions with equally spaced steps that converge to F uniformly in R^1. Show that for the corresponding ch.f.'s we have, for all n: sup_{t∈R^1} |f(t) − f_n(t)| = 1.
*2. Instead of using the Lemma in the second part of Theorem 6.3.2, prove that μ is a p.m. by integrating the inversion formula, as in Exercise 3 of Sec. 6.2. (Integration is a smoothing operation and a standard technique in taming improper integrals: cf. the proof of the second part of Theorem …)
1. Prove the uniform convergence of f_n in Theorem 6.3.1 by an integration by parts of ∫ e^{itx} dF_n(x).
14. There is a deeper supplement to the inversion formula (4) or Exercise 10 above, due to B. Rosén. Under the condition ∫_{−∞}^{∞} (1 + log |x|) dF(x) < ∞, the improper Riemann integral in Exercise 10 may be replaced by a Lebesgue integral. [HINT: It is a matter of proving the existence of the latter; bound ∫ dF(y) |∫_N^∞ (sin (x − y)t)/t dt| by c_1 ∫_{|x−y|≤1/N} dF(y) + c_2 ∫_{|x−y|>1/N} dF(y)/(N|x − y|).]
13. The uniqueness theorem holds as well for signed measures [or functions of bounded variation]. Precisely, if each μ_i, i = 1, 2, is the difference of two finite measures such that ∫ e^{itx} μ_1(dx) = ∫ e^{itx} μ_2(dx) for every t, then μ_1 = μ_2.
*12. Prove Theorem 6.2.2 by the Stone-Weierstrass theorem. [HINT: Cf. Theorem 6.6.2 below, but beware of the differences. Approximate uniformly g_1 and g_2 in the proof of Theorem 4.4.3 by a periodic function with "arbitrarily large" period.]
11. Theorem 6.2.3 has an analogue in L^2. If the ch.f. f of F belongs to L^2, then F is absolutely continuous. [HINT: By Plancherel's theorem, there exists φ ∈ L^2 such that ∫_0^x φ(u) du = (1/√(2π)) ∫_{−∞}^{∞} ((e^{−itx} − 1)/(−it)) f(t) dt. Now use the inversion formula to show that F(x) − F(0) = (1/√(2π)) ∫_0^x φ(u) du …]
10. Prove the following form of the inversion formula (due to Gil-Pelaez): (1/2){F(x+) + F(x−)} = 1/2 + lim_{δ↓0, T↑∞} ∫_δ^T (e^{itx} f(−t) − e^{−itx} f(t))/(2πit) dt. [HINT: Use the method of proof of Theorem 6.2.1 rather than the result.]
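For a real, symmetric ch.f. the integrand above reduces to f(t) sin(tx)/(πt), so at a continuity point the formula reads F(x) = 1/2 + (1/π) ∫_0^∞ f(t) sin(tx)/t dt. A numerical sketch for the standard normal (my own illustration, assuming NumPy; the truncation limits are arbitrary choices):

# Gil-Pelaez inversion for f(t) = exp(-t^2/2), evaluated at x = 1.
import math
import numpy as np

x = 1.0
t = np.linspace(1e-8, 40.0, 400_000)
integral = np.sum(np.exp(-t**2 / 2) * np.sin(t * x) / t) * (t[1] - t[0])
print(0.5 + integral / math.pi, 0.5 * (1 + math.erf(x / math.sqrt(2))))   # vs Phi(1)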
*9. Give a trivial example where the right member of (4) cannot be replaced by the Lebesgue integral (1/2π) ∫_{−∞}^{∞} ((e^{−itx_1} − e^{−itx_2})/(it)) f(t) dt, but it can always be replaced by the improper Riemann integral lim_{T→∞} (1/2π) ∫_{−T}^{T} ((e^{−itx_1} − e^{−itx_2})/(it)) f(t) dt.
8. Prove that for 0 < r < 2 we have ∫ |x|^r dF(x) = C(r) ∫_{−∞}^{∞} (1 − Re f(t))/|t|^{r+1} dt, where C(r) = [∫_{−∞}^{∞} (1 − cos u)/|u|^{r+1} du]^{−1} = Γ(r + 1) sin(rπ/2)/π; thus C(1) = 1/π. [HINT: |x|^r = C(r) ∫_{−∞}^{∞} (1 − cos xt)/|t|^{r+1} dt.]
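A numerical sketch of the r = 1 case (my own illustration, assuming NumPy): for X standard normal, f(t) = e^{−t^2/2} and E|X| = √(2/π), so the right member with C(1) = 1/π should reproduce √(2/π).

# (1/pi) * integral over R of (1 - exp(-t^2/2)) / t^2 dt  versus  sqrt(2/pi).
import math
import numpy as np

t = np.linspace(1e-6, 1000.0, 2_000_000)
integrand = (1.0 - np.exp(-t**2 / 2.0)) / t**2
integral = 2.0 * np.sum(integrand) * (t[1] - t[0])    # integrand is even in t
print(integral / math.pi, math.sqrt(2.0 / math.pi))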
*7. If F is absolutely continuous, then lim_{|t|→∞} f(t) = 0. Hence, if the absolutely continuous part of F does not vanish, then lim sup_{|t|→∞} |f(t)| < 1. If F is purely discontinuous, then lim sup_{t→∞} |f(t)| = 1. [The first assertion is the Riemann-Lebesgue lemma; prove it first when F has a density …]
6. For each n ≥ 1 we have (1/π) ∫_{−∞}^{∞} (sin t / t)^{n+2} dt = ∫_0^1 ∫_{−2u}^{2u} φ_n(t) dt du, where φ_1 = (1/2)·1_{[−1,1]} and φ_n = φ_{n−1} * φ_1 for n ≥ 2.
5. What is the special case of the inversion formula when f ≡ 1? Deduce also the following special cases, where a > 0: (1/π) ∫_{−∞}^{∞} (sin at · sin t)/t^2 dt = a ∧ 1; (1/π) ∫_{−∞}^{∞} (sin at · sin^2 t)/t^3 dt = a − a^2/4 for a ≤ 2, = 1 for a ≥ 2.
4. If f(t)/t ∈ L^1(−∞, ∞), then for each a > 0 such that ±a are points of continuity of F, we have F(a) − F(−a) = (1/π) ∫_{−∞}^{∞} (sin at / t) f(t) dt.
*3. Prove that for each α > 0: ∫_0^α [F(x + u) − F(x − u)] du = (1/π) ∫_{−∞}^{∞} ((1 − cos αt)/t^2) e^{−itx} f(t) dt. As a sort of reciprocal, we have (1/2) ∫_0^α du ∫_{−u}^{u} f(t) dt = ∫_{−∞}^{∞} ((1 − cos αx)/x^2) dF(x).
*2. Show that for each T > 0: (1/π) ∫_{−∞}^{∞} ((1 − cos Tx)/x^2) cos tx dx = (T − |t|) ∨ 0. Deduce from this that, for each T > 0, the function of t given by (1 − |t|/T) ∨ 0 is a ch.f. Next, show that, as a particular case of Theorem 6.2.3, (1 − cos Tx)/x^2 = (1/2) ∫_{−T}^{T} (T − |t|) e^{itx} dt. Finally, derive the following particularly …
1. Show that ∫_0^∞ (sin x / x)^2 dx = π/2.
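A quick numerical check (illustration only; the truncation at x = 2000 is an arbitrary choice):

# Riemann-sum approximation of the integral of (sin x / x)^2 over (0, inf) versus pi/2.
import math
import numpy as np

x = np.linspace(1e-9, 2000.0, 2_000_000)
print(np.sum((np.sin(x) / x) ** 2) * (x[1] - x[0]), math.pi / 2)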
*18. Let the support of the p.m. μ on R^1 be denoted by supp μ. Prove that supp(μ * ν) = closure of (supp μ + supp ν); supp(μ_1 * μ_2 * ···) = closure of (supp μ_1 + supp μ_2 + ···), where "+" denotes vector sum.
17. Let F be a symmetric d.f. with ch.f. f ≥ 0. Then φ_F(h) = ∫_{−∞}^{∞} (h^2/(h^2 + x^2)) dF(x) = h ∫_0^∞ e^{−ht} f(t) dt is a sort of average concentration function. Prove that if G is also a d.f. with ch.f. g ≥ 0, then we have for all h > 0: 1 − φ_{F*G}(h) ≤ [1 − φ_F(h)] + [1 − φ_G(h)].
16. If 0 < hλ < 2π, then there is an absolute constant A such that Q_F(h) ≤ (A/λ) ∫_0^λ |f(t)| dt, where f is the ch.f. of F. [HINT: Use Exercise 2 of Sec. 6.2 below.]
*15. For a d.f. F and h > 0, define Q_F(h) = sup_x [F(x + h) − F(x−)]; Q_F is called the Lévy concentration function of F. Prove that the sup above is attained, and if G is also a d.f., we have …
14. Find an example of two r.v.'s X and Y with the same p.m. μ that are not independent but such that X + Y has the p.m. μ * μ. [HINT: Take X = Y and use ch.f.'s.]
13. For any ch.f. f we have for every t: Re[1 − f(t)] ≥ (1/4) Re[1 − f(2t)].
12. Let {X_j, 1 ≤ j ≤ n} be independent r.v.'s each having the d.f. Φ. Find the ch.f. of Σ_{j=1}^n X_j^2 and show that the corresponding p.d. is 2^{−n/2} Γ(n/2)^{−1} x^{(n/2)−1} e^{−x/2} in (0, ∞). This is called in statistics the "χ^2 distribution with n degrees of freedom".
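A Monte Carlo sketch of the stated density (my own illustration, assuming NumPy; n, the evaluation point x0 and the bin width h are arbitrary choices):

# Empirical density of X_1^2 + ... + X_n^2 near x0 versus
# 2^(-n/2) Gamma(n/2)^(-1) x0^(n/2 - 1) exp(-x0/2).
import math
import numpy as np

rng = np.random.default_rng(2)
n, x0, h = 4, 3.0, 0.1
s = np.sum(rng.standard_normal((500_000, n)) ** 2, axis=1)
empirical = np.mean((s > x0) & (s <= x0 + h)) / h
exact = 2 ** (-n / 2) / math.gamma(n / 2) * x0 ** (n / 2 - 1) * math.exp(-x0 / 2)
print(empirical, exact)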
11. Let X have the normal distribution Φ. Find the d.f., p.d., and ch.f. of X^2.
*10. Let {X_j, j ≥ 1} be a sequence of independent r.v.'s having the common exponential distribution with mean 1/λ, λ > 0. For given x > 0 let ν be the maximum of n such that S_n ≤ x, where S_0 = 0, S_n = Σ_{j=1}^n X_j as usual. Prove that the r.v. ν has the Poisson distribution with mean λx. See Sec. …
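A simulation sketch of this Poisson property (my own illustration, assuming NumPy; λ, x and the number of trials are arbitrary choices, and 30 jumps per trial is more than enough to pass x here):

# The count of partial sums S_n <= x for Exponential(mean 1/lam) jumps,
# compared with the Poisson(lam * x) probabilities.
import math
import numpy as np

rng = np.random.default_rng(3)
lam, x, trials = 2.0, 1.5, 200_000
jumps = rng.exponential(1.0 / lam, size=(trials, 30))
counts = np.sum(np.cumsum(jumps, axis=1) <= x, axis=1)
for k in range(5):
    print(k, np.mean(counts == k),
          math.exp(-lam * x) * (lam * x) ** k / math.factorial(k))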
9. Find the n-th iterated convolution of an exponential distribution.
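The n-fold convolution is the gamma density λ^n x^{n−1} e^{−λx}/(n − 1)!; a quick Monte Carlo sketch of that claim (my own illustration, assuming NumPy; the parameter values are arbitrary):

# Empirical density of the sum of n Exponential(mean 1/lam) variables near x0
# versus lam^n x0^(n-1) exp(-lam*x0) / (n-1)!.
import math
import numpy as np

rng = np.random.default_rng(4)
lam, n, x0, h = 1.5, 3, 2.0, 0.05
s = np.sum(rng.exponential(1.0 / lam, size=(400_000, n)), axis=1)
empirical = np.mean((s > x0) & (s <= x0 + h)) / h
exact = lam**n * x0 ** (n - 1) * math.exp(-lam * x0) / math.factorial(n - 1)
print(empirical, exact)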
8. Show that the family of normal (Cauchy, Poisson) distributions is closed with respect to convolution in the sense that the convolution of any two in the family with arbitrary parameters is another in the family with some parameter(s).
7. The convolution of two discrete distributions with exactly m and n atoms, respectively, has at least m + n − 1 and at most mn atoms.
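A tiny illustration of the two extremes (my own example): supports in arithmetic progression give the minimum m + n − 1 atoms, while "generic" supports give the maximum mn.

# Count the distinct atoms of the convolution, i.e. the distinct pairwise sums.
def atom_count(a, b):
    return len({x + y for x in a for y in b})

print(atom_count([0, 1, 2], [0, 1]))        # m = 3, n = 2: minimum m + n - 1 = 4
print(atom_count([0, 1, 10], [0, 100]))     # maximum m * n = 6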
*6. Prove that the convolution of two discrete d.f.'s is discrete; that of a continuous d.f. with any d.f. is continuous; that of an absolutely continuous d.f. with any d.f. is absolutely continuous.
5. If F_1 and F_2 are d.f.'s such that F_1 = Σ_j b_j δ_{a_j} and F_2 has density p, show that F_1 * F_2 has a density and find it.
4. Let S_n be as in (v) and suppose that S_n → S_∞ in pr. Prove that ∏_{j=1}^∞ f_j(t) converges in the sense of infinite product for each t and is the ch.f. of S_∞.
3. Find the d.f.'s with the following ch.f.'s (α > 0, β > 0): α^2/(α^2 + t^2), 1/(1 − αit)^β, 1/(1 + αβ − αβe^{it})^{1/β}. [HINT: The second and third correspond respectively to the gamma and Pólya distributions.]
*2. Let f(u, t) be a function on (−∞, ∞) × (−∞, ∞) such that for each u, f(u, ·) is a ch.f., and for each t, f(·, t) is a continuous function; then ∫_{−∞}^{∞} f(u, t) dG(u) is a ch.f. for any d.f. G. In particular, if f is a ch.f. such that lim_{t→∞} f(t) exists and G is a d.f. with G(0) = 0, then ∫_0^∞ …
1. If f is a ch.f., and G a d.f. with G(0−) = 0, then the following functions are all ch.f.'s: ∫_0^1 f(ut) du, ∫_0^∞ f(ut) e^{−u} du, ∫_0^∞ f(ut) dG(u).
12. Let P{X_1 = k} = p_k, 1 ≤ k ≤ ℓ, Σ_{k=1}^ℓ p_k = 1. Let N_k(n, ω) be the number of values of j, 1 ≤ j ≤ n, for which X_j = k, and π(n, ω) = ∏_{k=1}^ℓ p_k^{N_k(n, ω)}. Prove that lim_{n→∞} (1/n) log π(n, ω) exists a.e. and find the limit. [This is from information theory.]
*11. Let f be continuous and belong to L^r(0, ∞) for some r ≥ 1, and g(λ) = ∫_0^∞ e^{−λt} f(t) dt. Then f(x) = lim_{n→∞} ((−1)^{n−1}/(n − 1)!) (n/x)^n g^{(n−1)}(n/x), where g^{(n−1)} is the (n − 1)st derivative of g, uniformly in every finite interval. [HINT: Let λ > 0, P{X_1(λ) ≤ t} = 1 − e^{−λt}. Then …]
10. Let r be a positive integer-valued r.v. that is independent of the X_n's. Suppose that both r and X_1 have finite second moments; then σ^2(S_r) = E(r) σ^2(X_1) + σ^2(r) (E(X_1))^2.
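A Monte Carlo sketch of this variance identity (my own illustration, assuming NumPy; the distributions of r and X_1 are arbitrary choices):

# r = 1 + Poisson(3) gives E(r) = 4, Var(r) = 3; X_n ~ Normal(2, 9).
# Predicted Var(S_r) = 4*9 + 3*4 = 48.
import numpy as np

rng = np.random.default_rng(5)
r = 1 + rng.poisson(3.0, size=100_000)
mu, sigma = 2.0, 3.0
s = np.array([rng.normal(mu, sigma, size=k).sum() for k in r])
print(s.var(), 4.0 * sigma**2 + 3.0 * mu**2)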
*9. In Exercise 7, find the d.f. of X_{ν(t)} for a given t. (E{X_{ν(t)}} is the mean lifespan of the object living at the epoch t; should it not be the same as E{X_1}, the mean lifespan of the given species? [This is one of the best examples of the use or misuse of intuition in probability theory.])
8. Theorem 5.5.3 remains true if E(X_1) is defined, possibly +∞ or −∞.
*7. Consider the special case of renewal where the r.v.'s are Bernoullian, taking the values 1 and 0 with probabilities p and 1 − p, where 0 < p < 1. Find explicitly the d.f. of ν(0) as defined in Exercise 6, and hence of ν(t) for every t > 0. Find E{ν(t)} and E{ν(t)^2}. Relate ν(t, ω) to the N(t, ω) in …
*6. For each t ≥ 0, define ν(t, ω) = min{n : |S_n(ω)| > t} if such an n exists, or +∞ if not. If P(X_1 ≠ 0) > 0, then for every t > 0 and r > 0 we have P{ν(t) > n} ≤ A^n for some A < 1 and all large n; consequently E{ν(t)^r} < ∞. This implies the corollary of Theorem 5.5.2 without recourse to …
5. If E(X_1) > 0, then lim_{λ→−∞} P{⋃_n [S_n ≤ λ]} = 0.
*4. Let S_n and N(t) be as in Theorem 5.5.2. Show that E{N(t)} = Σ_{n=1}^∞ P{S_n ≤ t}. This remains true if X_1 takes both positive and negative values.
3. Find the distribution of Y_{n,k}, 1 ≤ k ≤ n, in (1). [These r.v.'s are called order statistics.]
*2. Let F_n and F be as in Theorem 5.5.1; then the distribution of sup_{−∞<x<∞} |F_n(x, ω) − F(x)| …
1. Show that equality can hold somewhere in (1) with strictly positive probability if and only if the discrete part of F does not vanish.
13. Under the assumptions in Theorem 5.4.2, if S_n/n converges a.e. then E(|X_1|) < ∞. [HINT: X_n/n converges to 0 a.e., hence P{|X_n| > n i.o.} = 0; use Theorem 4.2.4 to get Σ_n P{|X_1| > n} < ∞.]
12. If E(X_1) ≠ 0, then max_{1≤j≤n} …
11. Prove the second alternative in Theorem 5.4.3.
10. Suppose there exist an α, 0 < α < 2, α ≠ 1, and two constants A_1 and A_2 such that for all n and all x > 0: A_1/x^α ≤ P{|X_n| > x} ≤ A_2/x^α. If α > 1, suppose also that E(X_n) = 0 for each n. Then for any sequence {a_n} increasing to infinity, we have … [This result, due to P. Lévy and Marcinkiewicz, …]
9. Construct an example where E(X^+) = E(X^−) = +∞ and S_n/n → +∞ a.e. [HINT: Let 0 < α < β < 1 and take a d.f. F such that 1 − F(x) ~ x^{−α} as x → ∞ and ∫_{−∞}^{0} |x|^β dF(x) < ∞. Show that Σ_n P{max_{1≤j≤n} X_j ≤ n^{1/α'}} < ∞ for every α' > α, and use Exercise 3 for Σ_{j=1}^n …]
*8. If E(|X_1|) < ∞, then the sequence {S_n/n} is uniformly integrable and S_n/n → E(X_1) in L^1 as well as a.e.
7. We have S_n/n → 0 a.e. if and only if the following two conditions are satisfied: (i) S_n/n → 0 in pr., (ii) S_{2^n}/2^n → 0 a.e.; an alternative set of conditions is (i) and (iii) for every ε > 0: Σ_n P(|S_{2^{n+1}} − S_{2^n}| > 2^n ε) < ∞.
6. Let E(X_1) = 0 and {c_n} be a bounded sequence of real numbers. Then (1/n) Σ_{j=1}^n c_j X_j → 0 a.e. [HINT: Truncate X_n at n and proceed as in Theorem 5.4.2.]
5. Let X_n take the values ±n^θ with probability 1/2 each. If 0 ≤ θ < 1/2, then S_n/n → 0 a.e. What if θ ≥ 1/2? [HINT: To answer the question, use Theorem 5.2.3 or Exercise 12 of Sec. 5.2; an alternative method is to consider the characteristic function of S_n/n (see Chapter 6).]
4. Both Theorem 5.4.1 and its complement in Exercise 2 above are "best possible" in the following sense. Let {a_n} … Then there exists a sequence of independent and identically distributed r.v.'s {X_n} such that E(X_n) = 0, … = ∞, by letting each X_n take two or three values only according …
3. Let {X_n} be independent and identically distributed r.v.'s such that E(|X_1|^p) < ∞ for some p, 0 < p < 2; in case p > 1, we assume also that E(X_1) = 0. Then S_n/n^{(1/p)+ε} → 0 a.e. for every ε > 0. For p = 1 the result is weaker than Theorem 5.4.2.
*2. There is a complement to Theorem 5.4.1 as follows. Let {a_n} and …
*1. If E(X_1^+) = +∞ and E(X_1^−) < ∞, then S_n/n → +∞ a.e.
*10. If Σ_n ±X_n converges a.e. for all choices of ±1, where the X_n's are arbitrary r.v.'s, then Σ_n X_n^2 converges a.e. [HINT: Consider Σ_n r_n(t) X_n(ω), where the r_n's are coin-tossing r.v.'s, and apply Fubini's theorem to the space of (t, ω).]
*9. Let {X_n} be independent and identically distributed, taking the values 0 and 2 with probability 1/2 each; then Σ_{n=1}^∞ X_n/3^n converges a.e. Prove that the limit has the Cantor d.f. discussed in Sec. 1.3. Do Exercise 11 in that section again; it is easier now.
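A simulation sketch of this random series (my own illustration, assuming NumPy): the partial sums have mean 1/2 and variance 1/8, and never land in the removed middle third (1/3, 2/3), as one expects of the Cantor distribution.

# Truncate the series at 40 terms; digits X_n in {0, 2}, weights 3^-n.
import numpy as np

rng = np.random.default_rng(6)
digits = 2 * rng.integers(0, 2, size=(100_000, 40))
weights = 3.0 ** -np.arange(1, 41)
s = digits @ weights
print(s.mean(), s.var())                 # ~0.5 and ~0.125
print(np.any((s > 1/3) & (s < 2/3)))     # False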
8. Let {X_n}, where n = 0, ±1, ±2, …, be independent and identically distributed according to the normal distribution …
7. For arbitrary {X_n}, if Σ_n E(|X_n|) < ∞, then Σ_n X_n converges absolutely a.e.
*6. The following analogue of the inequalities of Kolmogorov and Ottaviani is due to P. Lévy. Let S_n be the sum of n independent r.v.'s and S_n^0 = S_n − m_0(S_n), where m_0(S_n) is a median of S_n. Then we have 2 P{max_{1≤j≤n} |S_j^0| > ε} ≤ 3 P{|S_n^0| > ε/2}. [HINT: Try "4" in place of "3" on the right.]
*5. But neither Theorem 5.3.2 nor the alternative indicated in the preceding exercise is necessary; what we need is merely the following result, which is an easy consequence of a general theorem in Chapter 7. Let {X_n} be a sequence of independent and uniformly bounded r.v.'s with σ^2(S_n) …
4. Let {X_n, X_n', n ≥ 1} be independent r.v.'s such that X_n and X_n' have the same distribution. Suppose further that all these r.v.'s are bounded by the same constant A. Then Σ_n (X_n − X_n') converges a.e. if and only if Σ_n σ^2(X_n) < ∞. Use Exercise 3 to prove this without recourse to Theorem 5.3.3, and so finish the …
3. Theorem 5.3.2 has the following companion, which is easier to prove. Under the joint hypotheses in Theorems 5.3.1 and 5.3.2, we have P{max_{1≤j≤n} |S_j| ≤ ε} ≤ (A + ε)^2/σ^2(S_n).