Probability: An Introduction, 2nd Edition, by Geoffrey Grimmett and Dominic Welsh - Solutions
12. Draw the regions in question in the \((x, y)\)-plane. It is useful to prove that \(R^2 = X^2 + Y^2\) and \(\Theta = \tan^{-1}(Y/X)\) are independent, having exponential and uniform distributions, respectively. (a) \(1 - \exp(-\tfrac{1}{2}a^2/\sigma^2)\). (b) \(\alpha/(2\pi)\).
9. \(f_Y(y) = \frac{1}{4}(3y + 1)e^{-y}\) for \(0 < y < \infty\).
11. Use Theorem 6.62 with \(g(x, y) = \sqrt{x^2 + y^2}\) and change to polar coordinates. The variance equals \(\sigma^2(2 - \tfrac{1}{2}\pi)\).
6. If you can do Problem 6.9.4 then you should be able to do this one. \(P(U \le x, V \le y) = F(y)^n - [F(y) - F(x)]^n\) for \(x < y\).
5. Show that G(y) = P(Y > y) satisfies G(x + y) = G(x)G(y), and solve this equation. The corresponding question for integer-valued random variables appears at Problem 2.6.5.
4. min{X, Y} > u if and only if X > u and Y > u.
1. For the first part, find the joint density function of \(X\) and \(XY\) by the method of change of variables, and then find the marginal density function of \(XY\).
2. No.
3. The region \(\{(x, y, z) : \sqrt{4xz} < y \le 1,\ 0 \le x, z \le 1\}\) has volume \(\frac{5}{36} + \frac{1}{6}\log 2\).
14. Find P(Y ≤ y) for y ∈ R.
2. This fails to occur if and only if the disjoint union \(A_0 \cup A_1 \cup \cdots \cup A_n\) occurs, where \(A_0\) is the event that there is no break in \((0, \frac{1}{2}]\), and \(A_k\) is the event of no break in \((X_k, X_k + \frac{1}{2}]\) for \(k \ge 1\) (remember the permanent break at 1). Now, \(P(A_0) = (\frac{1}{2})^n\), and for \(k \ge 1\), \(P(A_k) = \int \cdots\)
13. Let \(X_k\) be the position of the \(k\)th break (in no special order). The pieces form a polygon if no piece is longer than the sum of the other lengths, which is equivalent to each piece having length less than \(\frac{1}{2}\).
12. Assume that the centre is uniform on the rectangle \([0, a] \times [0, b]\), and that the acute angle \(\theta\) between the needle and a line of the first grid is uniform on \([0, \frac{1}{2}\pi]\). There is no intersection if and only if the centre lies within an inner rectangle of size \((a - \ell\cos\theta) \times (b - \ell\sin\theta)\).
11. This distance has distribution function \((2/\pi)\tan^{-1} x\) for \(0 \le x < \infty\).
10. \(f_Y(y) = \dfrac{3}{(1 - y)^2}\exp\left(-\dfrac{y + 2}{1 - y}\right)\) for \(-2 < y < 1\).
7. Integrate by parts. You are proving that \(E(X) = \int P(X > x)\,dx\), the continuous version of Problem 2.6.6.
8. Apply the conclusion of Problem 5.8.7 to \(Y = g(X)\), express the result as a double integral, and change the order of integration.
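The tail-sum identity in hint 7 is easy to check numerically. The sketch below is an illustration only: the exponential distribution with rate `lam` is an assumed example, not taken from the book's problem.

```python
# Numeric check of E(X) = ∫_0^∞ P(X > x) dx for a non-negative random
# variable, using Exp(lam) as an illustrative example (true mean 1/lam).
import math

def tail_integral(survival, upper, steps=200000):
    """Approximate ∫_0^upper survival(x) dx by the midpoint rule."""
    h = upper / steps
    return sum(survival((i + 0.5) * h) for i in range(steps)) * h

lam = 2.5
survival = lambda x: math.exp(-lam * x)   # P(X > x) for Exp(lam)
approx_mean = tail_integral(survival, upper=20.0)
print(approx_mean)   # close to the true mean 1/lam = 0.4
```

The midpoint rule with a truncated upper limit suffices here because the exponential tail decays so fast that the truncation error is negligible.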
6. Note that \(x \le F(y)\) if and only if \(F^{-1}(x) \le y\), whenever \(0 < F(y) < 1\).
11. This is essentially a reprise of Problem 4.5.8.
3. \(n \ge 4\).
4. \(f_Y(y) = \sqrt{2/\pi}\,\exp(-\tfrac{1}{2}y^2)\) for \(y > 0\). \(\sqrt{2/\pi}\) and \(1 - (2/\pi)\).
5. Let \(F^{-1}(y) = \sup\{x : F(x) = y\}\). Find \(P(F(X) \le y)\).
10. \(P(A \text{ wins}) = a/(a + b - ab)\). The mean number of shots is \((2 - a)/(a + b - ab)\).
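The first answer can be recovered by summing a geometric series. The sketch below assumes the usual alternating-shots setup (A fires first with hit probability \(a\), then B with hit probability \(b\), until someone hits); that setup is inferred from the stated answer, not quoted from the problem.

```python
# Check P(A wins) = a/(a + b - ab) by summing the geometric series:
# A wins on round k (k = 0, 1, ...) if both players miss k full rounds
# and A then hits, so P = sum_k [(1-a)(1-b)]^k * a.
a, b = 0.3, 0.45

miss_round = (1 - a) * (1 - b)
p_a_wins = sum(miss_round ** k * a for k in range(200))  # converges fast

closed_form = a / (a + b - a * b)
print(p_a_wins, closed_form)   # the two agree
```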
9. This is an alternative derivation of the result of Problem 3.6.12.
… \((\tfrac{1}{2})^{n+1} p^{n-r}(2 - p)^{r+1}\).
6. For the third part, find the real part of \(G_X(\theta)\), where \(\theta\) is a primitive complex root of unity.
8. \(G_X(s) = G_N(\tfrac{1}{2} + \tfrac{1}{2}s)\), giving by independence that \(G = G_N\) satisfies the functional equation \(G(s) = G(\tfrac{1}{2} + \tfrac{1}{2}s)^2\). Iterate this to obtain \(G(s) = G\bigl(1 + (s - 1)/2^n\bigr)^{2^n}\).
3. \(\frac{9}{19}\), \(\frac{6}{19}\), \(\frac{4}{19}\). The mean number of throws is 3.
4. \([q/(1 - ps)]^N\). The variance is \(Np(1 - p)^{-2}\).
5. The first part of this problem may be done by way of Theorem 4.36, with \(N + 1\) having a geometric distribution and the \(X_i\) having the Bernoulli distribution. Alternatively, use the methods of …
1. Note that \(P(X = k) = u_{k-1} - u_k\).
2. \((\tfrac{1}{6})^7\,\frac{13!}{6!\,7!} - 49\).
6. To locate the extremal probabilities, find the number of ways in which the various possible outcomes can occur. For example, \(P(X = x)\) is maximized at \(x = 7\). To verify (in)dependence, it is convenient to use a simple fact of the type \(P(X = 3, Y = 0) = 0\).
16. (a) The means are 7 and 0, and both variances equal \(\frac{35}{6}\).
15. \(\operatorname{var}(U_n) = (n - 1)pq - (3n - 5)(pq)^2\).
14. Condition on the value of N. X has the Poisson distribution with parameter λp.
13. \(c[1 - (1 - c^{-1})^n]\), by using indicator functions.
12. In calculating the mean, remember that the expectation operator \(E\) is linear. The answer here is \(c\bigl(1 + \frac{1}{2} + \frac{1}{3} + \cdots + \frac{1}{c}\bigr)\), a much more elegant solution than that proposed for Problem 2.6.7.
11. Adapt the hint for Problem 3.6.10.
10. Let \(Z_i\) be the indicator function of the event that the \(i\)th box is empty. The total number of empty boxes is \(S = Z_1 + Z_2 + \cdots + Z_M\). Also, \(E(Z_i) = (M - 1)^N/M^N\) and \(E(S) = M\,E(Z_1)\).
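The indicator-function answer can be verified exactly for small numbers. The sketch below assumes the standard setup suggested by the hint: \(N\) balls dropped independently and uniformly into \(M\) boxes, and \(S\) the number of empty boxes.

```python
# Exact check of E(S) = M * (M-1)^N / M^N by enumerating all M**N
# equally likely outcomes (feasible only for tiny M, N).
from itertools import product

M, N = 3, 4
total_empty = 0
for outcome in product(range(M), repeat=N):      # box chosen by each ball
    total_empty += M - len(set(outcome))         # boxes receiving no ball
mean_empty = total_empty / M ** N

formula = M * ((M - 1) / M) ** N                 # M * (M-1)^N / M^N
print(mean_empty, formula)   # both equal 16/27 ≈ 0.5926
```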
7. Use Theorem 2.42 with \(B_i = \{N = i - 1\}\).
9. (a) \(\frac{1}{2}\), (b) \(\frac{1}{6}(3\sqrt{5} - 1)\), (c) \(\frac{5}{6}\).
6. Let \(1_k\) be the indicator function of the event that, when there are \(2k\) ends, a new hoop is created at the next step. Then \(E(1_k) = k\big/\binom{2k}{2} = 1/(2k - 1)\). The mean final number of hoops is \(\sum_{k=1}^n E(1_k)\).
5. P(U > k) = P(X > k)P(Y > k).
4. \(P(U_n = k) = P(U_n \ge k) - P(U_n \ge k + 1)\), and \(P(U_n \ge k) = 1 - \bigl(\frac{k - 1}{N}\bigr)^n\).
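The tail formula can be checked by brute force. The sketch assumes \(U_n\) is the maximum of \(n\) independent draws, each uniform on \(\{1, \ldots, N\}\), which is the reading consistent with the stated formula.

```python
# Exact check of P(U_n >= k) = 1 - ((k-1)/N)**n for the maximum of n
# independent uniform draws on {1, ..., N}, by full enumeration.
from itertools import product

N, n = 6, 3
outcomes = list(product(range(1, N + 1), repeat=n))   # all N**n draws

for k in range(1, N + 1):
    exact = sum(max(o) >= k for o in outcomes) / N ** n
    formula = 1 - ((k - 1) / N) ** n
    assert abs(exact - formula) < 1e-12
print("tail formula verified for k = 1..N")
```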
1. Use the result of Exercise 1.35, with Theorem 3.27.
2. \(a = b = \frac{1}{2}\). No.
9. \((1 - p^n)/[p^n(1 - p)]\).
8. This is sometimes called Banach's matchbox problem. First, condition on which pocket is first emptied. You may find the hint more comprehensible if you note that \(2(n - h)p_h = (2n - h)p_{h+1}\). The mean equals \((2n + 1)p_0 - 1\).
7. This generalizes the result of Problem 2.6.6.
6. The summation here is \(\sum_{k=0}^{\infty}\sum_{i=k+1}^{\infty} P(X = i)\). Change the order of summation. For the second part, use the result of Exercise 1.20.
5. For the last part, show that G(n) = P(X > n) satisfies G(m + n) = G(m)G(n), and solve this relation.
4. \(\alpha < -1\) and \(c = 1/\zeta(-\alpha)\), where \(\zeta(p) = \sum_k k^{-p}\) is the Riemann zeta function.
2. Use Theorem 2.42 with \(X\) and \(B_i\) chosen appropriately. The answer is \(m(r) = r/p\).
3. \(E(X^2) = \sum x^2 P(X = x)\), the sum of non-negative terms.
19. Show n = 6.
16. Conditional probabilities again. The answer is \(\frac{1}{4}(2e^{-1} + e^{-2} + e^{-4})\).
18. \(\bigcup_{i=1}^n A_i \to \bigcup_{i=1}^{\infty} A_i\) as \(n \to \infty\).
15. Use the result of Problem 1.11.14(a).
14. (a) Induction. (b) Let Ai be the event that the i th key is hung on its own hook.
13. Use the Partition Theorem 1.48 to obtain the difference equations. Either iterate these directly to solve them, or set up a matrix recurrence relation, and iterate this.
12. To do this rigorously is quite complicated. You need to show that the proportion \(\frac{1}{10}\) is correct for any single one of the numbers 0, 1, 2, . . . , 9.
10. \(1 - (1 - p)(1 - p^2)^2\) and \(1 - (1 - p)(1 - p^2)^2 - p + p[1 - (1 - p)^2]^2\).
9. If \(X\) and \(Y\) are the numbers of heads obtained, \(P(X = Y) = \sum_k P(X = k)P(Y = k) = \sum_k P(X = k)P(Y = n - k) = P(X + Y = n)\).
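The chain of equalities can be confirmed numerically. The sketch takes \(X, Y\) independent Binomial\((n, \frac{1}{2})\) (fair coins, which the symmetry step \(P(Y = k) = P(Y = n - k)\) requires).

```python
# Check P(X = Y) = P(X + Y = n) for X, Y independent Binomial(n, 1/2):
# sum_k C(n,k)^2 / 4^n should equal C(2n,n) / 4^n (Vandermonde identity).
from math import comb

n = 10
p_equal = sum(comb(n, k) ** 2 for k in range(n + 1)) / 4 ** n
p_sum_n = comb(2 * n, n) / 4 ** n     # P(X + Y = n), since X + Y ~ Bin(2n, 1/2)
print(p_equal, p_sum_n)   # both equal C(20,10)/4^10
```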
1. Expand \((1 + x)^n + (1 - x)^n\).
2. No.
6. \(\frac{79}{140}\) and \(\frac{40}{61}\).
7. \(\frac{11}{50}\).
8. \(\sqrt{3/(4\pi n)}\,\bigl(\frac{27}{32}\bigr)^n\).
18. Let \((X_n : n \ge 0)\) be a simple, symmetric random walk on the integers \(\{\ldots, -1, 0, 1, \ldots\}\), with \(X_0 = 0\) and \(P(X_{n+1} = i \pm 1 \mid X_n = i) = \frac{1}{2}\). For each integer \(a \ge 1\), let \(T_a = \inf\{n \ge 0 : X_n = a\}\). Show that \(T_a\) is a stopping time. Define a random variable \(Y_n\) by the rule \(Y_n = X_n\) if \(n < T_a\), and \(Y_n = 2a - X_n\) if \(n \ge T_a\). Show that \((Y_n : n \ge 0)\) is also a simple, symmetric random walk. Let \(M_n = \max\{X_i : 0 \le i \le n\}\). Explain why \(\{M_n \ge a\} = \{T_a \le n\}\) for \(a \ge 1\). By using the process \((Y_n : n \ge 0)\) constructed above, show that, for \(a \ge 1\), \(P(M_n \ge a, X_n \le a - 1) = P(X_n \ge a + 1)\), and thus \(P(M_n \ge a) = P(X_n \ge a + 1) + P(X_n \ge a)\).
17. A frog inhabits a pond with an infinite number of lily pads, numbered 1, 2, 3, . . . . She hops from pad to pad in the following manner: if she happens to be on pad \(i\) at a given time, she hops to one of the pads \(1, 2, \ldots, i, i + 1\) with equal probability.
(a) Find the equilibrium distribution of the corresponding Markov chain.
(b) Suppose the frog starts on pad \(k\) and stops when she returns to it. Show that the expected number of times the frog hops is \(e(k - 1)!\), where \(e = 2.718\ldots\). What is the expected number of times she will visit lily pad \(k + 1\)? (Cambridge 2010)
16. An erratic bishop starts at the bottom left of a chess board and performs random moves. At each stage, she picks one of the available legal moves with equal probability, independently of earlier moves. Let \(X_n\) be her position after \(n\) moves. Show that \((X_n : n \ge 0)\) is a reversible Markov chain, and find its invariant distribution. What is the mean number of moves before she returns to her starting square?
13. Consider a collection of \(N\) books arranged in a line along a bookshelf. At successive units of time, a book is selected randomly from the collection. After the book has been consulted, it is replaced on the shelf one position to the left of its original position, with the book in that position moved one place to the right. Number the positions on the shelf from 1 (at the left) to \(N\) (at the right). Write \(X_n\) for the position of the red book after \(n\) units of time. Show that \(X\) is a Markov chain, with non-zero transition probabilities given by \(p_{i,i-1} = p\) for \(i = 2, 3, \ldots, N\), and \(p_{i,i+1} = \frac{1 - p}{N - 1}\) for \(i = 1, 2, \ldots, N - 1\). If \((\pi_i : i = 1, 2, \ldots, N)\) is the invariant distribution of the Markov chain \(X\), show that \(\pi_2 = \frac{1 - p}{p(N - 1)}\pi_1\) and \(\pi_3 = \frac{1 - p}{p(N - 1)}\pi_2\). Deduce the invariant distribution. (Oxford 2005)
*14. Consider a Markov chain with state space \(S = \{0, 1, 2, \ldots\}\) and transition matrix given by \(p_{i,j} = qp^{j-i+1}\) for \(i \ge 1\) and \(j \ge i - 1\), \(p_{i,j} = qp^j\) for \(i = 0\) and \(j \ge 0\), and \(p_{i,j} = 0\) otherwise, where \(0 < p = 1 - q < 1\). For each \(p \in (0, 1)\), … Show that the \(H_k\) satisfy a second-order difference equation, and hence find \(H_k\). (Cambridge 2009)
12. Consider a pack of cards labelled 1, 2, . . . , 52. We repeatedly take the top card and insert it uniformly at random in one of the 52 possible places, that is, on the top or on the bottom or in one of the 50 places inside the pack. How long on average will it take for the bottom card to reach the top?
Let \(p_n\) denote the probability that after \(n\) iterations, the cards are found to be in increasing order from the top. Show that, irrespective of the initial ordering, \(p_n\) converges as \(n \to \infty\), and determine the limit \(p\). You should give precise statements of any general results to which you appeal.
Show that, at least until the bottom card reaches the top, the ordering of the cards inserted beneath it is uniformly random. Hence or otherwise show that, for all \(n\), \(|p_n - p| \le \frac{52(1 + \log 51)}{n}\). (Cambridge 2003)
11. Let \(i\) be a state of an irreducible, positive recurrent Markov chain \(X\), and let \(V_n\) be the number of visits to \(i\) between times 1 and \(n\). Let \(\mu = E_i(T_i)\) and \(\sigma^2 = E_i([T_i - \mu]^2)\) be the mean and variance of the first return time to the starting state \(i\), and assume \(0 < \sigma^2 < \infty\). Suppose \(X_0 = i\). …
10. Markov chain Monte Carlo. We wish to simulate a discrete random variable \(Z\) with mass function satisfying \(P(Z = i) \propto \pi_i\), for \(i \in S\) and \(S\) countable. Let \(X\) be an irreducible Markov chain with state space \(S\) and transition matrix \(P = (p_{i,j})\). Let \(Q = (q_{i,j})\) be given by … Show that \(Q\) is the transition matrix of a Markov chain which is reversible in equilibrium, and has invariant distribution equal to the mass function of \(Z\).
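The definition of \(Q\) is cut off in the source. One standard construction with the stated property is the Metropolis-Hastings rule — an assumption here, not necessarily the book's choice: \(q_{i,j} = p_{i,j}\min\{1, \pi_j p_{j,i}/(\pi_i p_{i,j})\}\) for \(j \ne i\), with \(q_{i,i}\) chosen so rows sum to 1. Detailed balance is then easy to verify numerically:

```python
# Numeric check that the Metropolis-Hastings chain Q (an assumed standard
# construction, since the source truncates the definition) satisfies
# detailed balance pi_i q_{i,j} = pi_j q_{j,i} on a small random example.
import random

n = 5
random.seed(1)
pi = [random.random() for _ in range(n)]
s = sum(pi); pi = [x / s for x in pi]            # target mass function

# an arbitrary irreducible proposal chain P with strictly positive entries
P = [[random.random() for _ in range(n)] for _ in range(n)]
for row in P:
    t = sum(row)
    row[:] = [x / t for x in row]

Q = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(n):
        if i != j:
            Q[i][j] = P[i][j] * min(1.0, pi[j] * P[j][i] / (pi[i] * P[i][j]))
    Q[i][i] = 1.0 - sum(Q[i])                    # holding probability

for i in range(n):                               # detailed balance check
    for j in range(n):
        assert abs(pi[i] * Q[i][j] - pi[j] * Q[j][i]) < 1e-12
print("detailed balance holds")
```

Detailed balance with respect to \(\pi\) is exactly the reversibility-in-equilibrium condition the problem asks about, and it implies \(\pi\) is invariant for \(Q\).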
9. A particle performs a random walk about the eight vertices of a cube. Find
(a) the mean number of steps before it returns to its starting vertex \(S\),
(b) the mean number of visits to the opposite vertex \(T\) to \(S\) before its first return to \(S\),
(c) the mean number of steps before its first visit to \(T\).
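Parts (a) and (c) can be computed mechanically. The sketch below is one possible approach, not the book's solution: lump the walk by distance \(d \in \{0,1,2,3\}\) from \(S\) (so \(T\) is at \(d = 3\)), use \(1/\pi_S = 8\) for (a) since the stationary distribution is uniform over the 8 vertices, and solve the first-step equations for (c).

```python
# First-step equations for the expected hitting time h(d) of d = 3:
#   h(3) = 0,  h(d) = 1 + (d/3) h(d-1) + ((3-d)/3) h(d+1),
# written as a 3x3 linear system in h(0), h(1), h(2) and solved exactly
# with Fractions by Gaussian elimination.
from fractions import Fraction as F

A = [[F(1), F(-1), F(0)],        #  h0 -  h1            = 1
     [F(-1, 3), F(1), F(-2, 3)], # -h0/3 + h1 - 2h2/3   = 1
     [F(0), F(-2, 3), F(1)]]     #       -2h1/3 + h2    = 1
b = [F(1), F(1), F(1)]

for col in range(3):             # Gauss-Jordan, no pivoting needed here
    inv = 1 / A[col][col]
    A[col] = [x * inv for x in A[col]]; b[col] *= inv
    for r in range(3):
        if r != col and A[r][col] != 0:
            f = A[r][col]
            A[r] = [x - f * y for x, y in zip(A[r], A[col])]
            b[r] -= f * b[col]

mean_return = F(8)               # (a): 1/pi_S, with pi uniform on 8 vertices
mean_hit_T = b[0]                # (c): h(0)
print(mean_return, mean_hit_T)   # 8 and 10
```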
8. A special die is thrown repeatedly. Its special property is that, on each throw, the outcome is equally likely to be any of the five numbers that are different from the immediately previous number. If the first score is 1, find the probability that the (n + 1)th score is 1.
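Writing \(u_n\) for the probability that the \((n+1)\)th score is 1, conditioning on the \(n\)th score gives \(u_{n+1} = (1 - u_n)/5\) with \(u_0 = 1\); solving this (our derivation, not quoted from the text) gives \(u_n = \frac{1}{6} + \frac{5}{6}(-\frac{1}{5})^n\). A quick numeric check:

```python
# Iterate u_{n+1} = (1 - u_n)/5 (next score is 1 only if the previous one
# was not, each of the 5 alternatives being equally likely) and compare
# with the closed form 1/6 + (5/6)(-1/5)^n at every step.
u = 1.0                                  # u_0: the first score is 1
for n in range(1, 21):
    u = (1.0 - u) / 5.0
    closed = 1/6 + (5/6) * (-1/5) ** n
    assert abs(u - closed) < 1e-12
print(u)   # ≈ 1/6 for large n
```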
7. Let \(X\) be an irreducible, positive recurrent, aperiodic Markov chain with state space \(S\). Show that \(X\) is reversible in equilibrium if and only if \(p_{i_1,i_2} p_{i_2,i_3} \cdots p_{i_{n-1},i_n} p_{i_n,i_1} = p_{i_1,i_n} p_{i_n,i_{n-1}} \cdots p_{i_2,i_1}\), for all finite sequences \(i_1, i_2, \ldots, i_n \in S\).
6. Each morning, a student takes one of three books (labelled 1, 2, and 3) from her shelf. She chooses book \(i\) with probability \(\alpha_i\), and choices on successive days are independent. In the evening, she replaces the book at the left-hand end of the shelf. If \(p_n\) denotes the probability that on day \(n\) …
5. At each time \(n\), a random number \(S_n\) of students enter the lecture room, where \(S_0, S_1, S_2, \ldots\) are independent and Poisson distributed with parameter \(\lambda\). Each student remains in the room for a geometrically distributed time with parameter \(p\), different times being independent. Let \(X_n\) be the number of students in the room at time \(n\). …
4. Consider a Markov chain on the set \(S = \{0, 1, 2, \ldots\}\) with transition probabilities \(p_{i,i+1} = a_i\), \(p_{i,0} = 1 - a_i\), where \((a_i : i \ge 0)\) is a sequence of constants satisfying \(0 < a_i < 1\) for all \(i\). Let \(b_0 = 1\) and \(b_i = a_0 a_1 \cdots a_{i-1}\) for \(i \ge 1\). Show that the chain is
(a) recurrent if and only if \(b_i \to 0\) as \(i \to \infty\),
(b) positive recurrent if and only if \(\sum_i b_i < \infty\),
and write down the invariant distribution when the last condition holds.
3. We distribute \(N\) black balls and \(N\) white balls in two urns in such a way that each contains \(N\) balls. At each epoch of time, one ball is selected at random from each urn, and these two balls are interchanged. Let \(X_n\) be the number of black balls in the first urn after time \(n\). Write down the transition probabilities of the chain \(X\). …
1. A transition matrix is called doubly stochastic if its column sums equal 1, that is, if \(\sum_{i \in S} p_{i,j} = 1\) for \(j \in S\). Suppose an irreducible chain with \(N\) \((< \infty)\) states has a doubly stochastic transition matrix. Find its invariant distribution. Deduce that all states are positive recurrent and that, if the chain is aperiodic, then \(p_{i,j}(n) \to 1/N\) as \(n \to \infty\).
2. Let \(X\) be a discrete-time Markov chain with state space \(S = \{1, 2\}\) and transition matrix \(P = \begin{pmatrix} 1 - \alpha & \alpha \\ \beta & 1 - \beta \end{pmatrix}\). Classify the states of the chain. Suppose that \(0 < \alpha\beta < 1\). Find the \(n\)-step transition probabilities and show directly that they converge to the unique invariant distribution. For what values of \(\alpha\) and \(\beta\) is the chain reversible in equilibrium?
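For the doubly stochastic case, the invariant distribution is uniform: \((\pi P)_j = \sum_i \frac{1}{N} p_{i,j} = \frac{1}{N}\). A small numeric illustration, with an arbitrary doubly stochastic matrix chosen for the example:

```python
# The uniform distribution is invariant for any doubly stochastic P:
# each component of pi P is (1/N) times a column sum, which is 1.
N = 3
P = [[0.2, 0.5, 0.3],
     [0.5, 0.1, 0.4],
     [0.3, 0.4, 0.3]]            # rows and columns all sum to 1

pi = [1.0 / N] * N
pi_next = [sum(pi[i] * P[i][j] for i in range(N)) for j in range(N)]
print(pi_next)   # each entry ≈ 1/3 again
```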
Theorem 12.123 Random walk on the finite connected graph \(G = (V, E)\) is an irreducible Markov chain with unique invariant distribution \(\pi_v = \frac{d(v)}{2|E|}\) for \(v \in V\). The chain is reversible in equilibrium.
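The degree formula is easy to check directly: \((\pi P)_v = \sum_{u \sim v} \frac{\pi_u}{d(u)} = \sum_{u \sim v} \frac{1}{2|E|} = \frac{d(v)}{2|E|} = \pi_v\). The graph below is an arbitrary small example, not one from the book.

```python
# Verify pi_v = d(v)/(2|E|) is invariant for random walk on a small graph
# (a path with a triangle attached), by checking pi P = pi directly.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (2, 4)]   # |E| = 5
n = 5
adj = [[] for _ in range(n)]
for u, v in edges:
    adj[u].append(v); adj[v].append(u)

deg = [len(a) for a in adj]
pi_formula = [d / (2 * len(edges)) for d in deg]

# (pi P)_v = sum over neighbours u of pi_u * (1/deg(u))
pi_next = [sum(pi_formula[u] / deg[u] for u in adj[v]) for v in range(n)]
for a, b in zip(pi_formula, pi_next):
    assert abs(a - b) < 1e-12
print(pi_formula)
```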
Exercise 12.97 Consider the symmetric random walk on the line \(\mathbb{Z}\). Show that any invariant distribution \(\pi\) satisfies \(\pi_n = \frac{1}{2}(\pi_{n-1} + \pi_{n+1})\), and deduce that the walk is null recurrent.
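The deduction can be sketched as follows (a standard argument, supplied here for context, not quoted from the book). The relation says the second difference of \(\pi\) vanishes, so the increments are constant:

```latex
\pi_n = \tfrac{1}{2}(\pi_{n-1} + \pi_{n+1})
\;\Longrightarrow\;
\pi_{n+1} - \pi_n = \pi_n - \pi_{n-1} = c \quad \text{for all } n \in \mathbb{Z}.
```

If \(c \neq 0\) then \(\pi_n\) becomes negative for some \(n\), which is impossible; hence \(c = 0\) and \(\pi\) is constant. A constant sequence on \(\mathbb{Z}\) sums to 0 or \(\infty\), so no invariant distribution exists. Since the symmetric walk on \(\mathbb{Z}\) is recurrent but a positive recurrent chain would possess an invariant distribution, the walk is null recurrent.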
Exercise 12.96 A particle starts at \(A\) and executes a symmetric random walk on the graph of Figure 12.4. At each step it moves to a neighbour of the current vertex chosen uniformly at random. Find the invariant distribution of the chain. Using the remark after Proposition 12.86 or otherwise, find the expected number of visits to \(B\) before the particle returns to \(A\).