Probability: An Introduction, 2nd Edition, Geoffrey Grimmett and Dominic Welsh - Solutions
Definition 12.82 Let X be a Markov chain with transition matrix P. The vector $\pi = (\pi_i : i \in S)$ is called an invariant distribution of the chain if:
(a) $\pi_i \ge 0$ for all $i \in S$, and $\sum_{i \in S} \pi_i = 1$,
(b) $\pi = \pi P$.
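For a finite state space these conditions can be checked numerically. A minimal Python sketch (the function name is my own, not from the book): approximate $\pi$ by power iteration, assuming the chain is irreducible and aperiodic so that the iterates converge to the invariant distribution.

```python
# Hypothetical helper (not from the book): approximate an invariant
# distribution pi satisfying pi = pi P by power iteration, i.e. by
# repeatedly applying pi <- pi P starting from the uniform vector.
# This converges when the chain is irreducible and aperiodic.
def invariant_distribution(P, iters=500):
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Two-state example; here the invariant distribution is (2/3, 1/3).
P = [[0.7, 0.3],
     [0.6, 0.4]]
pi = invariant_distribution(P)
```

Conditions (a) and (b) can then be verified directly: the entries are non-negative, sum to 1 (each iteration preserves the total mass), and satisfy $\pi = \pi P$ up to rounding.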
Exercise 12.81 Let i be an aperiodic state of a Markov chain. Show that there exists $N \ge 1$ such that $p_{i,i}(n) > 0$ for all $n \ge N$.
Exercise 12.80 Let $0 < p < 1$. Classify the states of the Markov chains with transition matrices
$$\begin{pmatrix} 0 & p & 0 & 1-p \\ 1-p & 0 & p & 0 \\ 0 & 1-p & 0 & p \\ p & 0 & 1-p & 0 \end{pmatrix}, \qquad \begin{pmatrix} 1-2p & 2p & 0 \\ p & 1-2p & p \\ 0 & 2p & 1-2p \end{pmatrix}.$$
Exercise 12.79 Let X be an irreducible Markov chain with period d. Show that $Y_n = X_{nd}$ defines an aperiodic Markov chain.
(a) The mean recurrence time $\mu_i$ of the state i is defined by $\mu_i = E_i(T_i) = \sum_{n=1}^{\infty} n f_{i,i}(n)$ if i is recurrent, and $\mu_i = \infty$ if i is transient.
(b) If i is recurrent, we call it null if $\mu_i = \infty$, and positive (or non-null) if $\mu_i < \infty$.
(c) The period $d_i$ of the state i is given by $d_i = \gcd\{n : p_{i,i}(n) > 0\}$. The state i is called aperiodic if $d_i = 1$, and periodic if $d_i > 1$.
(d) State i is called ergodic if it is aperiodic and positive recurrent.
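The period in part (c) can be illustrated computationally: scan the powers of P, record the times n with $p_{i,i}(n) > 0$, and take their gcd. This sketch truncates the scan at some nmax (my own simplification; the helper name is also my own).

```python
from math import gcd

# Hypothetical helper: compute a truncated version of the period
# d_i = gcd{ n : p_{i,i}(n) > 0 } by scanning the powers P^1, ..., P^nmax.
def period(P, i, nmax=50):
    m = len(P)
    Q = P                                  # Q holds the current power P^n
    d = 0
    for n in range(1, nmax + 1):
        if Q[i][i] > 0:
            d = gcd(d, n)                  # gcd(0, n) == n starts things off
        Q = [[sum(Q[a][k] * P[k][b] for k in range(m)) for b in range(m)]
             for a in range(m)]            # advance Q to P^(n+1)
    return d

# Deterministic 2-cycle: state 0 is revisited only at even times, so d_0 = 2;
# the lazy chain can return at every time, so it is aperiodic (d_0 = 1).
cycle = [[0.0, 1.0], [1.0, 0.0]]
lazy = [[0.5, 0.5], [0.5, 0.5]]
```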
(a) Let $H_A$ be the hitting time of the set A. Show that $T = H_A - 1$ is not generally a stopping time.
(b) Let $L_A$ be the time of the last visit of a Markov chain to the set A, with the convention that $L_A = \infty$ if infinitely many visits are made. Show that $L_A$ is not generally a stopping time.
Exercise 12.62 Consider Exercise 12.61 with the difference that $p_{i,i+1} = p_{i,i-1}\left(\frac{i+1}{i}\right)^{\alpha}$ for $i \ge 1$, where $\alpha > 0$. Find the probability $P_0(X_n \ge 1 \text{ for all } n \ge 1)$ in terms of $\alpha$.
Exercise 12.61 Let X be a Markov chain on the non-negative integers $\{0, 1, 2, \ldots\}$ with transition probabilities satisfying $p_{0,1} = 1$, $p_{i,i+1} + p_{i,i-1} = 1$, and $p_{i,i+1} = p_{i,i-1}\left(\frac{i+1}{i}\right)^{2}$ for $i \ge 1$. Show that $P_0(X_n \ge 1 \text{ for all } n \ge 1) = 6/\pi^2$. You may use the fact that $\sum_{k=1}^{\infty} k^{-2} = \frac{1}{6}\pi^2$.
Exercise 12.52 In a variant of Exercise 12.51, the walker moves two steps rightwards with probability p, and otherwise one step leftwards. Show that the walk is recurrent if and only if $p = \frac{1}{3}$.
Exercise 12.51 Consider the asymmetric random walk on the line $\mathbb{Z}$ that moves one step rightwards with probability p, or one step leftwards with probability $q$ ($= 1 - p$). Show that the walk is recurrent if and only if $p = \frac{1}{2}$.
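A quick numerical illustration (the helper name and the truncation are my own): the first-return probabilities of this walk are $f_{2n} = \frac{1}{2n-1}\binom{2n}{n}(pq)^n$, and their total is the return probability $1 - |p - q|$, which equals 1 exactly when $p = \frac{1}{2}$.

```python
# Hypothetical sketch: partial sum of the first-return probabilities
# f_{2n} = C(2n, n) (pq)^n / (2n - 1) for the asymmetric simple walk.
# The full sum is 1 - |p - q|, so it equals 1 only for p = 1/2.
def return_probability(p, nmax=5000):
    q = 1.0 - p
    term = 2.0 * p * q                     # f_2 = 2pq
    total = term
    for n in range(1, nmax):
        # ratio f_{2(n+1)} / f_{2n} = 2(2n - 1) pq / (n + 1)
        term *= 2.0 * (2 * n - 1) / (n + 1) * p * q
        total += term
    return total
```

For $p = 0.7$ the sum converges quickly to $1 - |0.7 - 0.3| = 0.6$ (transient); for $p = \frac{1}{2}$ the partial sums creep towards 1, consistent with recurrence.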
Exercise 12.50 The infinite binary tree T is the tree-graph in which every vertex has exactly three neighbours. Show that a random walk on T is transient.
(a) Let j be a recurrent state of a Markov chain. Show that $\sum_n p_{i,j}(n) = \infty$ for all states i such that $i \to j$.
(b) Let j be a transient state of a Markov chain. Show that $\sum_n p_{i,j}(n) < \infty$ for all states i.
Exercise 12.42 A Markov chain X has an absorbing state s to which all other states lead. Show that all states except s are transient.
We saw earlier that simple random walk on the line is recurrent if and only if it is unbiased (see Theorem 10.12). The proof used generating functions, and the method may be extended to prove Theorem 12.31.
If a chain starts in state i, is it bound to return to i at some later time? Definition 12.30 A state i is called recurrent if $P_i(T_i < \infty) = 1$. A state is called transient if it is not recurrent. Here is a criterion for recurrence in terms of the transition matrix P and its powers (Theorem 12.31).
The first-passage time to state j is defined as $T_j = \min\{n \ge 1 : X_n = j\}$, and the first-passage probabilities are given by $f_{i,j}(n) = P_i(T_j = n)$.
12.4 Recurrence and transience. Let X be a homogeneous Markov chain with state space S and transition matrix P. For economy of notation, we write henceforth $P_i(A)$ for $P(A \mid X_0 = i)$, and similarly $E_i(Z)$ for the conditional mean $E(Z \mid X_0 = i)$.
Exercise 12.29 If the state space is finite, show that there must exist at least one closed communicating class. Give an example of a transition matrix with no such class.
Exercise 12.28 Find the communicating classes, and the closed communicating classes, when the transition matrix is
$$P = \begin{pmatrix} \frac12 & 0 & 0 & 0 & \frac12 \\ 0 & \frac12 & 0 & \frac12 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & \frac14 & \frac14 & \frac14 & \frac14 \\ \frac12 & 0 & 0 & 0 & \frac12 \end{pmatrix}.$$
It may be useful to draw a diagram.
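Classes in exercises like this can also be found mechanically. A sketch in Python (helper names are my own): compute reachability by transitive closure over the positive entries of P, then group mutually reachable states; a class is closed if no positive transition leaves it.

```python
# Hypothetical helpers: communicating classes from a transition matrix,
# using Floyd-Warshall-style transitive closure over positive entries.
def communicating_classes(P):
    n = len(P)
    reach = [[P[i][j] > 0 or i == j for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    classes, seen = [], set()
    for i in range(n):
        if i not in seen:
            cls = sorted(j for j in range(n) if reach[i][j] and reach[j][i])
            seen.update(cls)
            classes.append(cls)
    return classes

def is_closed(P, cls):
    # Closed: no positive transition from inside the class to outside it.
    inside = set(cls)
    return all(P[i][j] == 0 for i in cls
               for j in range(len(P)) if j not in inside)

# The transition matrix of Exercise 12.28, with states renumbered 0..4.
P = [[0.5, 0, 0, 0, 0.5],
     [0, 0.5, 0, 0.5, 0],
     [0, 0, 1, 0, 0],
     [0, 0.25, 0.25, 0.25, 0.25],
     [0.5, 0, 0, 0, 0.5]]
```

In the book's 1-based labelling this recovers the classes {1, 5}, {2, 4}, and {3}, with {1, 5} and {3} closed.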
Possible transitions of the chain are illustrated in Figure 12.1. The equivalence classes are $C_1 = \{1, 2, 3\}$, $C_2 = \{4\}$, and $C_3 = \{5, 6\}$. The classes $C_1$ and $C_2$ are not closed, but $C_3$ is closed.
Exercise 12.12 A square matrix with non-negative entries is called doubly stochastic if all its row sums and column sums equal 1. If P is doubly stochastic, show that $P^n$ is doubly stochastic for $n \ge 1$.
Exercise 12.11 Let X and Y be symmetric random walks on the line Z. Is X + Y necessarily a Markov chain? Explain.
Exercise 12.10 Let $X_n$ be the greatest number shown in the first n throws of a fair six-sided die. Show that $X = (X_n : n \ge 1)$ is a homogeneous Markov chain, and write down its transition probabilities.
(c) Starting from time 0, passengers arrive on platform 9¾ at King's Cross station, with constant rate $\lambda > 0$, in order to catch a train due to depart at time $t > 0$. Using the above results, or otherwise, find the expected total time waited by all passengers (the sum of the passengers' waiting times).
(b) Let $n \ge 1$. Compute the joint probability density function of $(J_1, J_2, \ldots, J_n)$ given $\{N_t = n\}$. Deduce that, given $\{N_t = n\}$, $(J_1, J_2, \ldots, J_n)$ has the same distribution as the non-decreasing rearrangement of n independent uniform random variables on $[0, t]$.
22. (a) Give the definition of a Poisson process $N = (N_t : t \ge 0)$ with rate $\lambda$, using its transition rates. Show that, for each $t \ge 0$, the distribution of $N_t$ is Poisson with a parameter to be specified. Let $J_0 = 0$ and let $J_1, J_2, \ldots$ denote the jump times of N. What is the distribution of …
(d) Now consider the process M(t) obtained by deleting every odd-numbered point in an ordinary Poisson process with rate $\lambda$. Check that $E(M(t)) = \frac14\bigl(2\lambda t + e^{-2\lambda t} - 1\bigr)$ and $\mathrm{var}(M(t)) = \frac{1}{16}\bigl(4\lambda t - 8\lambda t e^{-2\lambda t} - e^{-4\lambda t} + 1\bigr)$. Deduce that M(t) is not a doubly stochastic Poisson process. (Cambridge 2011)
(c) Suppose that $\Lambda = (\Lambda(t) : t \ge 0)$ is a non-negative, real-valued random process. Conditional on $\Lambda$, let $N = (N(t) : t \ge 0)$ be an inhomogeneous Poisson process with rate function $\Lambda(u)$. Such a process N is called a doubly stochastic Poisson process. Show that the variance of N(t) cannot be less than its mean.
(b) Show that the number of arrivals in an inhomogeneous Poisson process during the interval $(0, t)$ has the Poisson distribution with mean $\int_0^t \lambda(u)\,du$.
21. (a) Define an inhomogeneous Poisson process with rate function λ(u).
(b) The fire alarm in Clarkson Road is different. The number M(t) of alarms by time t is such that $P\bigl(M(t+h) = m+1 \mid M(t) = m\bigr) = \lambda_m h + o(h)$, where $\lambda_m = \alpha m + \beta$, $m \ge 1$, and $\alpha, \beta > 0$. Show, subject to suitable extra conditions, that $p_m(t) = P(M(t) = m)$ satisfies a set of differential–difference equations to be specified. Deduce without solving these equations in their entirety that M(t) has mean $\beta(e^{\alpha t} - 1)/\alpha$, and find the variance.
20. (a) The fire alarm in Mill Lane is set off at random times. The probability of an alarm during the time interval $(u, u+h)$ is $\lambda(u)h + o(h)$, where the 'intensity function' $\lambda(u)$ may vary with the time u. Let N(t) be the number of alarms by time t, and set $N(0) = 0$. Show, subject to suitable extra conditions, that N(t) has the Poisson distribution with parameter $\Lambda(t) = \int_0^t \lambda(u)\,du$.
If a customer arrives in the queue at time t, find the moment generating function of the total time which it spends in the system, including its service time. Deduce that this time has the exponential distribution with parameter $\mu(1-\rho)$.
19. Customers arrive at the door of a shop according to … $(\frac12)^k$ and otherwise leaves, never to return. The service times of customers who enter the shop are random variables with the exponential distribution, parameter $\mu$. If $Q_t$ is the number of people within the shop (excluding the single server) at time t, show that $p_k(t) = P(Q_t = k)$ satisfies $p_k'(t) = \cdots$
18. Customers arrive in a queue according to a Poisson process with rate $\lambda$, and their service times have the exponential distribution with parameter $\mu$, where $\lambda < \mu$. We suppose that the number $Q_0$ of customers in the system at time 0 has distribution $P(Q_0 = k) = (1-\rho)\rho^k$ for $k = 0, 1, 2, \ldots$, where $\rho = \lambda/\mu$, so that the queue is 'in equilibrium' by the conclusion of Exercise 11.78.
17. Customers arrive in a queue according to a Poisson process with rate $\lambda$, and their service times have the exponential distribution with parameter $\mu$. Show that, if there is only one customer in the queue, then the probability that the next customer arrives within time t and has to wait for …
16. Prove that, in a queue whose input is a Poisson process and whose service times have the exponential distribution, the number of new arrivals during any given service time is a random variable with the geometric distribution.
15. A robot can be in either of two states: state A (idle) and state B (working). In any short time interval $(t, t+h)$, the probability that it changes its state is $\lambda h + o(h)$, where $\lambda > 0$. If p(t) is the probability that it is idle at time t given that it was idle at time 0, show that $p'(t) = \lambda\bigl(1 - 2p(t)\bigr)$.
Let q(t) be the probability that the robot is working at time t given that it was working at time 0. Find q(t), and find the distribution of the earliest time T at which there is a change of state.
If there are N robots operating independently according to the above laws and $p_k(t)$ is the probability that exactly k are idle at time t, show that $p_k'(t) = \lambda(N-k+1)p_{k-1}(t) - \lambda N p_k(t) + \lambda(k+1)p_{k+1}(t)$, for $k = 0, 1, \ldots, N$, subject to the rule that $p_{-1}(t) = p_{N+1}(t) = 0$.
If all the robots are idle at time 0, show that the number of idle robots at time t has the binomial distribution with parameters N and $e^{-\lambda t}\cosh \lambda t$.
14. In the immigration–death process of Problem 11.7.13, show that there is a steady-state distribution (in the jargon of Section 11.6) for all positive values of $\theta$ and $\mu$. Show further that this distribution is the Poisson distribution with parameter $\theta/\mu$.
13. The 'immigration–death' process is obtained from the birth–death–immigration process of Problem 11.7.11 by setting the birth rate $\lambda$ equal to 0. Let $D = (D_t : t \ge 0)$ be an immigration–death process with positive immigration rate $\theta$ and death rate $\mu$. Suppose that $D_0 = I$, and set $G(s,t) = E(s^{D_t})$. Deduce that the probability generating function G satisfies the partial differential equation
$$\frac{\partial G}{\partial t} = (s-1)\theta G - \mu(s-1)\frac{\partial G}{\partial s},$$
subject to the boundary condition $G(s,0) = s^I$. Solve this equation to find that
$$G(s,t) = \bigl[1 + (s-1)e^{-\mu t}\bigr]^I \exp\bigl[\theta(s-1)(1-e^{-\mu t})/\mu\bigr].$$
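The mean implied by the generating function, $E(D_t) = I e^{-\mu t} + (\theta/\mu)(1 - e^{-\mu t})$, can be checked by simulation. A rough Monte Carlo sketch (names, seed, and parameter values are my own choices):

```python
import math
import random

# Gillespie-style simulation of the immigration-death process:
# immigrants arrive at rate theta; each individual dies at rate mu.
def simulate(theta, mu, I, t_end, rng):
    n, t = I, 0.0
    while True:
        rate = theta + mu * n              # total event rate in state n
        t += rng.expovariate(rate)
        if t > t_end:
            return n
        if rng.random() < theta / rate:    # immigration vs death
            n += 1
        else:
            n -= 1

rng = random.Random(7)
samples = [simulate(1.0, 2.0, 3, 1.0, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
# Mean read off from G(s, t): I e^{-mu t} + (theta/mu)(1 - e^{-mu t}).
theory = 3 * math.exp(-2.0) + 0.5 * (1 - math.exp(-2.0))
```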
11. A population develops according to the following rules:
(a) during the interval $(t, t+dt)$, an individual existing at time t has (independently of its previous history) probability $\lambda\,dt + o(dt)$ of having a single offspring (twins, triplets, etc. being impossible) and probability $\mu\,dt + o(dt)$ of dying, where $\lambda$ and $\mu$ are absolute constants,
(b) in the interval $(t, t+dt)$, there is a probability $\theta\,dt + o(dt)$ that a single immigrant will join the population,
(c) subpopulations descending from distinct individuals develop independently.
If $p_n(t)$ denotes the probability that the population consists of n individuals at time t, show that $\phi(z,t) = \sum_{n=0}^{\infty} z^n p_n(t)$ satisfies the partial differential equation
$$\frac{\partial \phi}{\partial t} = (\lambda z - \mu)(z - 1)\frac{\partial \phi}{\partial z} + \theta(z - 1)\phi.$$
(Oxford 1964F)
12. Consider the 'birth–death–immigration' process of Problem 11.7.11 and suppose that $\lambda, \mu, \theta > 0$. In the particular case when $\lambda = \mu = \theta = 1$ and the population is empty at time $t = 0$, show that the size of the population at time t has mean t, and calculate its variance.
Let $T_J$ be the time required by a simple birth process with birth rate $\lambda$ to grow from size 1 to size J, and let $U_J = \lambda T_J - \log J$. Show that $U_J$ has moment generating function
$$M_J(t) = \frac{1}{J^t}\prod_{i=1}^{J-1}\frac{i}{i-t} \quad\text{if } t < 1,$$
and deduce that, as $J \to \infty$, $M_J(t) \to M(t)$ if $t < 1$. [You may use the fact that $J^t\,\Gamma(J-t)/\Gamma(J) \to 1$ as $J \to \infty$.] It follows that the distribution of $U_J$ approaches the extreme-value distribution as $J \to \infty$.
10. Consider a birth–death process whose birth and death rates satisfy $\lambda = \mu$. If the initial population size is I, show that the time T until the extinction of the process has distribution function $P(T \le t) = \bigl(\lambda t/(\lambda t+1)\bigr)^I$ for $t > 0$, and deduce that, as $I \to \infty$, the random variable $U_I = \lambda T - \log I$ …
9. Show that the moment generating function of the so-called 'extreme-value' distribution with density function $f(x) = \exp(-x - e^{-x})$ for $x \in \mathbb{R}$, is $M(t) = \Gamma(1-t)$ if $t < 1$.
8. In a simple birth process with birth rate λ, find the moment generating function of the time required by the process to grow from size I to size J (> I ).
7. A 'doubly stochastic' Poisson process is an inhomogeneous Poisson process in which the rate function $\lambda(t)$ is itself a random process. Show that the simple birth process with birth rate $\lambda$ is a doubly stochastic Poisson process N for which $\lambda(t) = \lambda N_t$.
This is an example of a so-called 'compound' Poisson process.
6. The probability of one failure in a system occurring in the time interval $(t, t+\tau)$ is $\lambda(t)\tau + o(\tau)$, independently of previous failures, and the probability of more than one failure in this interval is $o(\tau)$, where $\lambda$ is a … Prove that the number of failures in $(0, t)$ has the Poisson distribution with mean $\int_0^t \lambda(x)\,dx$. Let T be the time of occurrence of the first failure. Find the probability density function of T and show that, if $\lambda(t) = c/(1+t)$ where $c > 0$, the expected value of T is finite if and only if $c > 1$. (Oxford 1981F) This is an example of a so-called 'inhomogeneous' Poisson process.
5. Tourist coaches arrive at Buckingham Palace in the manner of a Poisson process with rate $\lambda$, and the numbers of tourists in the coaches are independent random variables, each having probability generating function G(s). Show that the total number of tourists who have arrived at the palace by time t has probability generating function $\exp\{\lambda t\,(G(s)-1)\}$.
Exercise 11.79 A queue has three servers $A_1, A_2, A_3$, and their service times are independent random variables, $A_i$'s service times having the exponential distribution with parameter $\mu_i$. An arriving customer finds all three servers unoccupied and chooses one at random, each being equally likely.
Exercise 11.78 Show that $p_k(t) = (1-\rho)\rho^k$ is a solution to equations (11.66)–(11.67) so long as $\rho = \lambda/\mu < 1$. This proves that, if the process begins in its steady-state distribution, then it has this distribution for all time.
Exercise 11.77 If Q is the above queueing process with arrival rate $\lambda$ and service rate $\mu$, and a customer arrives to find exactly k customers waiting ahead (including the person being served), show that this customer leaves the queueing system after a length of time which has the gamma distribution with parameters $k+1$ and $\mu$.
Exercise 11.61 Let L be a birth–death process with birth rate 1 and death rate 1. Suppose that $L_0$ is a random variable having the Poisson distribution with parameter $\alpha$. Show that the probability that the process is extinct by time t is $\exp[-\alpha/(t+1)]$.
Exercise 11.60 A birth–death process L has birth rate λ and death rate μ. If the population has size k at time t , show that the subsequent length of time which elapses before there is either a birth or a death is a random variable having the exponential distribution with parameter (λ + μ)k.
Exercise 11.59 Let m(t) be the expected size at time t of the population in a birth–death process with birth rate $\lambda$ and death rate $\mu$. Use (11.55) to show that m satisfies the differential equation $m'(t) = (\lambda-\mu)m(t)$. Hence find m(t) in terms of the initial size of the population.
Exercise 11.31 If $T_i$ is the time of the ith arrival in the Poisson process N, show that $N_t < k$ if and only if $T_k > t$. Use Theorem 11.23 and the central limit theorem, Theorem 8.25, to deduce that, as $t \to \infty$,
$$P\!\left(\frac{N_t - \lambda t}{\sqrt{\lambda t}} \le u\right) \to \int_{-\infty}^{u} \frac{1}{\sqrt{2\pi}}\, e^{-v^2/2}\,dv \quad\text{for } u \in \mathbb{R}.$$
Exercise 11.30 Let M and N be independent Poisson processes, M having rate λ and N having rate μ.Use the result of Problem 6.9.4 to show that the process M + N = (Mt + Nt : t ≥ 0) is a Poisson process with rate λ + μ. Compare this method with that of Exercise 11.5.
Exercise 11.20 If N is a Poisson process with rate $\lambda$, show that the moment generating function of $U_t = \dfrac{N_t - E(N_t)}{\sqrt{\mathrm{var}(N_t)}}$ is
$$M_t(x) = E(e^{xU_t}) = \exp\bigl(-x\sqrt{\lambda t} + \lambda t\,(e^{x/\sqrt{\lambda t}} - 1)\bigr).$$
Deduce that, as $t \to \infty$, $P(U_t \le u) \to \int_{-\infty}^{u} \frac{1}{\sqrt{2\pi}}\, e^{-v^2/2}\,dv$ for $u \in \mathbb{R}$. This is the central limit theorem for the Poisson process.
Exercise 11.19 If N is a Poisson process with rate $\lambda$, show that, for $t > 0$, $P(N_t \text{ is even}) = e^{-\lambda t}\cosh \lambda t$ and $P(N_t \text{ is odd}) = e^{-\lambda t}\sinh \lambda t$.
Exercise 11.18 If N is a Poisson process with rate λ, show that var(Nt /t) → 0 as t → ∞.
Exercise 11.5 (Superposition) Two independent streams of telephone calls arrive at the exchange, the first being a Poisson process with rate λ and the second being a Poisson process with rate μ. Show that the combined stream of calls is a Poisson process with rate λ + μ.
Exercise 11.4 (Thinning) Suppose that telephone calls arrive at the exchange in the manner of a Poisson process $N = (N_t : t \ge 0)$ with rate $\lambda$, and suppose that the equipment is faulty so that each incoming call fails to be recorded with probability q (independently of all other calls). If $N'_t$ is the number of calls recorded by time t, show that $N' = (N'_t : t \ge 0)$ is a Poisson process with rate $\lambda(1-q)$.
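A simulation sketch of the thinning claim (all names, the seed, and the parameter values are my own): generate arrivals with exponential gaps, keep each with probability $1 - q$, and compare the mean kept count with $\lambda(1-q)t$.

```python
import random

# Keep each Poisson(lam) arrival with probability 1 - q; the count of
# kept arrivals in [0, t] should be Poisson with mean lam * (1 - q) * t.
def thinned_count(lam, q, t, rng):
    n, clock = 0, rng.expovariate(lam)     # inter-arrival gaps are Exp(lam)
    while clock <= t:
        if rng.random() > q:               # call recorded with prob. 1 - q
            n += 1
        clock += rng.expovariate(lam)
    return n

rng = random.Random(42)
counts = [thinned_count(2.0, 0.3, 1.0, rng) for _ in range(20000)]
mean = sum(counts) / len(counts)           # theory: 2.0 * 0.7 * 1.0 = 1.4
```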
Exercise 11.3 If N is a Poisson process with rate $\lambda$, show that $P(N_{t+h} = 0) = \bigl(1 - \lambda h + o(h)\bigr)P(N_t = 0)$ for small positive values of h. Hence, show that $p(t) = P(N_t = 0)$ satisfies the differential equation $p'(t) = -\lambda p(t)$. Solve this equation to find p(t).
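The solution is $p(t) = e^{-\lambda t}$, since $p(0) = 1$. As a sanity check (a sketch with arbitrary parameter values of my own), Euler-integrating $p' = -\lambda p$ reproduces the exponential:

```python
import math

# Forward-Euler integration of p'(t) = -lam * p(t) with p(0) = 1.
def euler_p(lam, t, steps=20000):
    p, h = 1.0, t / steps
    for _ in range(steps):
        p += h * (-lam * p)
    return p

approx = euler_p(1.5, 2.0)
exact = math.exp(-1.5 * 2.0)               # p(t) = e^{-lambda t}
```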
(a) what is the mass function of $N_t$ for a given value of t?
(b) what can be said about the distribution of the sequence $T_0, T_1, \ldots$ of times at which calls arrive?
(a) the arrival of customers in a shop,
(b) the clicks emitted by a Geiger counter as it records the detection of radioactive particles,
(c) the incidence of deaths in a small town with a reasonably stable population (neglecting seasonal variations).
15. The university buys light bulbs which have random lifetimes. If the bulb in my office fails on day n, then it is replaced by a new bulb which lasts for a random number of days, after which it is changed, and so on. We assume that the lifetimes of the bulbs are independent random variables $X_1, X_2, \ldots$, each with mass function $P(X = k) = (1-\alpha)\alpha^{k-1}$ for $k = 1, 2, \ldots$, where $\alpha$ satisfies $0 < \alpha < 1$. A new light bulb is inserted in its socket on day 0. Show that the probability that the bulb has to be changed on day n is $1 - \alpha$, independently of n.
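The claimed answer $1 - \alpha$ can be checked by a small simulation (names, seed, and parameter values are my own): generate geometric lifetimes and estimate the probability that some bulb change falls exactly on day n.

```python
import random

# Renewal simulation with geometric lifetimes P(X = k) = (1 - a) a^(k-1):
# estimate the probability that a bulb change occurs exactly on day n.
def change_on_day(n, a, trials, rng):
    hits = 0
    for _ in range(trials):
        day = 0
        while day < n:
            x = 1                          # lifetime supported on {1, 2, ...}
            while rng.random() < a:
                x += 1
            day += x
        hits += (day == n)
    return hits / trials

rng = random.Random(1)
est = change_on_day(10, 0.6, 50000, rng)   # theory: 1 - a = 0.4
```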
14. The generating-function argument used to prove Theorem 10.12 has a powerful application to the general theory of 'recurrent events'. Let $\eta$ be an event which may or may not happen at each of the time points 0, 1, 2, … ($\eta$ may be the visit of a random walk to its starting point, or a …). We call $\eta$ recurrent if $f = 1$ and transient if $f < 1$. Let $u_n$ be the probability that $\eta$ occurs at time n. Show that the generating functions
$$F(s) = \sum_{n=1}^{\infty} f_n s^n, \qquad U(s) = \sum_{n=0}^{\infty} u_n s^n$$
are related by $U(s) = U(s)F(s) + 1$, and deduce that $\eta$ is recurrent if and only if $\sum_n u_n = \infty$.
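The relation $U(s) = U(s)F(s) + 1$ says, coefficient by coefficient, that $u_n = \sum_{k=1}^{n} f_k u_{n-k}$ with $u_0 = 1$. This renewal recursion is easy to run numerically (the helper name and the example distribution are my own):

```python
# Renewal recursion u_n = sum_{k=1}^{n} f_k u_{n-k}, u_0 = 1, which is
# exactly the coefficient form of U(s) = U(s) F(s) + 1.
def occurrence_probs(f, nmax):
    u = [1.0]
    for n in range(1, nmax + 1):
        u.append(sum(f.get(k, 0.0) * u[n - k] for k in range(1, n + 1)))
    return u

# Example: first occurrence at time 1 or 2 with probability 1/2 each,
# so F(1) = 1 (recurrent) and, by the renewal theorem, u_n -> 1/1.5 = 2/3.
u = occurrence_probs({1: 0.5, 2: 0.5}, 50)
```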