Stochastic Processes 2nd Edition Sheldon M. Ross - Solutions
Let X denote the number of successes in n independent Bernoulli trials, with each trial resulting in a success with probability p. Give an upper bound for P{|X/n − p| > a}.
Let X denote the number of heads in n independent flips of a fair coin. Show that: (a) P{X − n/2 ≥ a} ≤ exp{−2a²/n}; (b) P{X − n/2 ≤ −a} ≤ exp{−2a²/n}.
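These are the Chernoff-Hoeffding tail bounds, and they can be spot-checked against the exact Binomial(n, 1/2) tail; a minimal sketch (the choice n = 100 is arbitrary):

```python
import math

def binom_tail_ge(n, k):
    # exact P{X >= k} for X ~ Binomial(n, 1/2)
    return sum(math.comb(n, j) for j in range(k, n + 1)) / 2**n

def hoeffding_bound(n, a):
    # the claimed bound exp(-2a^2/n)
    return math.exp(-2 * a * a / n)

n = 100
for a in range(1, 31):
    # P{X - n/2 >= a} = P{X >= n/2 + a}
    assert binom_tail_ge(n, n // 2 + a) <= hoeffding_bound(n, a)
```

By symmetry of the fair coin, part (b) is the same check applied to the lower tail.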
Show that the equation e^β − e^{−β} = 2βe^{β²/2} has no solution when β ≠ 0. (Hint: Expand both sides in a power series.)
Let Z_n = ∏_{i=1}^n X_i, where the X_i, i ≥ 1, are independent random variables with P{X_i = 2} = P{X_i = 0} = 1/2. Let N = min{n: Z_n = 0}. Is the martingale stopping theorem applicable? If so, what would you conclude? If not, why not?
In Example 6.2(C), find the expected number of stages until one of the players is eliminated.
Consider a gambler who at each gamble is equally likely to either win or lose 1 unit. Suppose the gambler will quit playing when his winnings are either A or −B, where A > 0, B > 0. Use an appropriate martingale to show that the expected number of bets is AB.
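A quick check of the claimed answer, independent of the martingale argument: the expected-duration recursion f(x) = 1 + (f(x+1) + f(x−1))/2 with boundary values f(A) = f(−B) = 0 is solved by f(x) = (A − x)(x + B), which gives f(0) = AB. A sketch (the values A = 5, B = 7 are arbitrary):

```python
# Verify that f(x) = (A - x)(x + B) satisfies the one-step recursion
# f(x) = 1 + (f(x+1) + f(x-1))/2 on the interior, with f(A) = f(-B) = 0.
def f(x, A, B):
    return (A - x) * (x + B)

A, B = 5, 7
assert f(A, A, B) == 0 and f(-B, A, B) == 0
for x in range(-B + 1, A):
    assert f(x, A, B) == 1 + (f(x + 1, A, B) + f(x - 1, A, B)) / 2
print(f(0, A, B))  # f(0) = A*B = 35
```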
Consider successive flips of a coin having probability p of landing heads. Use a martingale argument to compute the expected number of flips until the following sequences appear: (a) HHTTHHT (b) HTHTHTH
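For the fair-coin case p = 1/2, the martingale (gambling-team) argument reduces to a well-known identity: the expected wait is the sum of 2^k over every k for which the pattern's length-k prefix equals its length-k suffix (k equal to the full length always counts). A sketch:

```python
# Expected number of fair-coin flips until `pattern` first appears,
# via the prefix-suffix overlap identity from the martingale argument.
def expected_flips(pattern):
    n = len(pattern)
    return sum(2**k for k in range(1, n + 1) if pattern[:k] == pattern[-k:])

assert expected_flips("HHTTHHT") == 136   # 2^7 + 2^3
assert expected_flips("HTHTHTH") == 170   # 2^7 + 2^5 + 2^3 + 2^1
```

For general p each surviving overlap of length k contributes the product of 1/P(symbol) over that prefix instead of 2^k.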
A process {Z_n, n ≥ 1} is said to be a reverse, or backwards, martingale if E|Z_n| < ∞ for all n and E[Z_n | Z_{n+1}, Z_{n+2}, …] = Z_{n+1}. Show that if X_i, i ≥ 1, are independent and identically distributed random variables with finite expectation, then Z_n = (X_1 + ⋯ + X_n)/n, n ≥ 1, is a reverse martingale.
If {X_n, n ≥ 0} and {Y_n, n ≥ 0} are independent martingales, is {Z_n, n ≥ 0} a martingale when (a) Z_n = X_n + Y_n? (b) Z_n = X_n Y_n? Are these results true without the independence assumption? In each case either present a proof or give a counterexample.
Let X_1, X_2, … be a sequence of independent and identically distributed random variables with mean 0 and variance σ². Let S_n = Σ_{i=1}^n X_i and show that {Z_n, n ≥ 1} is a martingale when Z_n = S_n² − nσ².
Let X(n) denote the size of the nth generation of a branching process, and let π_0 denote the probability that such a process, starting with a single individual, eventually goes extinct. Show that {π_0^{X(n)}, n ≥ 0} is a martingale.
Consider a Markov chain {X_n, n ≥ 0} with P_{NN} = 1. Let P(i) denote the probability that this chain eventually enters state N given that it starts in state i. Show that {P(X_n), n ≥ 0} is a martingale.
Consider the Markov chain which at each transition either goes up 1 with probability p or down 1 with probability q = 1 − p. Letting X_n denote the state after n transitions, argue that {(q/p)^{X_n}, n ≥ 1} is a martingale.
Verify that {X_n/m^n, n ≥ 1} is a martingale when X_n is the size of the nth generation of a branching process whose mean number of offspring per individual is m.
For a martingale {Z_n, n ≥ 1}, let X_i = Z_i − Z_{i−1}, i ≥ 1, where Z_0 = 0. Show that Var(Z_n) = Σ_{i=1}^n Var(X_i).
If {Z_n, n ≥ 1} is a martingale, show that, for 1 ≤ k < n, E[Z_n | Z_1, …, Z_k] = Z_k.
Let Y_i = X_(i) − X_(i−1), i = 1, …, n + 1, where X_(0) = 0, X_(n+1) = t, and X_(1) ≤ X_(2) ≤ ⋯ ≤ X_(n) are the ordered values of a set of n independent uniform (0, t) random variables. Argue that P{Y_i ≤ y_i, i = 1, …, n + 1} is a symmetric function of y_1, …, y_{n+1}.
Consider the two-state Markov chain of Example 5.8(A), with X(0) = 0. (a) Compute Cov(X(s), X(y)). (b) Let S_0(t) denote the amount of time spent in state 0 by time t. Use (a) and (5.8.4) to compute Var(S_0(t)).
Consider a renewal process whose interarrival distribution F is a mixture of two exponentials; that is, F̄(x) = pe^{−λ_1 x} + qe^{−λ_2 x}, q = 1 − p. Compute the renewal function E[N(t)]. (Hint: Imagine that at each renewal a coin, having probability p of landing heads, is flipped. If heads appears, the next interarrival is exponential with rate λ_1; if tails, with rate λ_2.)
Consider an ergodic continuous-time Markov chain, with transition rates q_{ij}, in steady state. Let P_j, j ≥ 0, denote the stationary probabilities. Suppose the state space is partitioned into two subsets, B and B^c = G. (a) Compute the probability that the process is in state i, i ∈ B, given that it is in B. …
The work in a queueing system at any time is defined as the sum of the remaining service times of all customers in the system at that time. For the M/G/1 queue in steady state, compute the mean and variance of the work in the system.
(a) Prove that a stationary Markov process is reversible if, and only if, its transition rates satisfy q(j_1, j_2)q(j_2, j_3) ⋯ q(j_{n−1}, j_n)q(j_n, j_1) = q(j_1, j_n)q(j_n, j_{n−1}) ⋯ q(j_2, j_1) for any finite sequence of states j_1, …, j_n. (b) Argue that it suffices to verify that the equality in (a) holds for sequences of distinct states. (c) …
Consider an M/M/∞ queue with channels (servers) numbered 1, 2, …. On arrival, a customer will choose the lowest numbered channel that is free. Thus, we can think of all arrivals as occurring at channel 1. Those who find channel 1 busy overflow and become arrivals at channel 2. Those finding both channels busy …
Consider a time-reversible continuous-time Markov chain having parameters v_i, P_{ij} and having limiting probabilities P_j, j ≥ 0. Choose some state, say state 0, and consider the new Markov chain, which makes state 0 an absorbing state; that is, reset v_0 to equal 0. Suppose now at time points chosen …
N customers move about among r servers. The service times at server i are exponential at rate μ_i, and when a customer leaves server i it joins the queue (if there are any waiting, or else it enters service) at server j, j ≠ i, with probability 1/(r − 1). Let the state be (n_1, …, n_r) when there are n_i …
Complete the proof of the conjecture in the queueing network model of Section 5.7.1.
In the stochastic population model of Section 5.6.2: (a) Show that P(n)q(n, D_i n) = P(D_i n)q(D_i n, n) when P(n) is as given by (5.6.4) with α_j = (θ/jν)(ν/μ)^j. (b) Let D(t) denote the number of families that die out in (0, t). Assuming that the process is in steady state at time t = 0, what type of …
What can you say about the departure process of the stationary M/M/1 queue having finite capacity?
Consider two M/M/1 queues with respective parameters λ_i, μ_i, i = 1, 2. Suppose they both share the same waiting room, which has finite capacity N. (That is, whenever this room is full all potential arrivals to either queue are lost.) Compute the limiting probability that there will be n …
If {X(t), t ≥ 0} and {Y(t), t ≥ 0} are independent time-reversible Markov chains, show that the process {(X(t), Y(t)), t ≥ 0} is also time reversible.
Find the limiting probabilities for the M/M/s system and determine the condition needed for these to exist.
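The limiting probabilities follow the usual birth-death pattern (P_n proportional to (λ/μ)^n/n! for n ≤ s and to (λ/μ)^n/(s! s^{n−s}) for n > s) and exist exactly when λ < sμ; a numerical sketch, truncating the infinite state space at nmax (an approximation):

```python
import math

def mms_probs(lam, mu, s, nmax=300):
    # stationary distribution exists iff lam < s*mu
    assert lam < s * mu, "requires lam < s*mu"
    a = lam / mu
    w = []
    for n in range(nmax + 1):
        if n <= s:
            w.append(a**n / math.factorial(n))
        else:
            w.append(a**n / (math.factorial(s) * s**(n - s)))
    total = sum(w)
    return [x / total for x in w]

p = mms_probs(lam=3.0, mu=1.0, s=4)
assert abs(sum(p) - 1) < 1e-9
```

With λ = 3, μ = 1, s = 4 the unnormalized weights at n = 2 and n = 3 are both 4.5, so p[2] = p[3], one easy consistency check of the pattern.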
A small barbershop, operated by a single barber, has room for at most two customers. Potential customers arrive at a Poisson rate of three per hour, and the successive service times are independent exponential random variables with mean 1/4 hour. (a) What is the average number of customers in the shop? …
Each individual in a biological population is assumed to give birth at an exponential rate λ and to die at an exponential rate μ. In addition, there is an exponential rate of increase θ due to immigration. However, immigration is not allowed when the population size is N or larger. (a) Set this up as a …
Consider a continuous-time Markov chain with X(0) = 0. Let A denote a set of states that does not include 0 and set T = min{t > 0: X(t) ∈ A}. Suppose that T is finite with probability 1. Set q_i = Σ_{j∈A} q_{ij} and consider the random variable H = ∫_0^T q_{X(t)} dt, called the random hazard. (a) Find the hazard rate …
Let A be a specified set of states of a continuous-time Markov chain and let T_i(t) denote the amount of time spent in A during the time interval [0, t] given that the chain begins in state i. Let Y_1, …, Y_n be independent exponential random variables with mean λ. Suppose the Y_i are independent of …
In Example 5.4(D), find the variance of the number of males in the population at time t.
Consider a population in which each individual independently gives birth at an exponential rate λ and dies at an exponential rate μ. In addition, new members enter the population in accordance with a Poisson process with rate θ. Let X(t) denote the population size at time t. (a) What type of process …
Consider a population of size n, some of whom are infected with a certain virus. Suppose that in an interval of length h any specified pair of individuals will independently interact with probability λh + o(h). If exactly one of the individuals involved in the interaction is infected, then the other …
Consider a birth and death process with birth rates {λ_i} and death rates {μ_i}. Starting in state i, find the probability that the first k events are all births.
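Conditioning on which of the two competing exponentials fires first in each state, the event "first k events are all births" has probability ∏_{j=i}^{i+k−1} λ_j/(λ_j + μ_j); a sketch using hypothetical linear rates (λ_n = nb, μ_n = nd are an illustrative choice, not from the problem):

```python
def prob_first_k_births(lam, mu, i, k):
    # while in state j, the next event is a birth w.p. lam[j]/(lam[j]+mu[j]),
    # and a birth moves the state from j to j+1
    p = 1.0
    for j in range(i, i + k):
        p *= lam[j] / (lam[j] + mu[j])
    return p

# hypothetical linear birth-death rates lam_n = n*b, mu_n = n*d
b, d = 2.0, 1.0
lam = {j: j * b for j in range(1, 10)}
mu = {j: j * d for j in range(1, 10)}
# for linear rates every factor is b/(b+d), so the answer is (b/(b+d))^k
assert abs(prob_first_k_births(lam, mu, 2, 3) - (2 / 3) ** 3) < 1e-12
```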
Suppose that the "state" of the system can be modeled as a two-state continuous-time Markov chain with transition rates v_0 and v_1. When the state of the system is i, "events" occur in accordance with a Poisson process with rate α_i, i = 0, 1. Let N(t) denote the number of events in (0, t). (a) Find …
For the Yule process: (a) Verify that P_{1j}(t) = e^{−λt}(1 − e^{−λt})^{j−1} satisfies the forward and backward equations. (b) Suppose that X(0) = 1 and that at time T the process stops and is replaced by an emigration process in which departures occur in a Poisson process of rate μ. Let τ denote the time taken after T for the population …
Let P(t) = P_{00}(t). (a) Find lim_{t→0} (1 − P(t))/t. (b) Show that P(t)P(s) ≤ P(t + s) ≤ 1 − P(s) + P(s)P(t). (c) Show that |P(t) − P(s)| ≤ 1 − P(t − s), s < t, and conclude that P is continuous.
Show that a continuous-time Markov chain is regular, given (a) that v_i ≤ M < ∞ for all i, …
Suppose that a one-celled organism can be in one of two states, either A or B. An individual in state A will change to state B at an exponential rate α; an individual in state B divides into two new individuals of type A at an exponential rate β. Define an appropriate continuous-time Markov chain for a …
A population of organisms consists of both male and female members. In a small colony any particular male is likely to mate with any particular female in any time interval of length h with probability λh + o(h). Each mating immediately produces one offspring, equally likely to be male or female. …
A taxi alternates between three locations. When it reaches location 1 it is equally likely to go next to either 2 or 3. When it reaches 2 it will next go to 1 with probability 2/3 and to 3 with probability 1/3. From 3 it always goes to 1. The mean times between locations i and j are t̄_12 = 20, t̄_13 = 30, t̄_23 = 30. …
For an ergodic semi-Markov process, derive an expression, as t → ∞, for the limiting conditional probability that the next state visited after t is state j, given X(t) = i.
For an ergodic semi-Markov process: (a) Compute the rate at which the process makes a transition from i into j. (b) Show that these rates, summed over all i and j, equal the overall rate at which the process makes transitions. (c) Show that the proportion of time that the process is in state i and headed for state j is P_{ij} η_{ij}/μ_{ii}, where η_{ij} = ∫_0^∞ F̄_{ij}(t) dt. (d) Show that the proportion of …
M balls are initially distributed among m urns. At each stage one of the balls is selected at random, taken from whichever urn it is in, and placed, at random, in one of the other m − 1 urns. Consider the Markov chain whose state at any time is the vector (n_1, …, n_m), where n_i denotes the number of …
Let {X_n, n ≥ 1} denote an irreducible Markov chain having a countable state space. Now consider a new stochastic process {Y_n, n ≥ 0} that only accepts values of the Markov chain that are between 0 and N. That is, we define Y_n to be the nth value of the Markov chain that is between 0 and N. For instance, …
Show that a finite state, ergodic Markov chain such that P_{ij} > 0 for all i, j is time reversible if, and only if, P_{ij}P_{jk}P_{ki} = P_{ik}P_{kj}P_{ji} for all i, j, k.
Consider a time-reversible Markov chain with transition probabilities P_{ij} and limiting probabilities π_i, and now consider the same chain truncated to the states 0, 1, …, M. That is, the truncated chain has transition probabilities P̄_{ij} given by: P̄_{ij} = P_{ij} + Σ_{k>M} P_{ik} if 0 ≤ i ≤ M and j = i; P̄_{ij} = P_{ij} if 0 ≤ i ≠ j ≤ M; and P̄_{ij} = 0 otherwise. Show that the truncated chain is also time reversible. …
Consider the list model presented in Example 4.7(D). Under the one-closer rule show, by using time reversibility, that the limiting probability that element j precedes element i, call it P{j precedes i}, is such that P{j precedes i} > P_j/(P_j + P_i) when P_j > P_i.
Consider the Markov chain with states 0, 1, …, n and with transition probabilities P_{0,1} = 1 = P_{n,n−1}, P_{i,i+1} = p = 1 − P_{i,i−1}, i = 1, …, n − 1. Show that this Markov chain is of the type considered in Proposition 4.7.1 and find its stationary probabilities.
A particle moves among n locations that are arranged in a circle (with the neighbors of location n being n − 1 and 1). At each step, it moves one position either in the clockwise direction with probability p or in the counterclockwise direction with probability 1 − p. (a) Find the transition …
Let {X_n, n ≥ 0} be a Markov chain with stationary probabilities π_j, j ≥ 0. Suppose that X_0 = 0 and define T = min{n: n > 0 and X_n = 0}. Let Y_j = X_{T−j}, j = 0, 1, …, T. Argue that {Y_j, j = 0, …, T} is distributed as the states of the reverse Markov chain (with transition probabilities P*_{ij} = π_j P_{ji}/π_i) starting in state 0 until it …
Find the transition probabilities for the Markov chain of Example 4.3(D) and show that it is time reversible.
Suppose in Example 4.7(B) that if the Markov chain is in state i and the random variable distributed according to q takes on the value j, then the next state is set equal to j with probability a_j/(a_j + a_i) and equal to i otherwise. Show that the limiting probabilities for this chain are π_j = a_j/Σ_k a_k.
For any infinite sequence x_1, x_2, … we say that a new long run begins each time the sequence changes direction. That is, if the sequence starts 5, 2, 4, 5, 6, 9, 3, 4, then there are three long runs, namely (5, 2), (4, 5, 6, 9), and (3, 4). Let X_1, X_2, … be independent uniform (0, 1) random variables …
For the Markov chain model of Section 4.6.1, namely P_{ij} = 1/(i − 1), j = 1, …, i − 1, i > 1, suppose that the initial state is N = C(n, m), where n = cm for some c > 1. Show that when n and m are large the number of steps to reach 1 from state N has approximately a Poisson distribution with mean m[c log c − (c − 1) log(c − 1)].
Consider a branching process in which the number of offspring per individual has a Poisson distribution with mean λ, λ > 1. Let π_0 denote the probability that, starting with a single individual, the population eventually becomes extinct. Also, let α, α < 1, be such that αe^{−α} = λe^{−λ}. (a) Show that α = λπ_0. (b) …
Consider a simple random walk on the integer points in which at each step a particle moves one step in the positive direction with probability p, one step in the negative direction with probability p, and remains in the same place with probability q = 1 − 2p, 0 < p ≤ 1/2. Suppose that, instead of starting …
A spider hunting a fly moves between locations 1 and 2 according to a Markov chain with transition matrix rows (0.7, 0.3) and (0.3, 0.7), starting in location 1. The fly, unaware of the spider, starts in location 2 and moves according to a Markov chain with transition matrix rows (0.4, 0.6) and (0.6, 0.4). The spider catches the fly, and the hunt ends, whenever they meet in the same location. …
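Assuming the spider and fly move simultaneously and independently at each step (a modeling assumption; the truncated statement is read as capture occurring when they share a location), the expected capture time follows from first-step equations on the product chain:

```python
# Expected time until spider and fly meet, by value iteration on
# E[(s,f)] = 1 + sum over non-meeting next states of P * E[next].
S = {1: {1: 0.7, 2: 0.3}, 2: {1: 0.3, 2: 0.7}}   # spider chain
F = {1: {1: 0.4, 2: 0.6}, 2: {1: 0.6, 2: 0.4}}   # fly chain

E = {(1, 2): 0.0, (2, 1): 0.0}                    # the two transient states
for _ in range(10000):
    for (s, f) in [(1, 2), (2, 1)]:
        tot = 1.0
        for s2, ps in S[s].items():
            for f2, pf in F[f].items():
                if s2 != f2:                       # meeting is absorbing, E = 0
                    tot += ps * pf * E[(s2, f2)]
        E[(s, f)] = tot

# from either transient state the chance of meeting next step is 0.54,
# so the capture time is geometric and E = 1/0.54 = 50/27
assert abs(E[(1, 2)] - 50 / 27) < 1e-9
```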
Suppose that two independent sequences X_1, X_2, … and Y_1, Y_2, … are coming in from some laboratory and that they represent Bernoulli trials with unknown success probabilities P_1 and P_2. That is, P{X_i = 1} = 1 − P{X_i = 0} = P_1, P{Y_i = 1} = 1 − P{Y_i = 0} = P_2, and all random variables are independent. To …
Each day one of n possible elements is requested, the ith one with probability P_i, i = 1, …, n, Σ_{i=1}^n P_i = 1. These elements are at all times arranged in an ordered list that is revised as follows: the element selected is moved to the front of the list, with the relative positions of all the other elements remaining unchanged. …
In Problem 4.27, find the expected number of additional steps it takes to return to the initial position after all nodes have been visited.
Consider a particle that moves along a set of m + 1 nodes, labeled 0, 1, …, m. At each move it either goes one step in the clockwise direction with probability p or one step in the counterclockwise direction with probability 1 − p. It continues moving until all the nodes 1, 2, …, m have been visited …
Consider the Markov chain with states 0, 1, …, n and transition probabilities P_{0,1} = 1 = P_{n,n−1}, P_{i,i+1} = p = 1 − P_{i,i−1}, 0 < i < n. Starting at state 0, say that an excursion ends when the chain either returns to 0 or reaches state n. Let X_j denote the number of transitions in the jth excursion (that is, the one that begins …
Consider the gambler's ruin problem with N = 6 and p = 0.7. Starting in state 3, determine: (a) the expected number of visits to state 5; (b) the expected number of visits to state 1; (c) the expected number of visits to state 5 in the first 7 transitions; (d) the probability of ever visiting state 1.
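Parts (a), (b), and (d) can be computed mechanically from the fundamental matrix M = (I − Q)^{−1} of the transient states 1..5, using m_ij = expected visits to j from i and f_ij = m_ij/m_jj (p = 0.7 here is a reading of the garbled statement; part (c) would additionally need the n-step transition powers and is omitted):

```python
from fractions import Fraction

p, q = Fraction(7, 10), Fraction(3, 10)
states = [1, 2, 3, 4, 5]                 # transient states
n = len(states)

# A = I - Q, where Q holds transition probabilities among transient states
A = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
for i, s in enumerate(states):
    if s + 1 in states:
        A[i][states.index(s + 1)] -= p
    if s - 1 in states:
        A[i][states.index(s - 1)] -= q

def solve(A, b):
    # Gauss-Jordan elimination over exact rationals
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = next(r for r in range(c, n) if M[r][c] != 0)
        M[c], M[piv] = M[piv], M[c]
        for r in range(n):
            if r != c and M[r][c] != 0:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[r][n] / M[r][r] for r in range(n)]

# Solving (I - Q) x = e_j gives column j of (I - Q)^{-1}: x_i = m_{ij}
col5 = solve(A, [Fraction(int(s == 5)) for s in states])
col1 = solve(A, [Fraction(int(s == 1)) for s in states])
m35 = col5[states.index(3)]              # (a) expected visits to 5 from 3
m31 = col1[states.index(3)]              # (b) expected visits to 1 from 3
m11 = col1[states.index(1)]
f31 = m31 / m11                          # (d) P{ever visit 1 | start in 3}
```

As a cross-check, (d) also has the closed form r²(1 − r³)/(1 − r⁵) with r = q/p, from the gambler's ruin hitting probabilities on {1, …, 6}.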
Let T = {1, …, t} denote the transient states of a Markov chain, and let Q be, as in Section 4.4, the matrix of transition probabilities from states in T to states in T. Let m_{ij}(n) denote the expected amount of time spent in state j during the first n transitions given that the chain begins in state i. …
In the gambler's ruin problem show that P{she wins the next gamble | present fortune is i, she eventually reaches N} = p[1 − (q/p)^{i+1}]/[1 − (q/p)^i] if p ≠ 1/2, and = (i + 1)/2i if p = 1/2.
Compute the expected number of plays, starting in i, in the gambler's ruin problem, until the gambler reaches either 0 or N.
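For comparison with the answer this problem expects, the standard closed form (stated here from the usual analysis, not derived) can be checked against the one-step conditioning recursion E_i = 1 + pE_{i+1} + qE_{i−1} with E_0 = E_N = 0:

```python
def duration(i, N, p):
    # expected number of plays until absorption at 0 or N, starting at i
    q = 1 - p
    if p == 0.5:
        return i * (N - i)
    r = q / p
    return i / (q - p) - (N / (q - p)) * (1 - r**i) / (1 - r**N)

N, p = 10, 0.6
for i in range(1, N):
    lhs = duration(i, N, p)
    rhs = 1 + p * duration(i + 1, N, p) + (1 - p) * duration(i - 1, N, p)
    assert abs(lhs - rhs) < 1e-9
```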
Consider a Markov chain with states 0, 1, 2, … such that P_{i,i+1} = p_i = 1 − P_{i,i−1}, where p_0 = 1. Find the necessary and sufficient condition on the p_i's for this chain to be positive recurrent, and compute the limiting probabilities in this case.
Consider a recurrent Markov chain starting in state 0. Let m_i denote the expected number of time periods it spends in state i before returning to 0. Use Wald's equation to show that m_j = Σ_i m_i P_{ij}, j > 0, with m_0 = 1. Now give a second proof that assumes the chain is positive recurrent and relates the m_i to the …
Let π_j, j ≥ 0, be the stationary probabilities for a specified Markov chain. (a) Complete the following statement: π_i P_{ij} is the proportion of all transitions that …. Let A denote a set of states and let A^c denote the remaining states. (b) Finish the following statement: Σ_{i∈A} Σ_{j∈A^c} π_i P_{ij} is the proportion of all …
Jobs arrive at a processing center in accordance with a Poisson process with rate λ. However, the center has waiting space for only N jobs, and so an arriving job finding N others waiting goes away. At most 1 job per day can be processed, and the processing of this job must start at the beginning of …
Consider a positive recurrent irreducible periodic Markov chain and let π_j denote the long-run proportion of time in state j, j ≥ 0. Prove that the π_j, j ≥ 0, satisfy π_j = Σ_i π_i P_{ij} and Σ_j π_j = 1.
An individual possesses r umbrellas, which she employs in going from her home to office and vice versa. If she is at home (the office) at the beginning (end) of a day and it is raining, then she will take an umbrella with her to the office (home), provided there is one to be taken. If it is not …
In the M/G/1 system (Example 4.3(A)) suppose that ρ < 1 and thus the stationary probabilities exist. Compute π'(s) and find, by taking the limit as s → 1, …
If f_{ii} < 1 and f_{jj} < 1, …
At the beginning of every time period, each of N individuals is in one of three possible conditions: infectious, infected but not infectious, or noninfected. If a noninfected individual becomes infected during a time period then he or she will be in an infectious condition during the following time …
For a Markov chain {X_n, n ≥ 0}, show that P{X_k = i_k | X_j = i_j for all j ≠ k} = P{X_k = i_k | X_{k−1} = i_{k−1}, X_{k+1} = i_{k+1}}.
Let X_1, X_2, … be independent random variables such that P{X_n = j} = α_j, j ≥ 0. Say that a record occurs at time n if X_n > max(X_1, …, X_{n−1}), where X_0 = −∞, and if a record does occur at time n call X_n the record value. Let R_i denote the ith record value. (a) Argue that {R_i, i ≥ 1} is a Markov chain and compute …
For the symmetric random walk starting at 0: (a) What is the expected time to return to 0? (b) Let N_{2n} denote the number of returns by time 2n. Show that E[N_{2n}] = (2n + 1)·C(2n, n)·(1/2)^{2n} − 1. (c) Use (b) and Stirling's approximation to show that for n large E[N_{2n}] is proportional to √n.
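Since E[N_{2n}] = Σ_{k=1}^n P{S_{2k} = 0} = Σ_{k=1}^n C(2k, k)(1/2)^{2k}, the identity in (b) can be verified exactly in rational arithmetic:

```python
from fractions import Fraction
from math import comb

def u(k):
    # u_{2k} = P{S_{2k} = 0} = C(2k, k) / 4^k, exactly
    return Fraction(comb(2 * k, k), 4**k)

for n in range(1, 30):
    lhs = sum(u(k) for k in range(1, n + 1))   # E[N_{2n}] as a sum of indicators
    rhs = (2 * n + 1) * u(n) - 1               # the claimed closed form
    assert lhs == rhs
```

For example, n = 2 gives 1/2 + 3/8 = 7/8 on both sides.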
Show that the symmetric random walk is recurrent in two dimensions and transient in three dimensions.
For states i, j, k, k ≠ j, let P^n_{ij/k} = P{X_n = j, X_l ≠ k, l = 1, …, n − 1 | X_0 = i}. (a) Explain in words what P^n_{ij/k} represents. (b) Prove that, for i ≠ j, P^n_{ij} = Σ_{k=0}^n P^k_{ii/j} P^{n−k}_{ij/i}.
A store that stocks a certain commodity uses the following (s, S) ordering policy: if its supply at the beginning of a time period is x, then it orders 0 if x ≥ s, and S − x if x < s. …
Suppose that events occur in accordance with a Poisson process with rate λ, and that an event occurring at time s, independent of the past, contributes a random amount having distribution F_s, s ≥ 0. Show that W, the sum of all contributions by time t, is a compound Poisson random variable. That is, …
The number of trials to be performed is a Poisson random variable with mean λ. Each trial has n possible outcomes and, independent of everything else, results in outcome number i with probability P_i, Σ_{i=1}^n P_i = 1. Let X_j denote the number of outcomes that occur exactly j times, j = 0, 1, …. Compute E[X_j] and Var(X_j).
Consider a star graph consisting of a central vertex and r rays, with one ray consisting of m vertices and the other r − 1 all consisting of n vertices. Let P_m denote the probability that the leaf on the ray of m vertices is the last leaf visited by a particle that starts at 0 and at each step is …
Suppose that r = 3 in Example 1.9(C) and find the probability that the leaf on the ray of size n_1 is the last leaf to be visited.
A particle moves along the graph 0 – 1 – 2 – ⋯ – (n − 1) – n so that at each step it is equally likely to move to any of its neighbors. Starting at 0, show that the expected number of steps it takes to reach n is n². (Hint: Let T_i denote the number of steps it takes to go from vertex i − 1 to vertex i, i = 1, …, n.)
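Following the hint: E[T_1] = 1 since vertex 0 has a single neighbor, while for interior vertices E[T_i] = 1 + (1/2)·0 + (1/2)(E[T_{i−1}] + E[T_i]), which rearranges to E[T_i] = 2 + E[T_{i−1}] = 2i − 1, and the odd numbers sum to n². A sketch of that recursion:

```python
def expected_steps_to_n(n):
    # accumulate E[T_i] = 2i - 1 via the recursion t <- t + 2, t_1 = 1
    t, total = 0, 0
    for i in range(1, n + 1):
        t = 1 if i == 1 else t + 2
        total += t
    return total

for n in range(1, 50):
    assert expected_steps_to_n(n) == n * n
```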
In Example 1.9(A), determine the expected number of steps until all the states 1, 2, …, m are visited. (Hint: Let X_i denote the number of additional steps after i of these states have been visited until a total of i + 1 of them have been visited, i = 0, 1, …, m − 1, and make use of Problem 1.25.)
Let X_1, X_2, … be a sequence of independent and identically distributed continuous random variables. Say that a peak occurs at time n if X_{n−1} < X_n > X_{n+1}. Argue that the proportion of time that a peak occurs is, with probability 1, equal to 1/3.
Use Jensen's inequality to prove that the arithmetic mean is at least as large as the geometric mean. That is, for nonnegative x_i, show that (1/n)Σ_{i=1}^n x_i ≥ (∏_{i=1}^n x_i)^{1/n}.
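The intended argument, sketched for x_i > 0 (the case where some x_i = 0 is immediate, since the geometric mean is then 0): apply Jensen's inequality to the concave function ln and exponentiate.

```latex
\frac{1}{n}\sum_{i=1}^{n}\ln x_i \;\le\; \ln\!\left(\frac{1}{n}\sum_{i=1}^{n}x_i\right)
\quad\Longrightarrow\quad
\left(\prod_{i=1}^{n}x_i\right)^{1/n} \;\le\; \frac{1}{n}\sum_{i=1}^{n}x_i .
```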
Let X be a random variable with probability density function f(x), and let M(t) = E[e^{tX}] be its moment generating function. The tilted density function f_t is defined by f_t(x) = e^{tx}f(x)/M(t). Let X_t have density function f_t. (a) Show that for any function h(x), E[h(X)] = M(t)E[exp{−tX_t}h(X_t)]. (b) Show that, for …
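Part (a) is an algebraic identity: M(t)e^{−tx}f_t(x) collapses back to f(x). A discrete analogue, with an arbitrary small pmf standing in for the density (an illustrative assumption), makes it easy to verify numerically:

```python
import math

f = {0: 0.2, 1: 0.5, 3: 0.3}          # an arbitrary small pmf (assumption)
t = 0.7
M = sum(math.exp(t * x) * p for x, p in f.items())        # M(t) = E[e^{tX}]
ft = {x: math.exp(t * x) * p / M for x, p in f.items()}   # tilted pmf f_t

h = lambda x: x * x + 1               # any test function h
lhs = sum(h(x) * p for x, p in f.items())                 # E[h(X)]
rhs = M * sum(math.exp(-t * x) * h(x) * p for x, p in ft.items())
assert abs(lhs - rhs) < 1e-12         # the identity of part (a)
```

Tilting shifts mass toward large x for t > 0, which is why this identity underlies importance-sampling estimates of tail probabilities.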
If X_1 and X_2 are independent nonnegative continuous random variables, show that P{X_1 < X_2 | min(X_1, X_2) = t} = λ_1(t)/(λ_1(t) + λ_2(t)), where λ_i(t) is the failure rate function of X_i.
Derive the distribution of the ith record value for an arbitrary continuous distribution F (see Example 1.6(B)).
Showing 5500 - 5600 of 6914