An Introduction to Stochastic Modeling, 3rd Edition, Samuel Karlin and Howard M. Taylor: Solutions
3.4. Determine the communicating classes and the period of each state of the Markov chain on states {0, 1, ..., 5} whose transition probability matrix is given. [Matrix garbled in source.]
3.3. A Markov chain on states {0, 1, 2, 3, 4, 5} has transition probability matrices (a) and (b). Find all communicating classes. [Matrices garbled in source.]
3.2. Which states are transient and which are recurrent in the Markov chain whose transition probability matrix is given? [Matrix garbled in source.]
3.1. A Markov chain has a transition probability matrix [garbled in source]. For which integers n = 1, 2, ..., 20 is it true that [condition garbled in source]? What is the period of the Markov chain? Hint: One need not compute the actual probabilities. See Section 1.1.
2.8. An airline reservation system has a single computer, which breaks down on any given day with probability p. It takes two days to restore a failed computer to normal service. Form a Markov chain by taking as states the pairs (x, y), where x is the number of machines in operating condition at ...
2.7. Customers arrive for service and take their place in a waiting line. There is a single service facility, and a customer undergoing service at the beginning of a period will complete service and depart at the end of the period with probability β and will continue service into the next period ...
2.6. Consider a computer system that fails on a given day with probability p and remains "up" with probability q = 1 − p. Suppose the repair time is a random variable N having the probability mass function p(k) = β(1 − β)^(k−1) for k = 1, 2, ..., where 0 < β < 1 and α = 1 − β. Determine the long run probability that the ...
2.5. Suppose that the weather on any day depends on the weather conditions during the previous two days. We form a Markov chain with the following states: state (S, S) if it was sunny both today and yesterday; state (S, C) if it was sunny yesterday but cloudy today; state (C, S) if it was cloudy ...
2.4. A component of a computer has an active life, measured in discrete units, that is a random variable T, where [distribution garbled in source]. Suppose that one starts with a fresh component, and each component is replaced by a new component upon failure. Let X_n be the remaining life of the component in service at the end of ...
2.3. Suppose that a production process changes state according to a Markov process whose transition probability matrix is given. [Matrix garbled in source.] (a) Determine the limiting distribution for the process. (b) Suppose that states 0 and 1 are "in-control," while states 2 and 3 are deemed "out-of-control." In the long ...
2.2. A system consists of two components operating in parallel: the system functions if at least one of the components is operating. In any single period, if both components are operating at the beginning of the period, then each will fail, independently, during the period with probability a. When ...
2.1. Consider a discrete-time periodic review inventory model (see III, Section 3.1), and let ξ_n be the total demand in period n. Let X_n be the inventory quantity on hand at the end of period n. Instead of following an (s, S) policy, a (q, Q) policy will be used: if the stock level at the end of a ...
2.8. At the end of a month, a large retail store classifies each receivable account according to 0: current; 1: 30-60 days overdue; 2: 60-90 days overdue; 3: over 90 days. Each such account moves from state to state according to a Markov chain with transition probability matrix [garbled in source]. In the long run, what ...
2.7. Consider a machine whose condition at any time can be observed and classified as being in one of the following three states: state 1, good operating order; state 2, deteriorated operating order; state 3, in repair. We observe the condition of the machine at the end of each period in a sequence of ...
2.6. A component of a computer has an active life, measured in discrete units, that is a random variable T. Suppose one starts with a fresh component, and each component is replaced by a new component upon failure. Determine the long run probability that a failure occurs in a given period.
2.5. From purchase to purchase, a particular customer switches brands among products A, B, and C according to a Markov chain whose transition probability matrix (rows and columns ordered A, B, C) is P = [[0.6, 0.2, 0.2], [0.1, 0.7, 0.2], [0.1, 0.1, 0.8]]. In the long run, what fraction of the time does this customer purchase brand A?
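Where the matrix survives legibly, the long-run answer can be checked numerically. As an illustration (not part of the original problem set), the sketch below finds the stationary distribution of the brand-switching matrix of Problem 2.5 by power iteration; `stationary` is a hypothetical helper name:

```python
def stationary(P, iters=500):
    """Approximate the stationary distribution of transition matrix P
    by iterating pi <- pi P from a uniform starting vector."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Rows/columns ordered A, B, C, as in Problem 2.5.
P = [[0.6, 0.2, 0.2],
     [0.1, 0.7, 0.2],
     [0.1, 0.1, 0.8]]

pi = stationary(P)
print(pi)  # approximately [0.2, 0.3, 0.5]
```

Power iteration converges here because the chain is regular; with this matrix the long-run fraction of purchases of brand A comes out to 0.2.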
2.4. Section 2.2 determined the availability R of a certain computer system to be R_1 = 1/(1 + p) for one repair facility and R_2 = (1 + p)/(1 + p + p²) for two repair facilities, where p is the computer failure probability on a single day. Compute and compare R_1 and R_2 for p = 0.01, 0.02, 0.05, and 0.10.
2.3. Determine the average fraction inspected, AFI, and the average outgoing quality, AOQ, of Section 2.3 for p = 0, 0.05, 0.10, 0.15, ..., 0.50 when (a) r = 10 and i = 5; (b) r = 5 and i = 10.
2.2. In the reliability example of Section 2.2, what fraction of time is the repair facility idle? When a second repair facility is added, what fraction of time is each facility idle?
2.1. On a Southern Pacific island, a sunny day is followed by another sunny day with probability 0.9, whereas a rainy day is followed by another rainy day with probability 0.2. Supposing that there are only sunny or rainy days, in the long run on what fraction of days is it sunny?
1.13. A Markov chain has the transition probability matrix P = [[0.4, 0.4, 0.2], [0.6, 0.2, 0.2], [0.4, 0.2, 0.4]] on states 0, 1, 2. After a long period of time, you observe the chain and see that it is in state 1. What is the conditional probability that the previous state was state 2?
1.12. Let P be the transition probability matrix of a finite state regular Markov chain, and let Π be the matrix whose rows are the stationary distribution π. Define Q = P − Π. (a) Show that P^n = Π + Q^n. (b) For the particular matrix P given [garbled in source], obtain an explicit expression for Q^n and then for P^n.
1.11. Suppose that a production process changes state according to a Markov process whose transition probability matrix is given. [Matrix garbled in source.] (a) Determine the limiting probabilities π_0 and π_1. (b) Suppose that states 0 and 1 are "In-Control," while states 2 and 3 are deemed "Out-of-Control." In the long ...
1.10. Consider a Markov chain with transition probability matrix [garbled in source], where 0 < ... [conditions garbled]. Show that the transition probability matrix is the circulant whose first row is (p_0, p_1, p_2, ..., p_N) and in which each subsequent row is the cyclic right shift of the row above.
1.9. Determine the long run, or limiting, distribution for the Markov chain whose transition probability matrix is given. [Matrix garbled in source.]
Show that the given transition probability matrix is regular and compute the limiting distribution. [Matrix garbled in source.]
1.7. Determine the limiting distribution for the Markov chain whose transition probability matrix is given. [Matrix garbled in source.]
1.6. Determine the following limits in terms of the transition probability matrix P = ||P_ij|| and limiting distribution π = ||π_j|| of a finite state regular Markov chain {X_n}: (a) lim_(n→∞) Pr{X_(n+1) = j | X_0 = i}; (b) lim_(n→∞) Pr{X_n = k, X_(n+1) = j | X_0 = i}; (c) lim_(n→∞) Pr{X_(n−1) = k, X_n = j | X_0 = i}.
1.5. The four towns A, B, C, and D are connected by railroad lines as shown in a diagram [not reproduced in source]. Each day, in whichever town it is in, a train chooses one of the lines out of that town at random and traverses it to the next town, where the process repeats the next day. In the long run, what is ...
1.4. A finite state regular Markov chain has transition probability matrix P = ||P_ij|| and limiting distribution π = ||π_j||. In the long run, what fraction of the transitions are from a prescribed state k to a prescribed state m?
1.3. A Markov chain has the transition probability matrix [garbled in source], where a_i ≥ 0 for i = 1, ..., 6 and a_1 + ... + a_6 = 1. Determine the limiting probability of being in state 0.
1.2. Five balls are distributed between two urns, labeled A and B. Each period, one of the five balls is selected at random, and whichever urn it's in, it is moved to the other urn. In the long run, what fraction of time is urn A empty?
1.1. Five balls are distributed between two urns, labeled A and B. Each period, an urn is selected at random, and if it is not empty, a ball from that urn is removed and placed into the other urn. In the long run, what fraction of time is urn A empty?
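Urn problems like 1.1 and 1.2 are easy to sanity-check by simulation. The sketch below (illustrative code; `fraction_urn_a_empty` is a hypothetical helper) simulates Problem 1.1; for that chain the stationary distribution works out to be uniform over the six possible counts of balls in A, so urn A should be empty about 1/6 of the time:

```python
import random

def fraction_urn_a_empty(balls=5, steps=200_000, seed=1):
    """Simulate Problem 1.1: each period pick an urn at random;
    if it is nonempty, move one of its balls to the other urn.
    Returns the fraction of periods in which urn A is empty."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    a = balls                  # balls currently in urn A; the rest are in B
    empty = 0
    for _ in range(steps):
        if rng.random() < 0.5:   # urn A chosen
            if a > 0:
                a -= 1
        else:                    # urn B chosen
            if a < balls:
                a += 1
        if a == 0:
            empty += 1
    return empty / steps

f = fraction_urn_a_empty()
print(round(f, 3))  # close to 1/6 ≈ 0.167
```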
1.10. A bus in a mass transit system is operating on a continuous route with intermediate stops. The arrival of the bus at a stop is classified into one of three states: 1, early arrival; 2, on-time arrival; 3, late arrival. Suppose that the successive states form a Markov chain with transition ...
1.9. Determine the limiting distribution for the Markov chain whose transition probability matrix is given. [Matrix garbled in source.]
1.8. Suppose that the social classes of successive generations in a family follow a Markov chain with transition probability matrix P = [[0.7, 0.2, 0.1], [0.2, 0.6, 0.2], [0.1, 0.4, 0.5]], with rows indexed by the father's class and columns by the son's class, in the order lower, middle, upper. What fraction of families are upper class in the long run?
1.7. A Markov chain on the states 0, 1, 2, 3 has the transition probability matrix P = [[0.1, 0.2, 0.3, 0.4], [0, 0.3, 0.3, 0.4], [0, 0, 0.6, 0.4], [1, 0, 0, 0]]. Determine the corresponding limiting distribution.
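For small chains with legible matrices, such as the social-class chain of Problem 1.8, the limiting distribution can be computed exactly by solving πP = π together with Σπ = 1. The sketch below is illustrative (`stationary_exact` is a hypothetical helper) and works over the rationals so the answer comes out as exact fractions:

```python
from fractions import Fraction

def stationary_exact(P):
    """Solve pi P = pi with sum(pi) = 1 exactly, by Gaussian
    elimination over the rationals (small chains only)."""
    n = len(P)
    # System A x = b: rows of (P^T - I), with the last row replaced by sum = 1.
    A = [[Fraction(P[i][j]) - (1 if i == j else 0) for i in range(n)]
         for j in range(n)]
    b = [Fraction(0)] * n
    A[-1] = [Fraction(1)] * n
    b[-1] = Fraction(1)
    # Forward elimination with pivot search.
    for col in range(n):
        piv = next(r for r in range(col, n) if A[r][col] != 0)
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    x = [Fraction(0)] * n
    for r in range(n - 1, -1, -1):
        s = b[r] - sum(A[r][c] * x[c] for c in range(r + 1, n))
        x[r] = s / A[r][r]
    return x

# Father's class -> son's class, order lower, middle, upper.
P = [[Fraction(7, 10), Fraction(2, 10), Fraction(1, 10)],
     [Fraction(2, 10), Fraction(6, 10), Fraction(2, 10)],
     [Fraction(1, 10), Fraction(4, 10), Fraction(5, 10)]]

pi = stationary_exact(P)
print(pi)  # [Fraction(6, 17), Fraction(7, 17), Fraction(4, 17)]
```

With the matrix as reconstructed, the long-run upper-class fraction comes out to 4/17 ≈ 0.235.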
1.6. Compute the limiting distribution for the given transition probability matrix. [Matrix garbled in source.]
1.5. Consider the Markov chain on states 0, 1, 2, 3 whose transition probability matrix is P = [[0.1, 0.5, 0, 0.4], [0, 0, 1, 0], [0, 0, 0, 1], [1, 0, 0, 0]]. Determine the limiting distribution for the process.
1.4. A Markov chain X_0, X_1, X_2, ... has the transition probability matrix [garbled in source]. Every period that the process spends in state 0 incurs a cost of $2. Every period that the process spends in state 1 incurs a cost of $5. Every period that the process spends in state 2 incurs a cost of $3. What is the long run ...
1.3. A Markov chain X_0, X_1, X_2, ... on states 0, 1, 2 has the transition probability matrix P = [[0.1, 0.1, 0.8], [0.2, 0.2, 0.6], [0.3, 0.3, 0.4]]. What fraction of time, in the long run, does the process spend in state 1?
1.2. A Markov chain X_0, X_1, X_2, ... on states 0, 1, 2 has the transition probability matrix P = [[0.6, 0.3, 0.1], [0.3, 0.3, 0.4], [0.4, 0.1, 0.5]]. Determine the limiting distribution.
1.1. A Markov chain X_0, X_1, X_2, ... on states 0, 1, 2 has the transition probability matrix P = [[0.7, 0.2, 0.1], [0, 0.6, 0.4], [0.5, 0, 0.5]]. Determine the limiting distribution.
9.10. Suppose that in a branching process the number of offspring of an initial particle has a distribution whose generating function is f(s). Each member of the first generation has a number of offspring whose distribution has generating function g(s). The next generation has generating function ...
9.9. One-fourth of the married couples in a distant society have no children at all. The other three-fourths of families continue to have children until the first girl and then cease childbearing. Assume that each child is equally likely to be a boy or girl. (a) For k = 0, 1, 2, ..., what is the ...
9.8. Consider a branching process whose offspring follow the geometric distribution p_k = (1 − c)c^k for k = 0, 1, ..., where 0 < c < 1. Determine the probability of eventual extinction.
9.7. Families in a certain society choose the number of children that they will have according to the following rule: if the first child is a girl, they have exactly one more child; if the first child is a boy, they continue to have children until the first girl and then cease childbearing. Let ξ ...
9.6. Let φ(s) = as² + bs + c, where a, b, c are positive and φ(1) = 1. Assume that the probability of extinction is u_∞, where 0 < u_∞ < 1. Prove that u_∞ = c/a.
9.5. At time 0, a blood culture starts with one red cell. At the end of one minute, the red cell dies and is replaced by one of the following combinations with the probabilities as indicated [list garbled in source]. Each red cell lives for one minute and gives birth to offspring in the same way as the parent cell. Each ...
9.4. Let φ(s) = 1 − p(1 − s)^β, where p and β are constants with 0 < p, β < 1. Prove that the n-fold iterate is φ_n(s) = 1 − p^(1+β+...+β^(n−1)) (1 − s)^(β^n) for n = 1, 2, ....
9.3. Consider a large region consisting of many subareas. Each subarea contains a branching process that is characterized by a Poisson distribution with parameter λ. Assume, furthermore, that the value of λ varies with the subarea, and its distribution over the whole region is that of a gamma ...
9.2. One-fourth of the married couples in a far-off society have exactly three children. The other three-fourths of families continue to have children until the first boy and then cease childbearing. Assume that each child is equally likely to be a boy or girl. What is the probability that the male line of descent of a particular husband will eventually die out?
9.1. One-fourth of the married couples in a far-off society have no children at all. The other three-fourths of families have exactly three children, each child equally likely to be a boy or a girl. What is the probability that the male line of descent of a particular husband will eventually die out?
9.4. Let φ(s) be the generating function of an offspring random variable ξ. Let Z be a random variable whose distribution is that of ξ, but conditional on ξ > 0; that is, Pr{Z = k} = Pr{ξ = k | ξ > 0} for k = 1, 2, .... Express the generating function for Z in terms of φ.
9.3. Determine the probability generating function corresponding to the offspring distribution in which each individual produces 0 or N direct descendants, with probabilities p and q, respectively.
9.2. Determine the probability generating function for the offspring distribution in which an individual either dies, with probability p_0, or is replaced by two progeny, with probability p_2, where p_0 + p_2 = 1.
9.1. Suppose that the offspring distribution is Poisson with mean λ = 1.1. Compute the extinction probabilities u_n = Pr{X_n = 0 | X_0 = 1} for n = 0, 1, ..., 5. What is u_∞, the probability of ultimate extinction?
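Extinction probabilities like those in Problem 9.1 come from iterating the offspring probability generating function: u_(n+1) = f(u_n) with u_0 = 0, and u_∞ is the smallest nonnegative fixed point of f. For Poisson(λ) offspring, f(s) = exp(λ(s − 1)). A minimal sketch (hypothetical helper name):

```python
import math

def extinction_probs(lam, n_max):
    """Iterate u_{n+1} = f(u_n), where f(s) = exp(lam*(s - 1)) is the
    Poisson(lam) offspring pgf and u_0 = 0; u_n = P(extinct by generation n)."""
    u = 0.0
    out = [u]
    for _ in range(n_max):
        u = math.exp(lam * (u - 1.0))
        out.append(u)
    return out

us = extinction_probs(1.1, 5)
print([round(u, 4) for u in us])

# Iterating far enough converges to the ultimate extinction probability,
# which for lam = 1.1 lies between 0.82 and 0.83.
u_inf = extinction_probs(1.1, 2000)[-1]
print(round(u_inf, 4))
```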
8.4. Let {X_n} be a branching process with mean family size μ. Show that Z_n = X_n/μ^n is a nonnegative martingale. Interpret the maximal inequality as applied to {Z_n}.
8.3. Families in a certain society choose the number of children that they will have according to the following rule: if the first child is a girl, they have exactly one more child; if the first child is a boy, they continue to have children until the first girl, and then cease childbearing. (a) For ...
8.2. Let Z = Σ_(n=0)^∞ X_n be the total family size in a branching process whose offspring distribution has mean μ = E[ξ] < 1. Assuming that X_0 = 1, show that E[Z] = 1/(1 − μ).
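The identity E[Z] = 1/(1 − μ) is straightforward to check by simulation. The sketch below is illustrative code (Poisson(0.5) offspring is an assumption made for the example, giving expected total family size 2); it averages the total progeny over many independent runs:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's method for a Poisson(lam) variate (fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def total_family_size(mu, rng):
    """Total progeny (ancestor included) of a branching process
    with Poisson(mu) offspring, starting from one individual."""
    total = current = 1
    while current > 0:
        nxt = sum(poisson(mu, rng) for _ in range(current))
        total += nxt
        current = nxt
    return total

rng = random.Random(7)  # fixed seed for reproducibility
mu = 0.5
n = 50_000
avg = sum(total_family_size(mu, rng) for _ in range(n)) / n
print(round(avg, 3))  # near 1/(1 - mu) = 2
```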
8.1. Each adult individual in a population produces a fixed number M of offspring and then dies. A fixed number L of these remain at the location of the parent. These local offspring will either all grow to adulthood, which occurs with a fixed probability β, or all will die, which has probability 1 − β. ...
8.4. At each stage of an electron multiplier, each electron, upon striking the plate, generates a Poisson distributed number of electrons for the next stage. Suppose the mean of the Poisson distribution is λ. Determine the mean and variance for the number of electrons in the nth stage.
8.3. Suppose a parent has no offspring with probability [garbled in source] and has two offspring with probability [garbled in source]. If a population of such individuals begins with a single parent and evolves as a branching process, determine u_n, the probability that the population is extinct by the nth generation, for n = 1, ...
8.2. The number of offspring of an individual in a population is 0, 1, or 2 with respective probabilities a > 0, b > 0, and c > 0, where a + b + c = 1. Express the mean and variance of the offspring distribution in terms of b and c.
8.1. A population begins with a single individual. In each generation, each individual in the population dies with probability [garbled in source] or doubles with probability [garbled in source]. Let X_n denote the number of individuals in the population in the nth generation. Find the mean and variance of X_n.
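For a branching process with offspring mean μ and variance σ², starting from X_0 = 1, the standard formulas are E[X_n] = μ^n and Var(X_n) = σ² μ^(n−1) (μ^n − 1)/(μ − 1) for μ ≠ 1 (and n σ² when μ = 1). For Poisson(λ) offspring, μ = σ² = λ. A small sketch applying them (hypothetical helper name; n ≥ 1 assumed):

```python
def branching_moments(mu, sigma2, n):
    """Mean and variance of the generation size X_n (n >= 1) for a
    branching process with offspring mean mu and variance sigma2, X_0 = 1."""
    mean = mu ** n
    if mu == 1:
        var = n * sigma2
    else:
        var = sigma2 * mu ** (n - 1) * (mu ** n - 1) / (mu - 1)
    return mean, var

# Poisson(lam) offspring: mean and variance of the offspring count are both lam.
lam = 2.0
mean3, var3 = branching_moments(lam, lam, 3)
print(mean3, var3)  # 8.0 56.0
```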
7.5. Computer Challenge. Consider the partial sums [definition garbled in source]. Can you find an explicit formula for the mean time v_k for the partial sums starting from S_0 = k to exit the interval [0, N] = {0, 1, ..., N}? In another context, the answer was found by computing it in a variety of special cases. (Note: A simple ...
7.4. The possible states for a Markov chain are the integers 0, 1, ..., N, and if the chain is in state j, at the next step it is equally likely to be in any of the states 0, 1, ..., j − 1. Formally, [transition law garbled in source]. (a) Determine the fundamental matrix for the transient states 1, 2, ..., N. (b) Determine the ...
7.3. Let X_n be an absorbing Markov chain whose transition probability matrix takes the form given in equation (7.1). Let W be the fundamental matrix, the matrix inverse of I − Q. Let T = min{n ≥ 0; r ≤ X_n ≤ N} be the random time of absorption (recall that states r, r + 1, ..., N are the absorbing states). ...
7.2. A zero-seeking device operates as follows: if it is in state j at time n, then at time n + 1 its position is 0 with probability 1/j, and its position is k (where k is one of the states 1, 2, ..., j − 1) with probability 2k/j². State 0 is absorbing. Find the inverse of the I − Q matrix.
7.1. A zero-seeking device operates as follows: if it is in state m at time n, then at time n + 1 its position is uniformly distributed over the states 0, 1, ..., m − 1. State 0 is absorbing. Find the inverse of the I − Q matrix for the transient states 1, 2, ..., m.
7.2. Consider the random walk Markov chain whose transition probability matrix is given. [Matrix garbled in source.] The transition probability matrix corresponding to the nonabsorbing states is also given [garbled in source]. Calculate the matrix inverse to I − Q, and from this determine (a) the probability of absorption into state 0 starting from state ...
7.1. Consider the Markov chain whose transition probability matrix is given. [Matrix garbled in source.] The transition probability matrix corresponding to the nonabsorbing states is also given [garbled in source]. Calculate the matrix inverse to I − Q, and from this determine (a) the probability of absorption into state 0 starting from state 1; (b) the mean ...
6.3. Computer Challenge. You have two urns, A and B, with a balls in A and b in B. You pick an urn at random, each urn being equally likely, and move a ball from it to the other urn. You do this repeatedly. The game ends when either of the urns becomes empty. The number of balls in A at the nth ...
6.2. Consider the Markov chain {X_n} whose transition matrix is given [garbled in source], where α > 0, β > 0, and α + β = 1. Determine the mean time to reach state 3 starting from state 0. That is, find E[T | X_0 = 0], where T = min{n ≥ 0; X_n = 3}.
6.1. Consider the random walk Markov chain whose transition probability matrix is given. [Matrix garbled in source.] Starting in state 1, determine the mean time until absorption. Do this first using the basic first step approach of equations (4.6), and second using the particular results for a random walk given in equation ...
6.4. Consider the random walk Markov chain whose transition probability matrix is given. [Matrix garbled in source.] Starting in state 1, determine the mean time until absorption. Do this first using the basic first step approach of equations (4.6), and second using the particular formula for v_i that follows equation (6.18). ...
6.3. Players A and B each have $50 at the beginning of a game in which each player bets $1 at each play, and the game continues until one player is broke. Suppose there is a constant probability p = 0.492929... that Player A wins on any given bet. What is the mean duration of the game?
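For a win probability p ≠ 1/2 with q = 1 − p, the standard mean-duration formula for the gambler's ruin chain is v_i = i/(q − p) − (N/(q − p)) · (1 − (q/p)^i)/(1 − (q/p)^N), where i is Player A's initial capital and N the combined capital. A sketch evaluating it for Problem 6.3's numbers (the repeating decimal is truncated to eight places here):

```python
def mean_duration(p, i, n):
    """Expected length of a gambler's-ruin game: Player A starts with i,
    combined capital n, and wins each $1 bet with probability p (p != 1/2)."""
    q = 1.0 - p
    r = q / p
    return i / (q - p) - (n / (q - p)) * (1 - r ** i) / (1 - r ** n)

# Both players start with $50, so i = 50 and n = 100.
d = mean_duration(0.49292929, 50, 100)
print(round(d))
```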
6.2. Customer accounts receivable at Smith Company are classified each month according to 0: current; 1: 30 to 60 days past due; 2: 60 to 90 days past due; 3: over 90 days past due. Consider a particular customer account and suppose that it evolves month to month as a Markov chain {X_n} whose transition ...
6.1. A rat is put into the linear maze as shown [diagram not reproduced in source]. (a) Assume that the rat is equally likely to move right or left at each step. What is the probability that the rat finds the food before getting shocked? (b) As a result of learning, at each step the rat moves to the right with probability p > 1/2 and ...
6.3. Fix a state k, where 0 < k < N. Then W_ik = E[Σ_n 1{X_n = k} | X_0 = i] satisfies a system of first step analysis equations [garbled in source], where 1{X_n = k} = 1 if X_n = k and 0 if X_n ≠ k.
6.2. The mean hitting time v_i = E[T | X_0 = i] satisfies a system of first step analysis equations [garbled in source]. The solution is [garbled in source], where ρ_i is given in (6.10).
6.1. The probability of gambler's ruin u_i = Pr{X_T = 0 | X_0 = i} satisfies the first step analysis equation [garbled in source], with boundary conditions u_0 = 1 and u_N = 0. The solution is [garbled in source].
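The equations lost to garbling in result 6.1 are the standard ones; written out, with win probability p, q = 1 − p, and boundary conditions u_0 = 1, u_N = 0, they read:

```latex
% First step analysis for the ruin probability u_i = Pr( absorption at 0 | X_0 = i ):
u_i = p\,u_{i+1} + q\,u_{i-1}, \qquad 1 \le i \le N-1, \qquad u_0 = 1,\; u_N = 0.
% Solution:
u_i = \begin{cases}
\dfrac{(q/p)^i - (q/p)^N}{1 - (q/p)^N}, & p \neq q,\\[1.5ex]
1 - \dfrac{i}{N}, & p = q = \tfrac{1}{2}.
\end{cases}
```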
5.5. Let {X_n} be a random walk for which zero is an absorbing state and such that from a positive state the process is equally likely to go up or down one unit. The transition probability matrix is given by (5.9) with r_0 = 1 and p_i = q_i = 1/2 for i ≥ 1. (a) Show that {X_n} is a nonnegative ...
5.4. Martha has a fair die with the usual six sides. She throws the die and records the number. She throws the die again and adds the second number to the first. She repeats this until the cumulative sum of all the tosses first exceeds 10. What is the probability that she stops at a cumulative sum
5.3. A Batch Processing Model. Customers arrive at a facility and wait there until K customers have accumulated. Upon the arrival of the Kth customer, all are instantaneously served, and the process repeats. Let ξ_0, ξ_1, ... denote the arrivals in successive periods, assumed to be independent random ...
5.2. A component of a computer has an active life, measured in discrete units, that is a random variable T, where Pr{T = k} = a_k for k = 1, 2, .... Suppose one starts with a fresh component, and each component is replaced by a new component upon failure. Let X_n be the age of the component in ...
5.1. As a special case of the successive maxima Markov chain whose transition probabilities are given in equation (5.5), consider the Markov chain whose transition probability matrix is given. [Matrix garbled in source.] Starting in state 0, show that the mean time until absorption is v_0 = 1/a_3.
5.9. In a simplified model of a certain television game show, suppose that the contestant, having won k dollars, will at the next play have k + 1 dollars with probability q and be put out of the game and leave with nothing with probability p = 1 − q. Suppose that the contestant begins with one ...
5.8. As a special case, consider a discrete-time queueing model in which at most a single customer arrives in any period and at most a single customer completes service. Suppose that in any single period, a single customer arrives with probability a, and no customers arrive with probability 1 − a. ...
5.7. Consider the random walk Markov chain whose transition probability matrix is given. [Matrix garbled in source.] Starting in state 1, determine the probability that the process is absorbed into state 0. Do this first using the basic first step approach of equations (4.3) and (4.4), and second using the particular results ...
5.6. A baseball trading card that you have for sale may be quite valuable. Suppose that the successive bids ξ_1, ξ_2, ... that you receive are independent random variables with the geometric distribution Pr{ξ = k} = 0.01(0.99)^k for k = 0, 1, .... If you decide to accept any bid over $100, how many bids, ...
5.5. Suppose that the items produced by a certain process are each graded as defective or good, and that whether a particular item is defective or good depends on the quality of the previous item. To be specific, suppose that a defective item is followed by another defective item with ...
5.4. A coin is tossed repeatedly until three heads in a row appear. Let X_n record the current number of successive heads that have appeared. That is, X_n = 0 if the nth toss resulted in tails; X_n = 1 if the nth toss was heads and the (n − 1)st toss was tails; and so on. Model X_n as a success runs ...
5.3. Determine P^n for n = 2, 3, 4, 5 for the Markov chain whose transition probability matrix is P = [[0.4, 0.6], [0.7, 0.3]].
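Computing the first few matrix powers needs only repeated multiplication. A plain-Python sketch (hypothetical `mat_mul` helper) for the 2x2 chain of Problem 5.3:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[0.4, 0.6],
     [0.7, 0.3]]

Pn = P
for n in range(2, 6):
    Pn = mat_mul(Pn, P)  # Pn now holds P^n
    print(n, [[round(x, 4) for x in row] for row in Pn])
```

The successive powers visibly settle toward a matrix with identical rows, the limiting distribution of the chain.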
5.2. Determine the gambler's ruin probability for Player A when both players begin with $50, bet $1 on each play, and the win probability for Player A in each game is (a) p = 0.49292929; (b) p = 0.5029237. (See II, Section 2.) What are the gambler's ruin probabilities when each player begins ...
5.1. The probability of the thrower winning in the dice game called "craps" is p = 0.4929. Suppose Player A is the thrower and begins the game with $5, and Player B, his opponent, begins with $10. What is the probability that Player A goes bankrupt before Player B? Assume that the bet is $1 per ...
4.19. Computer Challenge. Let N be a positive integer and let Z_1, ..., Z_N be independent random variables, each having the geometric distribution [garbled in source]. Since these are discrete random variables, the maximum among them may be unique, or there may be ties for the maximum. Let p_N be the probability that ...
4.18. Time-dependent transition probabilities. A well-disciplined man, who smokes exactly one half of a cigar each day, buys a box containing N cigars. He cuts a cigar in half, smokes half, and returns the other half to the box. In general, on a day in which his cigar box contains w whole cigars ...
4.17. The damage X_n of a system subjected to wear is a Markov chain with the transition probability matrix [garbled in source]. The system starts in state 0 and fails when it first reaches state 2. Let T = min{n ≥ 0; X_n = 2} be the time of failure. Use a first step analysis to evaluate φ(s) = E[s^T] for a fixed number 0 ...
4.16. An urn contains 5 tags, of which 3 are red and 2 are green. A tag is randomly selected from the urn and replaced with a tag of the opposite color. This continues until only tags of a single color remain in the urn. Let X_n denote the number of red tags in the urn after the nth draw, with X_0 = 3. ...
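For p ≠ 1/2, the classical ruin probability starting with i of the N total dollars is u_i = ((q/p)^i − (q/p)^N)/(1 − (q/p)^N), with q = 1 − p. A sketch applying it to Problem 5.1's numbers (hypothetical helper name):

```python
def ruin_probability(p, i, n):
    """Probability that a gambler who wins each $1 bet with probability p,
    starts with i, and plays against combined capital n goes broke (p != 1/2)."""
    r = (1.0 - p) / p
    return (r ** i - r ** n) / (1.0 - r ** n)

# Player A (the craps thrower): p = 0.4929, $5 of the $15 combined capital.
u = ruin_probability(0.4929, 5, 15)
print(round(u, 3))
```

With p = 0.4929, i = 5, and N = 15, the thrower goes broke with probability of about 0.71.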