An Introduction to Stochastic Modeling, 4th Edition, Mark A. Pinsky and Samuel Karlin - Solutions
4.1.10 Consider a Markov chain with transition probability matrix

P =
| p0    p1    p2    …   pN   |
| pN    p0    p1    …   pN−1 |
| pN−1  pN    p0    …   pN−2 |
| …                          |
| p1    p2    p3    …   p0   |

where pi > 0 and p0 + p1 + ⋯ + pN = 1. Determine the limiting distribution.
4.1.9 Determine the long run, or limiting, distribution for the Markov chain on states 0, 1, 2, 3 whose transition probability matrix is given in the text. [Matrix not legible in this extract.]
4.1.8 Show that the transition probability matrix on states 0, 1, 2, 3, 4 given in the text is regular, and compute the limiting distribution. [Matrix not legible in this extract.]
4.1.7 Determine the limiting distribution for the Markov chain on states 0, 1, 2, 3 whose transition probability matrix is given in the text. [Matrix not legible in this extract.]
4.1.6 Determine the following limits in terms of the transition probability matrix P = ‖Pij‖ and limiting distribution π = ‖πj‖ of a finite-state regular Markov chain {Xn}: (a) lim n→∞ Pr{Xn+1 = j | X0 = i}; (b) lim n→∞ Pr{Xn = k, Xn+1 = j | X0 = i}; (c) lim n→∞ Pr{Xn−1 = k, Xn = j | X0 = i}.
4.1.5 The four towns A, B, C, and D are connected by railroad lines as shown in the accompanying diagram. Each day, in whichever town it is in, a train chooses one of the lines out of that town at random and traverses it to the next town, where the process repeats the next day. In the long run, what is …
4.1.4 A finite-state regular Markov chain has transition probability matrix P = ‖Pij‖ and limiting distribution π = ‖πi‖. In the long run, what fraction of the transitions are from a prescribed state k to a prescribed state m?
4.1.3 A Markov chain has the transition probability matrix given in the text, where αi ≥ 0 for i = 1, …, 6 and α1 + ⋯ + α6 = 1. Determine the limiting probability of being in state 0. [Matrix not legible in this extract.]
4.1.2 Five balls are distributed between two urns, labeled A and B. Each period, one of the five balls is selected at random, and whichever urn it’s in, it is moved to the other urn. In the long run, what fraction of time is urn A empty?
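The long-run fraction in 4.1.2 can be checked numerically. The chain (state = number of balls in urn A) is periodic, so plain matrix powers do not converge, but the running time-average of the state distribution does, and it should match the binomial stationary distribution. A minimal pure-Python sketch, assuming only the move-a-random-ball dynamics stated in the problem (the averaging length is an arbitrary choice):

```python
# Chain from 4.1.2: state k = number of balls in urn A, k = 0..5.
# A uniformly chosen ball switches urns, so k -> k-1 w.p. k/5, k -> k+1 w.p. (5-k)/5.
N = 5
P = [[0.0] * (N + 1) for _ in range(N + 1)]
for k in range(N + 1):
    if k > 0:
        P[k][k - 1] = k / N
    if k < N:
        P[k][k + 1] = (N - k) / N

pi = [1.0] + [0.0] * N    # start with urn A empty (any start works for time averages)
avg = [0.0] * (N + 1)     # running average of the distributions pi P^n
steps = 20000
for _ in range(steps):
    avg = [a + x for a, x in zip(avg, pi)]
    pi = [sum(pi[i] * P[i][j] for i in range(N + 1)) for j in range(N + 1)]
avg = [a / steps for a in avg]

# Long-run fraction of time urn A is empty; the stationary distribution is
# Binomial(5, 1/2), so this should be close to C(5,0)/2^5 = 1/32 = 0.03125.
print(round(avg[0], 4))
```

The time-average is used deliberately: this chain has period 2, so pi itself oscillates forever, while the Cesàro average still converges to the stationary distribution.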
4.1.1 Five balls are distributed between two urns, labeled A and B. Each period, an urn is selected at random, and if it is not empty, a ball from that urn is removed and placed into the other urn. In the long run what fraction of time is urn A empty?
4.1.10 A bus in a mass transit system is operating on a continuous route with intermediate stops. The arrival of the bus at a stop is classified into one of three states, namely (1) early arrival, (2) on-time arrival, and (3) late arrival. Suppose that the successive states form a Markov chain with …
4.1.9 Determine the limiting distribution for the Markov chain on states 0, 1, 2 whose transition probability matrix is given in the text. [Matrix not legible in this extract.]
4.1.8 Suppose that the social classes of successive generations in a family follow a Markov chain with transition probability matrix (father's class indexing rows, son's class indexing columns)

          Lower  Middle  Upper
Lower   |  0.7    0.2     0.1 |
Middle  |  0.2    0.6     0.2 |
Upper   |  0.1    0.4     0.5 |

What fraction of families are upper class in the long run?
4.1.7 A Markov chain on the states 0, 1, 2, 3 has the transition probability matrix

P =
| 0.1  0.2  0.3  0.4 |
| 0    0.3  0.3  0.4 |
| 0    0    0.6  0.4 |
| 1    0    0    0   |

Determine the corresponding limiting distribution.
4.1.6 Compute the limiting distribution for the transition probability matrix given in the text. [Matrix not legible in this extract.]
4.1.5 Consider the Markov chain on states 0, 1, 2, 3 whose transition probability matrix is given by

P =
| 0.1  0.5  0  0.4 |
| 0    0    1  0   |
| 0    0    0  1   |
| 1    0    0  0   |

Determine the limiting distribution for the process.
4.1.4 A Markov chain X0, X1, X2, … has the transition probability matrix given in the text. Every period that the process spends in state 0 incurs a cost of $2. Every period that the process spends in state 1 incurs a cost of $5. Every period that the process spends in state 2 incurs a cost of $3. What is the long …
4.1.3 A Markov chain X0, X1, X2, … has the transition probability matrix

P =
| 0.1  0.1  0.8 |
| 0.2  0.2  0.6 |
| 0.3  0.3  0.4 |

What fraction of time, in the long run, does the process spend in state 1?
4.1.2 A Markov chain X0, X1, X2, … has the transition probability matrix

P =
| 0.6  0.3  0.1 |
| 0.3  0.3  0.4 |
| 0.4  0.1  0.5 |

Determine the limiting distribution.
4.1.1 A Markov chain X0, X1, X2, … has the transition probability matrix

P =
| 0.7  0.2  0.1 |
| 0    0.6  0.4 |
| 0.5  0    0.5 |

Determine the limiting distribution.
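For a regular chain like the one in 4.1.1, the limiting distribution can be approximated by iterating π ← πP until it stabilizes; the answer is independent of the starting state. A pure-Python sketch using the 3×3 matrix from problem 4.1.1:

```python
# Transition matrix from problem 4.1.1 (states 0, 1, 2).
P = [[0.7, 0.2, 0.1],
     [0.0, 0.6, 0.4],
     [0.5, 0.0, 0.5]]

pi = [1.0, 0.0, 0.0]          # any starting distribution works for a regular chain
for _ in range(200):          # iterate pi <- pi P until convergence
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# The exact limit solves pi = pi P with components summing to 1;
# for this matrix that works out to (10/21, 5/21, 6/21).
print([round(x, 4) for x in pi])   # [0.4762, 0.2381, 0.2857]
```

Solving the three balance equations by hand gives the same answer the iteration converges to, which makes this a quick self-check for the exercises in this section.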
3.9.10 Suppose that in a branching process the number of offspring of an initial particle has a distribution whose generating function is f(s). Each member of the first generation has a number of offspring whose distribution has generating function g(s). The next generation has generating function …
3.9.9 One-fourth of the married couples in a distant society have no children at all. The other three-fourths of couples continue to have children until the first girl and then cease childbearing. Assume that each child is equally likely to be a boy or girl. (a) For k = 0, 1, 2, …, what is the …
3.9.8 Consider a branching process whose offspring follow the geometric distribution pk = (1 − c)c^k for k = 0, 1, …, where 0 < c < 1. Determine the probability of eventual extinction.
3.9.7 Families in a certain society choose the number of children that they will have according to the following rule: If the first child is a girl, they have exactly one more child. If the first child is a boy, they continue to have children until the first girl and then cease childbearing. Let
3.9.6 Let φ(s) = as² + bs + c, where a, b, c are positive and φ(1) = 1. Assume that the probability of extinction is u∞, where 0 < u∞ < 1. Prove that u∞ = c/a.
3.9.5 At time 0, a blood culture starts with one red cell. At the end of 1 min, the red cell dies and is replaced by one of the combinations listed in the text, with the probabilities as indicated there. Each red cell lives for 1 min and gives birth to offspring in the same way as the parent cell. Each white cell …
3.9.4 Let φ(s) = 1 − p(1 − s)^β, where p and β are constants with 0 < p, β < 1. Prove that φ(s) is a probability generating function and that its iterates are φn(s) = 1 − p^(1 + β + ⋯ + β^(n−1)) (1 − s)^(β^n) for n = 1, 2, ….
3.9.3 Consider a large region consisting of many subareas. Each subarea contains a branching process that is characterized by a Poisson distribution with parameter λ. Assume, furthermore, that the value of λ varies with the subarea, and its distribution over the whole region is that of a gamma …
3.9.2 One-fourth of the married couples in a far-off society have exactly three children. The other three-fourths of couples continue to have children until the first boy and then cease childbearing. Assume that each child is equally likely to be a boy or girl. What is the probability that the male …
3.9.1 One-fourth of the married couples in a far-off society have no children at all. The other three-fourths of couples have exactly three children, with each child equally likely to be a boy or a girl. What is the probability that the male line of descent of a particular husband will eventually …
3.9.4 Let φ(s) be the generating function of an offspring random variable ξ. Let Z be a random variable whose distribution is that of ξ, but conditional on ξ > 0. That is, Pr{Z = k} = Pr{ξ = k | ξ > 0} for k = 1, 2, …. Express the generating function for Z in terms of φ.
3.9.3 Determine the probability generating function corresponding to the offspring distribution in which each individual produces 0 or N direct descendants, with probabilities p and q, respectively.
3.9.2 Determine the probability generating function for the offspring distribution in which an individual either dies, with probability p0, or is replaced by two progeny, with probability p2, where p0 + p2 = 1.
3.9.1 Suppose that the offspring distribution is Poisson with mean λ = 1.1. Compute the extinction probabilities un = Pr{Xn = 0 | X0 = 1} for n = 0, 1, …, 5. What is u∞, the probability of ultimate extinction?
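The quantities in 3.9.1 can be generated by iterating the probability generating function: for a Poisson(λ) offspring count the pgf is f(s) = e^(λ(s − 1)), the extinction probabilities satisfy u(n+1) = f(u(n)) with u(0) = 0, and u∞ is the smallest root of f(u) = u. A sketch with λ = 1.1 as in the problem (the iteration count for the fixed point is an arbitrary choice):

```python
import math

lam = 1.1

def f(s):
    """Probability generating function of a Poisson(lam) offspring count."""
    return math.exp(lam * (s - 1.0))

u = 0.0                      # u_0 = Pr{X_0 = 0 | X_0 = 1} = 0
for n in range(1, 6):
    u = f(u)                 # u_n = f(u_{n-1})
    print(n, round(u, 4))

for _ in range(100000):      # continue iterating to the smallest fixed point f(u) = u
    u = f(u)
print("ultimate extinction ~", round(u, 4))
```

Because λ > 1 the process is supercritical, so the fixed point lies strictly between 0 and 1 rather than at 1.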
3.8.4 Let {Xn} be a branching process with mean family size μ. Show that Zn = Xn/μ^n is a nonnegative martingale. Interpret the maximal inequality as applied to {Zn}.
3.8.3 Families in a certain society choose the number of children that they will have according to the following rule: If the first child is a girl, they have exactly one more child. If the first child is a boy, they continue to have children until the first girl, and then cease childbearing. (a) …
3.8.2 Let Z = Σ (n = 0 to ∞) Xn be the total family size in a branching process whose offspring distribution has a mean μ = E[ξ] < 1. Assuming that X0 = 1, show that E[Z] = 1/(1 − μ).
3.8.1 Each adult individual in a population produces a fixed number M of offspring and then dies. A fixed number L of these remain at the location of the parent. These local offspring will either all grow to adulthood, which occurs with a fixed probability, or all will die, which has probability …
3.8.4 At each stage of an electron multiplier, each electron, upon striking the plate, generates a Poisson distributed number of electrons for the next stage. Suppose the mean of the Poisson distribution is λ. Determine the mean and variance for the number of electrons in the nth stage.
3.8.3 Suppose a parent has no offspring with probability 1/2 and has two offspring with probability 1/2. If a population of such individuals begins with a single parent and evolves as a branching process, determine un, the probability that the population is extinct by the nth generation, for n = …
3.8.2 The number of offspring of an individual in a population is 0, 1, or 2 with respective probabilities a > 0, b > 0, and c > 0, where a + b + c = 1. Express the mean and variance of the offspring distribution in terms of b and c.
3.8.1 A population begins with a single individual. In each generation, each individual in the population dies with probability 1/2 or doubles with probability 1/2. Let Xn denote the number of individuals in the population in the nth generation. Find the mean and variance of Xn.
3.7.5 Computer Challenge. Consider the partial sums Sn = S0 + ξ1 + ⋯ + ξn, where ξ1, ξ2, … are independent and identically distributed with the distribution given in the text. Can you find an explicit formula for the mean time vk for the partial sums starting from S0 = k to exit the interval [0, N] = {0, 1, …, N}? In another context, the answer was …
3.7.4 The possible states for a Markov chain are the integers 0, 1, …, N, and if the chain is in state j, at the next step it is equally likely to be in any of the states 0, 1, …, j − 1. Formally, Pjk = 1/j for 0 ≤ k ≤ j − 1. (a) Determine the fundamental matrix for the transient states 1, 2, …, N. (b) Determine the …
3.7.3 Let Xn be an absorbing Markov chain whose transition probability matrix takes the form given in equation (3.76). Let W be the fundamental matrix, the matrix inverse of I − Q. Let T be the random time of absorption (recall that states r, r + 1, …, N are the absorbing states). Establish the …
3.7.2 A zero-seeking device operates as follows: If it is in state j at time n, then at time n + 1 its position is 0 with probability 1/j, and its position is k (where k is one of the states 1, 2, …, j − 1) with probability 2k/j². State 0 is absorbing. Find the inverse of the I − Q matrix.
3.7.1 A zero-seeking device operates as follows: If it is in state m at time n, then at time n + 1 its position is uniformly distributed over the states 0, 1, …, m − 1. State 0 is absorbing. Find the inverse of the I − Q matrix for the transient states 1, 2, …, m.
3.7.2 Consider the random walk Markov chain whose transition probability matrix is given in the text, with Q the submatrix corresponding to the nonabsorbing states. Calculate the matrix inverse to I − Q, and from this determine (a) the probability of absorption into state 0 starting from state …
3.7.1 Consider the Markov chain whose transition probability matrix is given in the text, with Q the submatrix corresponding to the nonabsorbing states. Calculate the matrix inverse to I − Q, and from this determine (a) the probability of absorption into state 0 starting from state 1; (b) the …
3.6.9 Computer Challenge. You have two urns: A and B, with a balls in A and b balls in B. You pick an urn at random, each urn being equally likely, and move a ball from it to the other urn. You do this repeatedly. The game ends when either of the urns becomes empty. The number of balls in A at the
3.6.8 Consider the Markov chain {Xn} on states 0, 1, 2, 3 whose transition matrix is given in the text, where α > 0, β > 0, and α + β = 1; state 3 is absorbing. Determine the mean time to reach state 3 starting from state 0. That is, find E[T | X0 = 0], where T = min{n ≥ 0 : Xn = 3}. [Matrix not legible in this extract.]
3.6.7 Consider the random walk Markov chain whose transition probability matrix is given in the text. Starting in state 1, determine the mean time until absorption. Do this first using the basic first step approach of equation (3.24) and second using the particular results for a random walk given in equation …
3.6.6 Fix a state j. Show that Wij = 1/(pi + qi) for j = i, Wij = [qi/(pi + qi)] [qi+1/(pi+1 + qi+1)] ⋯ [qj−1/(pj−1 + qj−1)] · 1/(pj + qj) for i < j, and Wij = 0 for j < i.
3.6.5 The mean hitting time vk = E[T | X0 = k] (3.72) satisfies the equation vk = 1 + rk vk + qk vk+1 for k = 1, …, N − 1, with v0 = vN = 0. The solution is vk = 1/(pk + qk) + ρk,k+1/(pk+1 + qk+1) + ⋯ + ρk,N−1/(pN−1 + qN−1), where ρkj = [qk/(pk + qk)] [qk+1/(pk+1 + qk+1)] ⋯ [qj−1/(pj−1 + qj−1)] for k < j.
3.6.4 The probability of absorption at 0 starting from state k, uk = Pr{XT = 0 | X0 = k} (3.70), satisfies the equation uk = pk + rk uk + qk uk+1 for k = 1, …, N − 1, with u0 = 1 and uN = 0. The solution is uk = 1 − [qk/(pk + qk)] [qk+1/(pk+1 + qk+1)] ⋯ [qN−1/(pN−1 + qN−1)] for k = 1, …, N − 1.
3.6.3 Fix a state k, where 0 < k < N. The mean total visits to k prior to absorption, Wik = E[Σ 1{Xn = k} | X0 = i] (sum over n from 0 up to the absorption time), satisfies the equations given in the text; the solution is stated there. [Equations not legible in this extract.]
3.6.2 The mean hitting time v = E[T | X0 = k] satisfies the equations given in the text. The solution is stated there in terms of the quantity defined in (3.63). [Equations not legible in this extract.]
3.6.1 The probability of gambler's ruin, u = Pr{XT = 0 | X0 = i}, satisfies the first step analysis equation and boundary conditions given in the text. The solution is stated there. [Equations not legible in this extract.]
3.6.4 Consider the random walk Markov chain whose transition probability matrix is given in the text. Starting in state 1, determine the mean time until absorption. Do this first using the basic first step approach of equation (3.24), and second using the particular formula for vi that follows equation …
3.6.3 Players A and B each have $50 at the beginning of a game in which each player bets $1 at each play, and the game continues until one player is broke. Suppose there is a constant probability p = 0.492929… that Player A wins on any given bet. What is the mean duration of the game?
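For 3.6.3, a standard closed form for the mean duration of an asymmetric gambler's ruin game (win probability p ≠ 1/2, q = 1 − p, starting stake k, combined capital N) is E[T] = k/(q − p) − (N/(q − p)) · (1 − (q/p)^k)/(1 − (q/p)^N). A sketch evaluating it at the problem's numbers; treat the function as a numerical check against first-step analysis, not as the book's worked solution:

```python
def mean_duration(k, N, p):
    """Expected number of plays until some player is ruined, starting from
    stake k out of total capital N, with win probability p per play (p != 1/2)."""
    q = 1.0 - p
    r = q / p
    return k / (q - p) - (N / (q - p)) * (1.0 - r**k) / (1.0 - r**N)

# Problem 3.6.3: both players start with $50, so k = 50, N = 100, p = 0.492929...
print(round(mean_duration(50, 100, 0.492929)))
```

Even this small bias away from p = 1/2 shortens the game noticeably compared with the symmetric value k(N − k) = 2500.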
3.6.2 Customer accounts receivable at Smith Company are classified each month according to 0: current; 1: 30–60 days past due; 2: 60–90 days past due; 3: over 90 days past due. Consider a particular customer account and suppose that it evolves month to month as a Markov chain {Xn} whose transition …
3.6.1 A rat is put into the linear maze as shown in the accompanying diagram. (a) Assume that the rat is equally likely to move right or left at each step. What is the probability that the rat finds the food before getting shocked? (b) As a result of learning, at each step the rat moves to the right with probability p > 1/2 …
3.5.5 Let {Xn} be a random walk for which zero is an absorbing state and such that from a positive state, the process is equally likely to go up or down one unit. The transition probability matrix is given by (3.38) with r0 = 1 and pi = qi = 1/2 for i ≥ 1. (a) Show that {Xn} is a nonnegative …
3.5.4 Martha has a fair die with the usual six sides. She throws the die and records the number. She throws the die again and adds the second number to the first. She repeats this until the cumulative sum of all the tosses first exceeds 10. What is the probability that she stops at a cumulative sum
3.5.3 A Batch Processing Model. Customers arrive at a facility and wait there until K customers have accumulated. Upon the arrival of the Kth customer, all are instantaneously served, and the process repeats. Let ξ0, ξ1, … denote the arrivals in successive periods, assumed to be independent …
3.5.2 A component of a computer has an active life, measured in discrete units, that is a random variable T, where Pr{T = k} = ak for k = 1, 2, …. Suppose one starts with a fresh component, and each component is replaced by a new component upon failure. Let Xn be the age of the component in …
3.5.1 As a special case of the successive maxima Markov chain whose transition probabilities are given in equation (3.34), consider the Markov chain on states 0, 1, 2, 3 whose transition probability matrix is given in the text. Starting in state 0, show that the mean time until absorption is v0 = 1/a3. [Matrix not legible in this extract.]
3.5.9 In a simplified model of a certain television game show, suppose that the contestant, having won k dollars, will at the next play have k + 1 dollars with probability q and be put out of the game and leave with nothing with probability p = 1 − q. Suppose that the contestant begins with one …
3.5.8 As a special case, consider a discrete-time queueing model in which at most a single customer arrives in any period and at most a single customer completes service. Suppose that in any single period, a single customer arrives with a fixed probability, and no customers arrive with the complementary probability.
3.5.7 Consider the random walk Markov chain whose transition probability matrix is given in the text. Starting in state 1, determine the probability that the process is absorbed into state 0. Do this first using the basic first step approach of equations (3.21) and (3.22) and second using the particular …
3.5.6 A baseball trading card that you have for sale may be quite valuable. Suppose that the successive bids ξ1, ξ2, … that you receive are independent random variables with the geometric distribution Pr{ξ = k} = 0.01(0.99)^k for k = 0, 1, …. If you decide to accept any bid over $100, how …
3.5.5 Suppose that the items produced by a certain process are each graded as defective or good and that whether or not a particular item is defective or good depends on the quality of the previous item. To be specific, suppose that a defective item is followed by another defective item with
3.5.4 A coin is tossed repeatedly until three heads in a row appear. Let Xn record the current number of successive heads that have appeared. That is, Xn = 0 if the nth toss resulted in tails; Xn = 1 if the nth toss was heads and the (n − 1)st toss was tails; and so on. Model Xn as a success runs …
3.5.3 Determine P^n for n = 2, 3, 4, 5 for the Markov chain whose transition probability matrix is

P =
| 0.4  0.6 |
| 0.7  0.3 |

3.5.2 Determine the gambler's ruin probability for Player A when both players begin with $50, bet $1 on each play, and where the win probability for Player A in each game is (a) p = 0.49292929; (b) p = 0.5029237. (See Chapter 2, Section 2.2.) What are the gambler's ruin probabilities when each player …
3.5.1 The probability of the thrower winning in the dice game called "craps" is p = 0.4929. Suppose Player A is the thrower and begins the game with $5, and Player B, his opponent, begins with $10. What is the probability that Player A goes bankrupt before Player B? Assume that the bet is $1 …
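Problems 3.5.1 and 3.5.2 both use the classical ruin probability: with win probability p per play (p ≠ 1/2), r = q/p, Player A starting with a dollars and N dollars in play overall, Pr{A is ruined} = (r^a − r^N)/(1 − r^N). A sketch evaluating it at these problems' numbers, as a check on first-step analysis:

```python
def ruin_prob(a, N, p):
    """Pr{fortune hits 0 before N} starting from a, win prob p per play (p != 1/2)."""
    r = (1.0 - p) / p
    return (r**a - r**N) / (1.0 - r**N)

# Problem 3.5.1: Player A starts with $5, Player B with $10, so N = 15.
print(round(ruin_prob(5, 15, 0.4929), 4))

# Problem 3.5.2(a): both players start with $50, so a = 50, N = 100.
print(round(ruin_prob(50, 100, 0.49292929), 4))
```

Note how sensitive the answer is to the small bias: at p = 1/2 exactly, the ruin probability would be 1 − a/N instead.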
3.4.19 Computer Challenge. Let N be a positive integer and let Z1, …, ZN be independent random variables, each having the geometric distribution given in the text. Since these are discrete random variables, the maximum among them may be unique, or there may be ties for the maximum. Let pN be the probability that …
3.4.18 Time-dependent transition probabilities. A well-disciplined man, who smokes exactly one half of a cigar each day, buys a box containing N cigars. He cuts a cigar in half, smokes half, and returns the other half to the box. In general, on a day in which his cigar box contains w whole cigars
3.4.17 The damage Xn of a system subjected to wear is a Markov chain with the transition probability matrix given in the text. The system starts in state 0 and fails when it first reaches state 2. Let T = min{n ≥ 0 : Xn = 2} be the time of failure. Use a first step analysis to evaluate φ(s) = E[s^T] for a fixed number 0 < s < 1. [Matrix not legible in this extract.]
3.4.16 An urn contains five tags, of which three are red and two are green. A tag is randomly selected from the urn and replaced with a tag of the opposite color.This continues until only tags of a single color remain in the urn. Let Xn denote the number of red tags in the urn after the nth draw,
3.4.15 A simplified model for the spread of a rumor goes this way: There are N = 5 people in a group of friends, of which some have heard the rumor and the others have not. During any single period of time, two people are selected at random from the group and assumed to interact. The selection is …
3.4.14 A single die is rolled repeatedly. The game stops the first time that the sum of two successive rolls is either 5 or 7. What is the probability that the game stops at a sum of 5?
3.4.13 A Markov chain X0, X1, X2, … has the transition probability matrix

P =
| 0.3  0.2  0.5 |
| 0.5  0.1  0.4 |
| 0    0    1   |

and is known to start in state X0 = 0. Eventually, the process will end up in state 2. What is the probability that the time T = min{n ≥ 0 : Xn = 2} is an odd number?
3.4.12 A Markov chain X0, X1, X2, … has the transition probability matrix

P =
| 0.3  0.2  0.5 |
| 0.5  0.1  0.4 |
| 0    0    1   |

and is known to start in state X0 = 0. Eventually, the process will end up in state 2. What is the probability that when the process moves into state 2, it does so from state 1?
3.4.11 An urn contains two red and two green balls. The balls are chosen at random, one by one, and removed from the urn. The selection process continues until all of the green balls have been removed from the urn. What is the probability that a single red ball is in the urn at the time that the
3.4.10 You have five fair coins. You toss them all so that they randomly fall heads or tails. Those that fall tails in the first toss you pick up and toss again. You toss again those that show tails after the second toss, and so on, until all show heads. Let X be the number of coins involved in the
3.4.9 An urn contains five red and three yellow balls. The balls are chosen at random, one by one, from the urn. Each ball removed is replaced in the urn by a yellow ball. The selection process continues until all of the red balls have been removed from the urn. What is the mean duration of the
3.4.8 An urn contains five red and three green balls. The balls are chosen at random, one by one, from the urn. If a red ball is chosen, it is removed. Any green ball that is chosen is returned to the urn. The selection process continues until all of the red balls have been removed from the urn.
3.4.7 Let Xn be a Markov chain with transition probabilities Pij. We are given a "discount factor" β with 0 < β < 1. Let hi = E[Σ (n = 0 to ∞) β^n c(Xn) | X0 = i]. Using a first step analysis, show that hi satisfies the system of linear equations hi = c(i) + β Σj Pij hj.
3.4.6 Consider the Markov chain whose transition matrix is given in the text, where p + q = 1. Determine the mean time to reach state 4 starting from state 0. That is, find E[T | X0 = 0], where T = min{n ≥ 0 : Xn = 4}. Hint: Let vi = E[T | X0 = i] for i = 0, 1, …, 4. Establish equations for v0, v1, …, v4 by using a …
3.4.5 A white rat is put into compartment 4 of the maze shown in the accompanying diagram. It moves through the compartments at random; i.e., if there are k ways to leave a compartment, it chooses each of these with probability 1/k. What is the probability that it finds the food in compartment 3 before feeling the …
3.4.4 Consider the Markov chain on states 0, 1, 2, 3 whose transition probability matrix is given by

P =
| 1    0    0    0   |
| 0.1  0.2  0.5  0.2 |
| 0.1  0.2  0.6  0.1 |
| 0.2  0.2  0.3  0.3 |

Starting in state X0 = 1, determine the probability that the process never visits state 2. Justify your answer.
3.4.3 A zero-seeking device operates as follows: If it is in state j at time n, then at time n + 1, its position is 0 with probability 1/j, and its position is k (where k is one of the states 1, 2, …, j − 1) with probability 2k/j². Find the expected time until the device first hits zero starting …
3.4.2 A zero-seeking device operates as follows: If it is in state m at time n, then at time n + 1, its position is uniformly distributed over the states 0, 1, …, m − 1. Find the expected time until the device first hits zero starting from state m. Note: This is a highly simplified model for an …
3.4.1 Which will take fewer flips, on average: successively flipping a quarter until the pattern HHT appears, i.e., until you observe two successive heads followed by a tail; or successively flipping a quarter until the pattern HTH appears? Can you explain why these are different?
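The comparison in 3.4.1 can be verified with Conway's leading-numbers identity: for a fair coin, the expected number of flips until a pattern w first appears equals the sum of 2^k over every k for which the length-k prefix of w equals its length-k suffix. This identity is classical (it is not the book's first-step analysis), so treat the sketch as an independent check:

```python
def expected_flips(pattern):
    """Expected fair-coin flips until `pattern` first appears,
    via Conway's identity: sum of 2**k over self-overlaps of length k."""
    n = len(pattern)
    return sum(2 ** k for k in range(1, n + 1) if pattern[:k] == pattern[-k:])

print(expected_flips("HHT"), expected_flips("HTH"))   # 8 10
```

HTH overlaps itself (its final H can start a new occurrence, contributing the extra 2^1 term), which is exactly what pushes its mean waiting time to 10 versus 8 for HHT.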
3.4.9 Consider the Markov chain on states 0, 1, 2, 3 whose transition probability matrix is given by

P =
| 1    0    0    0   |
| 0.1  0.2  0.5  0.2 |
| 0.1  0.2  0.6  0.1 |

[Row for state 3 not legible in this extract.] Starting in state 1, determine the probability that the process is absorbed into state 0. Compare this with the (1, 0)th entry in the matrix powers P², P⁴, P⁸, and P¹⁶.
3.4.8 Consider the Markov chain whose transition probability matrix is given in the text. Starting in state 1, determine the mean time that the process spends in state 1 prior to absorption and the mean time that the process spends in state 2 prior to absorption. Verify that the sum of these is the mean time …
3.4.7 Consider the Markov chain whose transition probability matrix is given in the text. Starting in state 1, determine the mean time that the process spends in state 1 prior to absorption and the mean time that the process spends in state 2 prior to absorption. Verify that the sum of these is the mean time …
3.4.6 Consider the Markov chain on states 0, 1, 2, 3 whose transition probability matrix is given by

P =
| 1    0    0    0   |
| 0.1  0.4  0.1  0.4 |
| 0.2  0.1  0.6  0.1 |
| 0    0    0    1   |

(a) Starting in state 1, determine the probability that the Markov chain ends in state 0. (b) Determine the mean time to absorption.
3.4.5 A coin is tossed repeatedly until either two successive heads appear or two successive tails appear. Suppose the first coin toss results in a head. Find the probability that the game ends with two successive tails.