Introduction to Probability Models, 12th Edition, Sheldon M. Ross - Solutions
16. There are three jobs that need to be processed, with the processing time of job i being exponential with rate . There are two processors available, so processing on two of the jobs can immediately start, with processing on the final job to start when one of the initial ones is finished. (a) Let
15. One hundred items are simultaneously put on a life test. Suppose the lifetimes of the individual items are independent exponential random variables with mean 200 hours. The test will end when there have been a total of 5 failures. If T is the time at which the test ends, find and .
14. I am waiting for two friends to arrive at my house. The time until A arrives is exponentially distributed with rate , and the time until B arrives is exponentially distributed with rate . Once they arrive, both will spend exponentially distributed times, with respective rates and at my home
13. Find, in Example 5.10, the expected time until the nth person on line leaves the line (either by entering service or departing without service).
12. If , are independent exponential random variables with rates , , find (a) , (b) , (c) , (d) .
11. Let be independent exponential random variables; X having rate λ, and Image having rate μ. Let be the event that the jth smallest of these random variables is one of the Image. Find, by using the identity . Verify your answer when by conditioning on X to obtain p.
*10. Let X and Y be independent exponential random variables with respective rates λ and μ. Let . Find (a) , (b) , (c) .
9. Machine 1 is currently working. Machine 2 will be put in use at a time t from now. If the lifetime of machine i is exponential with rate , what is the probability that machine 1 is the first machine to fail?
8. If X and Y are independent exponential random variables with respective rates λ and μ, what is the conditional distribution of X given that ?
*7. If and are independent nonnegative continuous random variables, show that where is the failure rate function of .
6. In Example 5.3 if server i serves at an exponential rate , show that
*5. If X is exponential with rate λ, show that is geometric with parameter where is the largest integer less than or equal to x.
4. Consider a post office with two clerks. Three people, A, B, and C, enter simultaneously. A and B go directly to the clerks, and C waits until either A or B leaves before he begins service. What is the probability that A is still in the post office after the other two have left when (a) the
3. Let X be an exponential random variable. Without any computations, tell which one of the following is correct. Explain your answer. (a) (b) (c)
2. Suppose that you arrive at a single-teller bank to find five other customers in the bank, one being served and the other four waiting in line. You join the end of the line. If the service times are all exponential with rate μ, what is the expected amount of time you will spend in the bank?
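A quick way to see what Exercise 2 is driving at: by memorylessness the customer currently in service looks like a fresh exponential(μ) service, so you wait for six independent exponential(μ) times in all and the expected time in the bank is 6/μ. Below is a minimal Monte Carlo sketch of that claim; the function name and the service rate of 2.0 are illustrative assumptions, not values from the text.

```python
import random

def time_in_bank(mu, customers_ahead=5, trials=100_000):
    """Monte Carlo estimate of the expected time spent in the bank.

    By memorylessness, the customer currently in service looks like a fresh
    exponential(mu) service, so you wait for customers_ahead full services
    plus your own.
    """
    total = 0.0
    for _ in range(trials):
        total += sum(random.expovariate(mu) for _ in range(customers_ahead + 1))
    return total / trials

mu = 2.0  # assumed service rate, chosen only for illustration
print(time_in_bank(mu))   # simulation, about 3.0
print((5 + 1) / mu)       # closed form 6/mu = 3.0
```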
1. The time T required to repair a machine is an exponentially distributed random variable with mean (hours). (a) What is the probability that a repair time exceeds hour? (b) What is the probability that a repair takes at least hours given that its duration exceeds 12 hours?
79. In Example 4.45, what is the probability that the first 4 items produced are all acceptable?
78. For the Markov chain of Exercise 5, suppose that Image is the probability that signal s is emitted when the underlying Markov chain state is Image. (a) What proportion of emissions are signal s? (b) What proportion of those times in which signal s is emitted is 0 the underlying state?
77. In a Markov decision problem, another criterion often used, different than the expected average return per unit time, is that of the expected discounted return. In this criterion we choose a number Image, and try to choose a policy so as to maximize (that is, rewards at time n are discounted at
76. On a chessboard compute the expected number of plays it takes a knight, starting in one of the four corners of the chessboard, to return to its initial position if we assume that at each play it is equally likely to choose any of its legal moves. (No other pieces are on the board.) Hint: Make
75. A Markov chain is said to be a tree process if (i) Image whenever Image, (ii) for every pair of states i and Image, there is a unique sequence of distinct states Image such that Image. In other words, a Markov chain is a tree process if for every pair of distinct states i and j there is a unique
74. A group of n processors is arranged in an ordered list. When a job arrives, the first processor in line attempts it; if it is unsuccessful, then the next in line tries it; if it too is unsuccessful, then the next in line tries it, and so on. When the job is successfully processed or after all
73. There are k players, with player i having value Image, Image. In every period, two of the players play a game. Whoever wins then plays the next game against a randomly chosen one of the other players (including the one who has just lost). Suppose that whenever i and j play, i wins with
72. For a time reversible Markov chain, argue that the rate at which transitions from i to j to k occur must equal the rate at which transitions from k to j to i occur.
71. It follows from Theorem 4.2 that for a time reversible Markov chain Image. It turns out that if the state space is finite and Image for all Image, then the preceding is also a sufficient condition for time reversibility. (That is, in this case, we need only check Eq. (4.26) for paths from i to i
70. A total of m white and m black balls are distributed among two urns, with each urn containing m balls. At each stage, a ball is randomly selected from each urn and the two selected balls are interchanged. Let denote the number of black balls in urn 1 after the nth interchange. (a) Give the
69. M balls are initially distributed among m urns. At each stage one of the balls is selected at random, taken from whichever urn it is in, and then placed, at random, in one of the other Image urns. Consider the Markov chain whose state at any time is the vector Image where Image denotes the
*68. (a) Show that the limiting probabilities of the reversed Markov chain are the same as for the forward chain by showing that they satisfy the equations. (b) Give an intuitive explanation for the result of part (a).
67. At all times, an urn contains N balls—some white balls and some black balls. At each stage, a coin having probability , of landing heads is flipped. If heads appears, then a ball is chosen at random from the urn and is replaced by a white ball; if tails appears, then a ball is chosen from the
66. For a branching process, calculate when (a) Image. (b) Image. (c) Image.
65. In a branching process having Image and Image, prove that is the smallest positive number satisfying Eq. (4.20). Hint: Let π be any solution of Image. Show by mathematical induction that Image for all n, and let . In using the induction argue that Image
64. Consider a branching process having Image. Show that if Image, then the expected number of individuals that ever exist in this population is given by Image. What if ?
63. For the Markov chain with states 1, 2, 3, 4 whose transition probability matrix P is as specified below, find Image and Image for Image. Image
*62. Consider the particle from Exercise 57. What is the expected number of steps the particle takes to return to the starting position? What is the probability that all other positions are visited before the particle returns to its starting state?
61. Suppose in the gambler's ruin problem that the probability of winning a bet depends on the gambler's present fortune. Specifically, suppose that is the probability that the gambler wins a bet when his or her fortune is i. Given that the gambler's initial fortune is i, let Image denote the
60. The following is the transition probability matrix of a Markov chain with states Image. If Image, (a) find the probability that state 3 is entered before state 4; (b) find the mean number of transitions until either state 3 or state 4 is entered.
59. For the gambler's ruin model of Section 4.5.1, let Image denote the mean number of games that must be played until the gambler either goes broke or reaches a fortune of N, given that he starts with Image. Show that Image satisfies Image. Solve these equations to obtain Image
58. In the gambler's ruin problem of Section 4.5.1, suppose the gambler's fortune is presently i, and suppose that we know that the gambler's fortune will eventually reach N (before it goes to 0). Given this information, show that the probability he wins the next gamble is Image. Hint: The
57. A particle moves among vertices that are situated on a circle in the following manner. At each step it moves one step either in the clockwise direction with probability p or the counterclockwise direction with probability 1 - p. Starting at a specified state, call it state 0, let T be the time of
56. Suppose that on each play of the game a gambler either wins 1 with probability p or loses 1 with probability 1 - p. The gambler continues betting until she or he is either up n or down m. What is the probability that the gambler quits a winner?
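Exercise 56 is the gambler's ruin setup of Section 4.5.1 with absorbing barriers: starting the standard formulation at fortune m with target N = n + m gives the probability of quitting a winner. The sketch below checks the closed form against a simulation; the function names and the numeric values p = 0.6, n = m = 5 are illustrative assumptions, not values from the text.

```python
import random

def prob_quit_winner(p, n, m):
    """P(reach +n before -m) for a +/-1 walk that steps up with probability p:
    the gambler's-ruin formula with starting fortune m and target n + m."""
    if p == 0.5:
        return m / (n + m)
    r = (1 - p) / p
    return (1 - r ** m) / (1 - r ** (n + m))

def simulate(p, n, m, trials=50_000):
    wins = 0
    for _ in range(trials):
        x = 0
        while -m < x < n:
            x += 1 if random.random() < p else -1
        wins += x == n
    return wins / trials

# illustrative (assumed) values: win probability 0.6, quit when up 5 or down 5
print(prob_quit_winner(0.6, 5, 5), simulate(0.6, 5, 5))
```

For p = 1/2 the expression reduces to m/(n + m), which the helper handles as a separate case.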
55. Consider a population of individuals each of whom possesses two genes that can be either type A or type a. Suppose that in outward appearance type A is dominant and type a is recessive. (That is, an individual will have only the outward characteristics of the recessive gene if its pair is aa.)
54. Consider the Ehrenfest urn model in which M molecules are distributed between two urns, and at each time point one of the molecules is chosen at random and is then removed from its urn and placed in the other one. Let denote the number of molecules in urn 1 after the nth switch and let Image.
53. Find the average premium received per policyholder of the insurance company of Example 4.29 if for one-third of its clients, and Image for two-thirds of its clients.
52. A taxi driver provides service in two zones of a city. Fares picked up in zone A will have destinations in zone A with probability 0.6 or in zone B with probability 0.4. Fares picked up in zone B will have destinations in zone A with probability 0.3 or in zone B with probability 0.7. The
51. In Example 4.3, Gary is in a cheerful mood today. Find the expected number of days until he has been glum for three consecutive days.
50. A Markov chain with states Image has transition probability matrix Image. (a) Give the classes and tell which are recurrent and which are transient. (b) Find Image. (c) Find Image. (d) Find Image.
49. Consider a Markov chain with states having transition probability matrix Image. (a) If the chain is currently in state 1, find the probability that after two transitions it will be in state 2. (b) Suppose you receive a reward Image whenever the Markov chain is in state i, Image. Find your long run
48. Consider a Markov chain in steady state. Say that a k length run of zeroes ends at time m if . Show that the probability of this event is Image, where is the limiting probability of state 0.
*47. Let denote an ergodic Markov chain with limiting probabilities . Define the process Image. That is, Image keeps track of the last two states of the original chain. Is Image a Markov chain? If so, determine its transition probabilities and find Image
46. An individual possesses r umbrellas that he employs in going from his home to office, and vice versa. If he is at home (the office) at the beginning (end) of a day and it is raining, then he will take an umbrella with him to the office (home), provided there is one to be taken. If it is not
45. Consider an irreducible finite Markov chain with states . (a) Starting in state i, what is the probability the process will ever visit state j? Explain! (b) Let Image. Compute a set of linear equations that the Image satisfy, Image. (c) If Image, show that Image is a solution to the equations in
44. Suppose that a population consists of a fixed number, say, m, of genes in any generation. Each gene is one of two possible genetic types. If exactly i (of the m) genes of any generation are of type 1, then the next generation will have j type 1 (and Image type 2) genes with probability Image
43. Each day, one of n possible elements is requested, the ith one with probability Image. These elements are at all times arranged in an ordered list that is revised as follows: The element selected is moved to the front of the list with the relative positions of all the other elements remaining
42. Let A be a set of states, and let be the remaining states. (a) What is the interpretation of Image? (b) What is the interpretation of Image? (c) Explain the identity Image
*41. Consider a Markov chain with states equal to the nonnegative integers, and suppose its transition probabilities satisfy Image. Assume Image, and let Image be the probability that the Markov chain is ever in state j. (Note that because Image.) Argue that for Image . If Image, find Image for Image.
40. A particle moves on 12 points situated on a circle. At each step it is equally likely to move one step in the clockwise or in the counterclockwise direction. Find the mean number of steps for the particle to return to its starting position.
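For Exercise 40, the symmetric walk on a 12-point circle has the uniform distribution as its stationary distribution, so the mean return time to the starting position is 1/π0 = 12. Below is a minimal simulation sketch of that number; the function name and the trial count are arbitrary illustrative choices.

```python
import random

def mean_return_time(n_points=12, trials=20_000):
    """Average number of steps for the symmetric walk on a circle of
    n_points positions to come back to its start; theory says 1/pi_0 = n_points."""
    total = 0
    for _ in range(trials):
        pos, steps = 0, 0
        while True:
            pos = (pos + random.choice((-1, 1))) % n_points
            steps += 1
            if pos == 0:
                break
        total += steps
    return total / trials

print(mean_return_time())  # about 12
```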
39. Consider the one-dimensional symmetric random walk of Example 4.19, which was shown in that example to be recurrent. Let denote the long-run proportion of time that the chain is in state i. (a) Argue that Image for all i. (b) Show that Image. (c) Conclude that this Markov chain is null recurrent,
38. Capa plays either one or two chess games every day, with the number of games that she plays on successive days being a Markov chain with transition probabilities . Capa wins each game with probability p. Suppose she plays two games on Monday. (a) What is the probability that she wins all the games
37. Show that the stationary probabilities for the Markov chain having transition probabilities are also the stationary probabilities for the Markov chain whose transition probabilities are given by for any specified positive integer k.
36. The state of a process changes daily according to a two-state Markov chain. If the process is in state i during one day, then it is in state j the following day with probability , where . Every day a message is sent. If the state of the Markov chain that day is i then the message sent is
35. Consider a Markov chain with states 0, 1, 2, 3, 4. Suppose Image; and suppose that when the chain is in state Image, the next state is equally likely to be any of the states Image. Find the limiting probabilities of this Markov chain.
34. A flea moves around the vertices of a triangle in the following manner: Whenever it is at vertex i it moves to its clockwise neighbor vertex with probability Image and to the counterclockwise neighbor with probability . (a) Find the proportion of time that the flea is at each of the vertices. (b)
33. Two players are playing a sequence of points, which begin when one of the players serves. Suppose that player 1 wins each point she serves with probability p, and wins each point her opponent serves with probability q. Suppose the winner of a point becomes the server of the next point. (a) Find
*32. Each of two switches is either on or off during a day. On day n, each switch will independently be on with probability . For instance, if both switches are on during day , then each will independently be on during day n with probability 3/4. What fraction of days are both switches on? What
31. A certain town never has two sunny days in a row. Each day is classified as being either sunny, cloudy (but dry), or rainy. If it is sunny one day, then it is equally likely to be either cloudy or rainy the next day. If it is rainy or cloudy one day, then there is one chance in two that it will
30. Three out of every four trucks on the road are followed by a car, while only one out of every five cars is followed by a truck. What fraction of vehicles on the road are trucks?
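Exercise 30 can be read as a two-state Markov chain on the type of each successive vehicle, with the long-run fraction of trucks equal to the stationary probability of the "truck" state. Below is a minimal sketch of that computation, using the two-state balance equation, which may or may not be the route the text takes; the variable names are illustrative.

```python
# Two-state Markov chain on the type of each successive vehicle.
p_truck_to_car = 3 / 4   # a truck is followed by a car with probability 3/4
p_car_to_truck = 1 / 5   # a car is followed by a truck with probability 1/5

# Balance for a two-state chain:
#   pi_truck * p_truck_to_car = pi_car * p_car_to_truck,  pi_truck + pi_car = 1
pi_truck = p_car_to_truck / (p_car_to_truck + p_truck_to_car)
print(pi_truck)  # 4/19, so roughly 21% of the vehicles are trucks
```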
29. An organization has N employees, where N is a large number. Each employee has one of three possible job classifications and changes classifications (independently) according to a Markov chain with transition probabilities . What percentage of employees are in each classification?
28. Every time that the team wins a game, it wins its next game with probability 0.8; every time it loses a game, it wins its next game with probability 0.3. If the team wins a game, then it has dinner together with probability 0.7, whereas if the team loses then it has dinner together with
*27. Each individual in a population of size N is, in each period, either active or inactive. If an individual is active in a period then, independent of all else, that individual will be active in the next period with probability α. Similarly, if an individual is inactive in a period then,
26. Consider the following approach to shuffling a deck of n cards. Starting with any initial ordering of the cards, one of the numbers Image is randomly chosen in such a manner that each one is equally likely to be selected. If number i is chosen, then we take the card that is in position i and put
25. Each morning an individual leaves his house and goes for a run. He is equally likely to leave either from his front or back door. Upon leaving the house, he chooses a pair of running shoes (or goes running barefoot if there are no shoes at the door from which he departed). On his return he is
24. Consider three urns, one colored red, one white, and one blue. The red urn contains 1 red and 4 blue balls; the white urn contains 3 white balls, 2 red balls, and 2 blue balls; the blue urn contains 4 white balls, 3 red balls, and 2 blue balls. At the initial stage, a ball is randomly selected
23. In a good weather year the number of storms is Poisson distributed with mean 1; in a bad year it is Poisson distributed with mean 3. Suppose that any year's weather conditions depends on past years only through the previous year's condition. Suppose that a good year is equally likely to be
22. Let Image be the sum of n independent rolls of a fair die. Find Image. Hint: Define an appropriate Markov chain and apply the results of Exercise 20.
*21. A DNA nucleotide has any of four values. A standard model for a mutational change of the nucleotide at a specific location is a Markov chain model that supposes that in going from period to period the nucleotide does not change with probability Image, and if it does change then it is equally
20. A transition probability matrix P is said to be doubly stochastic if the sum over each column equals one; that is, . If such a chain is irreducible and consists of Image states , show that the long-run proportions are given by Image
19. For Example 4.4, calculate the proportion of days that it rains.
18. Coin 1 comes up heads with probability 0.6 and coin 2 with probability 0.5. A coin is continually flipped until it comes up tails, at which time that coin is put aside and we start flipping the other one. (a) What proportion of flips use coin 1? (b) If we start the process with coin 1, what is the
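For part (a) of Exercise 18, the coin currently in use forms a two-state Markov chain: stay with the same coin on heads, switch on tails. Its stationary probability for coin 1 works out to 0.5/(0.4 + 0.5) = 5/9. The sketch below simulates the flipping scheme to check that fraction; the function name and the flip count are arbitrary illustrative choices.

```python
import random

def proportion_coin1(p1=0.6, p2=0.5, flips=1_000_000):
    """Simulate: flip the current coin; on tails, switch to the other coin.
    Returns the fraction of flips made with coin 1."""
    coin, coin1_flips = 1, 0
    for _ in range(flips):
        coin1_flips += coin == 1
        heads = random.random() < (p1 if coin == 1 else p2)
        if not heads:                      # tails: switch coins
            coin = 2 if coin == 1 else 1
    return coin1_flips / flips

print(proportion_coin1())  # about 5/9, i.e. 0.556
```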
17. For the random walk of Example 4.19 use the strong law of large numbers to give another proof that the Markov chain is transient when . Hint: Note that the state at time n can be written as Image, where the Image are independent and Image. Argue that if , then, by the strong law of large numbers,
*16. Show that if state i is recurrent and state i does not communicate with state j, then Image. This implies that once a process enters a recurrent class of states it can never leave that class. For this reason, a recurrent class is often referred to as a closed class.
15. Prove that if the number of states in a Markov chain is M, and if state j can be reached from state i, then it can be reached in M steps or less.
14. Specify the classes of the following Markov chains, and determine whether they are transient or recurrent:
13. Let P be the transition probability matrix of a Markov chain. Argue that if for some positive integer Image has all positive entries, then so does Image, for all integers .
12. For a Markov chain with transition probabilities , consider the conditional probability that Image given that the chain started at time 0 in state i and has not yet entered state r by time n, where r is a specified state not equal to either i or m. We are interested in whether this conditional
11. In Example 4.13, give the transition probabilities of the Image Markov chain in terms of the transition probabilities of the chain.
10. In Example 4.3, Gary is currently in a cheerful mood. What is the probability that he is not in a glum mood on any of the following three days?
*9. In a sequence of independent flips of a coin that comes up heads with probability .6, what is the probability that there is a run of three consecutive heads within the first 10 flips?
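Exercise 9 can be organized as a small Markov chain on the current streak of heads, absorbing once three heads in a row appear. The sketch below sets up that recursion one possible way (not necessarily the text's) and gives roughly 0.70 for p = 0.6 over 10 flips; the function name is illustrative.

```python
def prob_run_of_heads(p=0.6, run=3, flips=10):
    """P(at least `run` consecutive heads in the first `flips` flips),
    computed with a small Markov chain on the current streak length."""
    dist = [1.0] + [0.0] * (run - 1)   # dist[k]: streak is k, no run yet
    done = 0.0                         # probability the run has already occurred
    for _ in range(flips):
        new = [0.0] * run
        new[0] = sum(dist) * (1 - p)   # tails resets the streak
        for k in range(run - 1):
            new[k + 1] = dist[k] * p   # heads extends the streak
        done += dist[run - 1] * p      # heads on a streak of run - 1: absorbed
        dist = new
    return done

print(prob_run_of_heads())  # roughly 0.70 for p = 0.6
```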
8. An urn initially contains 2 balls, one of which is red and the other blue. At each stage a ball is randomly selected. If the selected ball is red, then it is replaced with a red ball with probability .7 or with a blue ball with probability .3; if the selected ball is blue, then it is equally
7. In Example 4.4 suppose that it has rained neither yesterday nor the day before yesterday. What is the probability that it will rain tomorrow?
6. Let the transition probability matrix of a two-state Markov chain be given, as in Example 4.2, by Image. Show by mathematical induction that
5. A Markov chain with states , has the transition probability matrix Image. If Image, find Image.
4. Let P and Q be transition probability matrices on states Image, with respective transition probabilities and . Consider processes and Image defined as follows: (a) Image. A coin that comes up heads with probability p is then flipped. If the coin lands heads, the subsequent states , are obtained
3. There are k players, with player i having value Image, Image. In every period, two of the players play a game, while the other Image wait in an ordered line. The loser of a game joins the end of the line, and the winner then plays a new game against the player who is first in line. Whenever i
2. Each individual in a population independently has a random number of offspring that is Poisson distributed with mean λ. Those initially present constitute the zeroth generation. Offspring of zeroth generation people constitute the first generation; their offspring constitute the second
*1. Three white and three black balls are distributed in two urns in such a way that each contains three balls. We say that the system is in state Image, if the first urn contains i white balls. At each step, we draw one ball from each urn and place the ball drawn from the first urn into the
101. For the left skip free random walk of Section 3.6.6, (a) Show, for , that Image. (b) Show that part (a) implies that Image. (c) Explain why part (b) implies the ballot theorem.
100. In the fair gambler's ruin problem of Example 3.16, let denote the probability that, starting with a fortune of i, the gambler's fortune reaches n before 0. Find , .
99. Let N be the number of trials until k consecutive successes have occurred, when each trial is independently a success with probability p. (a) What is Image? (b) Argue that Image. (c) Show that
98. For a compound random variable , find Image.
*97. Use the conditional variance formula to find the variance of a geometric random variable.
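For Exercise 97, one standard choice (not necessarily the one the text intends) is to condition on the outcome of the first trial, after which the conditional variance formula yields Var(N) = (1 - p)/p^2 for a geometric(p) random variable N. The sketch below records that decomposition in comments and checks the closed form numerically; the value p = 0.3 and the function name are arbitrary assumptions.

```python
import random

# Conditioning on the outcome Y of the first trial (one standard choice):
#   E[N | Y=1] = 1,        E[N | Y=0] = 1 + E[N],
#   Var(N | Y=1) = 0,      Var(N | Y=0) = Var(N),
# so Var(N) = E[Var(N|Y)] + Var(E[N|Y]) gives
#   Var(N) = (1-p) Var(N) + p(1-p)(1/p)^2  =>  Var(N) = (1-p)/p^2.
# The simulation below just checks that closed form numerically.

def geometric(p):
    """Number of independent trials until the first success."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

p, trials = 0.3, 200_000
samples = [geometric(p) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((x - mean) ** 2 for x in samples) / trials
print(var, (1 - p) / p ** 2)  # both approximately 7.78
```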
Showing questions 1500 - 1600 of 7136