Introduction to Probability Models, 6th Edition, Sheldon M. Ross - Solutions
Compare the Poisson approximation with the correct binomial probability for the following cases: (i) P{X = 2} when n = 8, p = 0.1; (ii) P{X = 9} when n = 10, p = 0.95; (iii) P{X = 0} when n = 10, p = 0.1; (iv) P{X = 4} when n = 9, p = 0.2.
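As a numerical check, here is a short Python sketch comparing the exact binomial probabilities with the Poisson(λ = np) approximation for the four cases:

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    # exact binomial probability P{X = k}
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # Poisson probability with mean lam = n * p
    return exp(-lam) * lam**k / factorial(k)

cases = [(2, 8, 0.1), (9, 10, 0.95), (0, 10, 0.1), (4, 9, 0.2)]
for k, n, p in cases:
    print(f"P{{X={k}}}, n={n}, p={p}: "
          f"binomial={binom_pmf(k, n, p):.4f}, "
          f"Poisson={poisson_pmf(k, n * p):.4f}")
```

The approximation is good when n is large and p is small; in case (ii), where p = 0.95 is close to 1, one does much better by approximating the number of tails, n − X, by a Poisson with mean n(1 − p).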
If you buy a lottery ticket in 50 lotteries, in each of which your chance of winning a prize is 1/100, what is the (approximate) probability that you will win a prize (a) at least once, (b) exactly once, (c) at least twice?
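Assuming the win probability per lottery is 1/100 (the value is garbled in the source, so treat it as an assumption), the number of wins is approximately Poisson with mean λ = 50 · 1/100 = 1/2, and the three answers follow directly:

```python
from math import exp

lam = 50 * (1 / 100)  # assumed per-lottery win probability of 1/100

p_at_least_once = 1 - exp(-lam)
p_exactly_once = lam * exp(-lam)
p_at_least_twice = 1 - exp(-lam) - lam * exp(-lam)

print(round(p_at_least_once, 4))   # ~0.3935
print(round(p_exactly_once, 4))    # ~0.3033
print(round(p_at_least_twice, 4))  # ~0.0902
```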
Suppose that two teams are playing a series of games, each of which is independently won by team A with probability p and by team B with probability 1 - p. The winner of the series is the first team to win 4 games. Find the expected number of games that are played, and evaluate this quantity when p = ...
Consider the case of arbitrary p in Exercise 29. Compute the expected number of changeovers.
Suppose that each coupon obtained is, independent of what has been previously obtained, equally likely to be any of m different types. Find the expected number of coupons one needs to obtain in order to have at least one of each type. Hint: Let X be the number needed. It is useful to represent X by ...
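Following the hint, write X = X1 + ... + Xm, where Xi is the number of additional coupons needed, after i - 1 distinct types have been collected, to see a new type; each Xi is geometric with parameter (m - i + 1)/m, giving E[X] = m(1 + 1/2 + ... + 1/m). A quick simulation check of that formula (a sketch, not part of the original exercise):

```python
import random

def expected_coupons(m):
    # E[X] = m * (1 + 1/2 + ... + 1/m), from summing the geometric stages
    return m * sum(1 / k for k in range(1, m + 1))

def simulate_once(m, rng):
    # draw coupons uniformly at random until all m types have been seen
    seen, draws = set(), 0
    while len(seen) < m:
        seen.add(rng.randrange(m))
        draws += 1
    return draws

rng = random.Random(0)  # fixed seed for reproducibility
m = 4
avg = sum(simulate_once(m, rng) for _ in range(20000)) / 20000
print(expected_coupons(m), avg)  # both should be near 25/3 ≈ 8.33
```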
An urn contains n + m balls, of which n are red and m are black. They are withdrawn from the urn, one at a time and without replacement. Let X be the number of red balls removed before the first black ball is chosen. We are interested in determining E[X]. To obtain this quantity, number the red ...
In Exercise 43, let Y denote the number of red balls chosen after the first but before the second black ball has been chosen. (a) Express Y as the sum of n random variables, each of which is equal to either 0 or 1. (b) Find E[Y]. (c) Compare E[Y] to E[X] obtained in Exercise 43. (d) Can you explain ...
A total of r keys are to be put, one at a time, in k boxes, with each key independently being put in box i with probability p_i, where the p_i sum to 1. Each time a key is put in a nonempty box, we say that a collision occurs. Find the expected number of collisions.
Consider three trials, each of which is either a success or not. Let X denote the number of successes. Suppose that E[X] = 1.8. (a) What is the largest possible value of P{X = 3}? (b) What is the smallest possible value of P{X = 3}? In both cases, construct a probability scenario that results in P{X = 3} taking this value.
If X is uniformly distributed over (0, 1), calculate E[X^2].
Prove that E[X^2] >= (E[X])^2. When do we have equality?
Let c be a constant. Show that (i) Var(cX) = c^2 Var(X); (ii) Var(c + X) = Var(X).
A coin, having probability p of landing heads, is flipped until the head appears for the rth time. Let N denote the number of flips required. Calculate E[N]. Hint: There is an easy way of doing this. It involves writing N as the sum of r geometric random variables.
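Following the hint, N = G1 + ... + Gr with each Gi geometric with parameter p, so E[N] = r/p. A numerical check that sums the negative binomial pmf directly (a sketch):

```python
from math import comb

def mean_flips_to_rth_head(r, p, tail=2000):
    # E[N] from P{N = k} = C(k-1, r-1) p^r (1-p)^(k-r), k >= r;
    # the tail beyond k = 2000 is numerically negligible here
    return sum(k * comb(k - 1, r - 1) * p**r * (1 - p)**(k - r)
               for k in range(r, tail))

# writing N as a sum of r geometric(p) variables gives E[N] = r / p
print(mean_flips_to_rth_head(3, 0.4))  # ≈ 3 / 0.4 = 7.5
```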
Calculate the variance of the Bernoulli random variable.
(a) Calculate E[X] for the maximum random variable of Exercise 37. (b) Calculate E[X] for X as in Exercise 33. (c) Calculate E[X] for X as in Exercise 34.
If X is uniform over (0, 1), calculate E[X^n] and Var(X^n).
Let X and Y each take on either the value 1 or -1. Let p(1, 1) = P{X = 1, Y = 1}, p(1, -1) = P{X = 1, Y = -1}, p(-1, 1) = P{X = -1, Y = 1}, p(-1, -1) = P{X = -1, Y = -1}. Suppose that E[X] = E[Y] = 0. Show that (a) p(1, 1) = p(-1, -1); (b) p(1, -1) = p(-1, 1). Let p = 2p(1, 1). Find (c) Var(X), (d) Var(Y), (e) Cov(X, Y).
Let X be a positive random variable having density function f(x). If f(x) <= c for all x, show that, for a > 0, P{X > a} >= 1 - ac.
Calculate, without using moment generating functions, the variance of a binomial random variable with parameters n and p.
Suppose that X and Y are independent binomial random variables with parameters (n, p) and (m, p). Argue probabilistically (no computations necessary) that X + Y is binomial with parameters (n + m, p).
Suppose that X and Y are independent continuous random variables. Show that P{X > Y} = Integral of [1 - F_X(y)] f_Y(y) dy over all y.
Let X1, X2, X3, and X4 be independent continuous random variables with a common distribution function F and let p = P{X1 < X2 > X3 < X4} ... Calculate the moment generating function, E[X], and Var(X).
Calculate the moment generating function of a geometric random variable.
Show that the sum of independent identically distributed exponential random variables has a gamma distribution.
Use Chebyshev's inequality to prove the weak law of large numbers. Namely, if X1, X2, ... are independent and identically distributed with mean mu and variance sigma^2, then, for any epsilon > 0, P{|(X1 + ... + Xn)/n - mu| > epsilon} -> 0 as n -> infinity.
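A worked sketch of the argument: since the Xi are i.i.d., the sample mean satisfies E[(X1 + ... + Xn)/n] = mu and Var((X1 + ... + Xn)/n) = sigma^2/n, and Chebyshev's inequality then gives

```latex
P\left\{\left|\frac{X_1+\cdots+X_n}{n}-\mu\right|>\varepsilon\right\}
\;\le\; \frac{\operatorname{Var}\!\left(\bar{X}_n\right)}{\varepsilon^2}
\;=\; \frac{\sigma^2}{n\varepsilon^2}
\;\longrightarrow\; 0
\quad\text{as } n\to\infty.
```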
Suppose that X is a random variable with mean 10 and variance 15. What can we say about P{5 < X < 15}?
Let X1, X2, ..., X10 be independent Poisson random variables with mean 1. (i) Use the Markov inequality to get a bound on P{X1 + ... + X10 >= 15}. (ii) Use the central limit theorem to approximate P{X1 + ... + X10 >= 15}.
If X is normally distributed with mean 1 and variance 4, use the tables to find P{2 < X < 3}.
Show that lim as n -> infinity of e^(-n) * (sum over k = 0 to n of n^k / k!) = 1/2. Hint: Let X_n be Poisson with mean n. Use the central limit theorem to show that P{X_n <= n} -> 1/2.
Let X denote the number of white balls selected when k balls are chosen at random from an urn containing n white and m black balls. (i) Compute P{X = i}. (ii) Let, for i = 1, 2, ..., k and j = 1, 2, ..., n, X_i = 1 if the ith ball selected is white and 0 otherwise, and Y_j = 1 if the jth white ball is selected and 0 otherwise. Compute E[X] in two ways, by expressing X first in terms of the X_i and then in terms of the Y_j.
Show that Var(X) = 1 when X is the number of men that select their own hats in Example 2.31.
For the multinomial distribution (Exercise 17), let N_i denote the number of times outcome i occurs. Find (i) E[N_i]; (ii) Var(N_i); (iii) Cov(N_i, N_j); (iv) the expected number of outcomes that do not occur.
Let X1, X2, ... be a sequence of independent identically distributed continuous random variables. We say that a record occurs at time n if X_n > max(X1, ..., X_{n-1}). That is, X_n is a record if it is larger than each of X1, ..., X_{n-1}. Show that (i) P{a record occurs at time n} = 1/n; (ii) E[number of records by time n] = sum over i = 1 to n of 1/i.
Let psi(t1, ..., tn) denote the joint moment generating function of X1, ..., Xn. (a) Explain how the moment generating function of X_i, psi_{X_i}(t_i), can be obtained from psi(t1, ..., tn). (b) Show that X1, ..., Xn are independent if and only if psi(t1, ..., tn) = psi_{X1}(t1) * ... * psi_{Xn}(tn).
Three white and three black balls are distributed in two urns in such a way that each contains three balls. We say that the system is in state i, i = 0, 1, 2, 3, if the first urn contains i white balls. At each step, we draw one ball from each urn and place the ball drawn from the first urn into the ...
Suppose that whether or not it rains today depends on previous weather conditions through the last three days. Show how this system may be analyzed by using a Markov chain. How many states are needed?
In Exercise 2, suppose that if it has rained for the past three days, then it will rain today with probability 0.8; if it did not rain for any of the past three days, then it will rain today with probability 0.2; and in any other case the weather today will, with probability 0.6, be the same as the ...
Consider a process {X_n, n = 0, 1, ...} which takes on the values 0, 1, or 2. Suppose P{X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0} equals P^I_{ij} when n is even and P^II_{ij} when n is odd, where the rows of each of P^I and P^II sum to 1, i, j = 0, 1, 2. Is {X_n, n >= 0} a Markov chain? If not, then show how, by enlarging the state space, we may transform it into a Markov chain.
Let the transition probability matrix of a two-state Markov chain be given, as in Example 4.2, by
P =
p      1 - p
1 - p  p
Show by mathematical induction that the entry P^(n)_{11} of the n-step matrix equals 1/2 + (1/2)(2p - 1)^n.
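Before proving the closed form by induction, it can be sanity-checked numerically; the sketch below multiplies the 2x2 matrix out n times and compares entry (1,1) with 1/2 + (1/2)(2p - 1)^n:

```python
def mat_mult(A, B):
    # 2x2 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def check(p, n):
    P = [[p, 1 - p], [1 - p, p]]
    Pn = [[1.0, 0.0], [0.0, 1.0]]  # start from the identity
    for _ in range(n):
        Pn = mat_mult(Pn, P)
    closed_form = 0.5 + 0.5 * (2 * p - 1)**n
    return Pn[0][0], closed_form

print(check(0.7, 5))  # the two numbers should agree
```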
In Example 4.4 suppose that it has rained neither yesterday nor the day before yesterday. What is the probability that it will rain tomorrow?
Suppose that coin 1 has probability 0.7 of coming up heads, and coin 2 has probability 0.6 of coming up heads. If the coin flipped today comes up heads, then we select coin 1 to flip tomorrow, and if it comes up tails, then we select coin 2 to flip tomorrow. If the coin initially flipped is equally likely ...
Specify the classes of the following Markov chains, and determine whether they are transient or recurrent: P1 = ..., P2 = ..., P3 = ..., P4 = ...
Prove that if the number of states in a Markov chain is M, and if state j can be reached from state i, then it can be reached in M steps or less.
Show that if state i is recurrent and state i does not communicate with state j, then P_ij = 0. This implies that once a process enters a recurrent class of states it can never leave that class. For this reason, a recurrent class is often referred to as a closed class.
For the random walk of Example 4.13 use the strong law of large numbers to give another proof that the Markov chain is transient when p is not 1/2. Hint: Note that the state at time n can be written as the sum of Y_1, ..., Y_n, where the Y_i are independent and P{Y_i = 1} = p = 1 - P{Y_i = -1}. Argue that if p > 1/2, then, by the strong law of ...
Coin 1 comes up heads with probability 0.6 and coin 2 with probability 0.5. A coin is continually flipped until it comes up tails, at which time that coin is put aside and we start flipping the other one. (a) What proportion of flips use coin 1? (b) If we start the process with coin 1 what is the ...
For Example 4.4, calculate the proportion of days that it rains.
A transition probability matrix P is said to be doubly stochastic if the sum over each column equals one; that is, the sum over i of P_ij equals 1 for all j. If such a chain is irreducible and aperiodic and consists of M + 1 states 0, 1, ..., M, show that the limiting probabilities are given by pi_j = 1/(M + 1), j = 0, 1, ..., M.
Let Y_n be the sum of n independent rolls of a fair die. Find the limit as n -> infinity of P{Y_n is a multiple of 13}. Hint: Define an appropriate Markov chain and apply the results of Exercise 14.
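Following the hint, Y_n mod 13 is an irreducible, aperiodic Markov chain on {0, ..., 12} whose transition matrix is doubly stochastic (each step adds a value uniform on {1, ..., 6}), so by the doubly stochastic result the limit is 1/13. A quick power-iteration check:

```python
M = 13
dist = [0.0] * M
dist[0] = 1.0  # Y_0 = 0
for _ in range(300):  # iterate the chain's transition operator
    new = [0.0] * M
    for i, mass in enumerate(dist):
        for face in range(1, 7):  # fair die: faces 1..6, each prob 1/6
            new[(i + face) % M] += mass / 6
    dist = new
print(dist[0])  # ≈ 1/13 ≈ 0.076923
```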
Each morning an individual leaves his house and goes for a run. He is equally likely to leave either from his front or back door. Upon leaving the house, he chooses a pair of running shoes (or goes running barefoot if there are no shoes at the door from which he departed). On his return he is ...
Consider the following approach to shuffling a deck of n cards. Starting with any initial ordering of the cards, one of the numbers 1, 2, ..., n is randomly chosen in such a manner that each one is equally likely to be selected. If number i is chosen, then we take the card that is in position i and put ...
Determine the limiting probabilities pi_i for the model presented in Exercise 1. Give an intuitive explanation of your answer.
For a series of dependent trials the probability of success on any trial is (k + 1)/(k + 2) where k is equal to the number of successes on the previous two trials. Compute the limit as n -> infinity of P{success on the nth trial}.
An organization has N employees where N is a large number. Each employee has one of three possible job classifications and changes classifications (independently) according to a Markov chain with transition probabilities
0.7 0.2 0.1
0.2 0.6 0.2
0.1 0.4 0.5
What percentage of employees are in each ...
Three out of every four trucks on the road are followed by a car, while only one out of every five cars is followed by a truck. What fraction of vehicles on the road are trucks?
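Modeling the vehicle sequence as a two-state Markov chain (truck, car) with P(truck -> car) = 3/4 and P(car -> truck) = 1/5, the long-run fraction of trucks is the stationary probability of the truck state; solving pi_T = (1/4)pi_T + (1/5)(1 - pi_T) gives pi_T = 4/19. A power-iteration check of that arithmetic:

```python
# states: 0 = truck, 1 = car
P = [[0.25, 0.75],   # a truck is followed by a car 3 of 4 times
     [0.20, 0.80]]   # a car is followed by a truck 1 of 5 times
pi = [0.5, 0.5]
for _ in range(200):  # iterate pi <- pi * P until it stabilizes
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]
print(pi[0])  # ≈ 4/19 ≈ 0.2105
```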
A certain town never has two sunny days in a row. Each day is classified as being either sunny, cloudy (but dry), or rainy. If it is sunny one day, then it is equally likely to be either cloudy or rainy the next day. If it is rainy or cloudy one day, then there is one chance in two that it will be ...
Each of two switches is either on or off during a day. On day n, each switch will independently be on with probability [1 + number of on switches during day n - 1]/4. For instance, if both switches are on during day 1, then each will independently be on during day 2 with probability 3/4. What ...
A professor continually gives exams to her students. She can give three possible types of exams, and her class is graded as either having done well or badly. Let p_i denote the probability that the class does well on a type i exam, and suppose that p_1 = 0.3, p_2 = 0.6, and p_3 = 0.9. If the class does ...
A flea moves around the vertices of a triangle in the following manner: Whenever it is at vertex i it moves to its clockwise neighbor vertex with probability p_i, and to the counterclockwise neighbor with probability q_i = 1 - p_i, i = 1, 2, 3. (a) Find the proportion of time that the flea is at each of the ...
Consider a Markov chain with states 0, 1, 2, 3, 4. Suppose P_{0,4} = 1; and suppose that when the chain is in state i, i > 0, the next state is equally likely to be any of the states 0, 1, ..., i - 1. Find the limiting probabilities of this Markov chain.
Let pi_i denote the long-run proportion of time a given Markov chain is in state i. (a) Explain why pi_i is also the proportion of transitions that are into state i as well as being the proportion of transitions that are from state i. (b) pi_i P_ij represents the proportion of transitions that satisfy what ...
Let A be a set of states, and let A^c be the remaining states. (a) What is the interpretation of the sum over i in A and j in A^c of pi_i P_ij? (b) What is the interpretation of the sum over i in A^c and j in A of pi_i P_ij? (c) Explain the identity: sum over i in A, j in A^c of pi_i P_ij = sum over i in A^c, j in A of pi_i P_ij.
Each day, one of n possible elements is requested, the ith one with probability P_i, i >= 1, where the P_i sum to 1. These elements are at all times arranged in an ordered list which is revised as follows: the element selected is moved to the front of the list with the relative positions of all the other elements remaining unchanged.
Suppose that a population consists of a fixed number, say, m, of genes in any generation. Each gene is one of two possible genetic types. If any generation has exactly i (of its m) genes being type 1, then the next generation will have j type 1 (and m - j type 2) genes with probability C(m, j) (i/m)^j ((m - i)/m)^(m - j) ...
Consider an irreducible finite Markov chain with states 0, 1, ..., N. (a) Starting in state i, what is the probability the process will ever visit state j? Explain! (b) Let x_i = P{visit state N before state 0 | start in i}. Compute a set of linear equations which the x_i satisfy, i = 0, 1, ..., N. (c) If the sum over j of j P_ij equals i ...
An individual possesses r umbrellas which he employs in going from his home to office, and vice versa. If he is at home (the office) at the beginning (end) of a day and it is raining, then he will take an umbrella with him to the office (home), provided there is one to be taken. If it is not ...
Let {X_n, n >= 0} denote an ergodic Markov chain with limiting probabilities pi_i. Define the process {Y_n, n >= 1} by Y_n = (X_{n-1}, X_n). That is, Y_n keeps track of the last two states of the original chain. Is {Y_n, n >= 1} a Markov chain? If so, determine its transition probabilities and find the limit as n -> infinity of P{Y_n = (i, j)}.
Verify the transition probability matrix given in Example 4.18.
Let P^(1) and P^(2) denote transition probability matrices for ergodic Markov chains having the same state space. Let pi^(1) and pi^(2) denote the stationary (limiting) probability vectors for the two chains. Consider a process defined as follows: (i) X_0 = 1. A coin is then flipped and if it comes up heads, then the ...
A fair coin is continually flipped. Compute the expected number of flips until the following patterns appear: (a) HHTTHT *(b) HHTTHH (c) HHTHHT
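One standard route is the "fair casino" martingale argument, which for a fair coin reduces to Conway's leading-number rule: E[T] is the sum of 2^k over every k for which the pattern's length-k prefix equals its length-k suffix. A sketch:

```python
def expected_flips(pattern):
    # sum 2^k over all k where the length-k suffix equals the length-k prefix
    n = len(pattern)
    return sum(2**k for k in range(1, n + 1) if pattern[:k] == pattern[-k:])

for pat in ("HHTTHT", "HHTTHH", "HHTHHT"):
    print(pat, expected_flips(pat))
```

HHTTHT has no self-overlap, so only k = 6 contributes (E = 64); HHTTHH also overlaps at k = 1 and k = 2 (E = 64 + 4 + 2 = 70); HHTHHT overlaps at k = 3 (E = 64 + 8 = 72).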
Consider the Ehrenfest urn model in which M molecules are distributed among two urns, and at each time point one of the molecules is chosen at random and is then removed from its urn and placed in the other one. Let X_n denote the number of molecules in urn 1 after the nth switch and let mu_n = E[X_n]. ...
Consider a population of individuals each of whom possesses two genes which can be either type A or type a. Suppose that in outward appearance type A is dominant and type a is recessive. (That is, an individual will only have the outward characteristics of the recessive gene if its pair is aa.)
Suppose that on each play of the game a gambler either wins 1 with probability p or loses 1 with probability 1 - p. The gambler continues betting until she or he is either winning n or losing m. What is the probability that the gambler quits a winner?
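A sketch verifying the classical answer, P{win} = [1 - (q/p)^m] / [1 - (q/p)^(m+n)] for p != 1/2 (and m/(m + n) for p = 1/2), against a direct iterative solve of the hitting-probability equations x_i = p x_{i+1} + q x_{i-1}:

```python
def win_prob_formula(m, n, p):
    # gambler starts with m, stops when up n (fortune m + n) or ruined (0)
    q = 1 - p
    if abs(p - 0.5) < 1e-12:
        return m / (m + n)
    r = q / p
    return (1 - r**m) / (1 - r**(m + n))

def win_prob_solve(m, n, p, sweeps=20000):
    # Gauss-Seidel sweeps on x_i = p x_{i+1} + q x_{i-1}, x_0 = 0, x_N = 1
    N, q = m + n, 1 - p
    x = [0.0] * (N + 1)
    x[N] = 1.0
    for _ in range(sweeps):
        for i in range(1, N):
            x[i] = p * x[i + 1] + q * x[i - 1]
    return x[m]

print(win_prob_formula(3, 4, 0.6), win_prob_solve(3, 4, 0.6))
```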
A particle moves among n + 1 vertices that are situated on a circle in the following manner: At each step it moves one step either in the clockwise direction with probability p or the counterclockwise direction with probability q = 1 - p. Starting at a specified state, call it state 0, let T be the ...
In the gambler's ruin problem of Section 4.5.1, suppose the gambler's fortune is presently i, and suppose that we know that the gambler's fortune will eventually reach N (before it goes to 0). Given this information, show that the probability he wins the next gamble is p[1 - (q/p)^(i+1)] / [1 - (q/p)^i] if p != 1/2, and (i + 1)/(2i) if p = 1/2.
For the gambler's ruin model of Section 4.5.1, let M_i denote the mean number of games that must be played until the gambler either goes broke or reaches a fortune of N, given that he starts with i, i = 0, 1, ..., N. Show that M_i satisfies M_0 = M_N = 0; M_i = 1 + p M_{i+1} + q M_{i-1}, i = 1, ..., N - 1.
Solve the equations given in Exercise 43 to obtain M_i = i(N - i) if p = 1/2, and M_i = i/(q - p) - [N/(q - p)] * [1 - (q/p)^i] / [1 - (q/p)^N] if p != 1/2.
In Exercise 15, (a) what is the expected number of steps the particle takes to return to the starting position? (b) what is the probability that all other positions are visited before the particle returns to its starting state?
For the Markov chain with states 1, 2, 3, 4 whose transition probability matrix P is as specified below, find f_{i3} and s_{i3} for i = 1, 2, 3.
P =
0.4 0.2 0.1 0.3
0.1 0.5 0.2 0.2
0.3 0.4 0.2 0.1
0   0   0   1
Consider a branching process having ...
In a branching process having X_0 = 1 and mu > 1, prove that pi_0 is the smallest positive number satisfying Equation (4.15). Hint: Let pi be any solution of pi = sum over j of pi^j P_j. Show by mathematical induction that pi >= P{X_n = 0} for all n, and let n -> infinity. In using the induction argue that P{X_n = 0} = sum over j of (P{X_{n-1} = 0})^j P_j.
For a branching process, calculate pi_0 when (a) P_0 = ..., P_2 = ...; (b) P_0 = ..., P_1 = ..., P_2 = ...; (c) P_0 = ..., P_1 = ..., P_2 = ...
At all times, an urn contains N balls, some white and some black. At each stage, a coin having probability p, 0 < p < 1, of landing heads is flipped ... (a) Is {X_n, n >= 0} a Markov chain? If so, explain why. (b) What are its classes? What are their periods? Are they transient or recurrent? (c) Compute the transition probabilities P_ij.
(a) Show that the limiting probabilities of the reversed Markov chain are the same as for the forward chain by showing that they satisfy the equations pi_j = sum over i of pi_i Q_ij. (b) Give an intuitive explanation for the result of part (a).
M balls are initially distributed among m urns. At each stage one of the balls is selected at random, taken from whichever urn it is in, and then placed, at random, in one of the other m - 1 urns. Consider the Markov chain whose state at any time is the vector (n_1, ..., n_m) where n_i denotes the number ...
It follows from Theorem 4.2 that for a time reversible Markov chain P_ij P_jk P_ki = P_ik P_kj P_ji for all i, j, k. It turns out that if the state space is finite and P_ij > 0 for all i, j, then the preceding is also a sufficient condition for time reversibility. (That is, in this case, we need only check Equation (4.26) ...
For a time reversible Markov chain, argue that the rate at which transitions from i to j to k occur must equal the rate at which transitions from k to j to i occur.
Show that the Markov chain of Exercise 23 is time reversible.
A group of n processors are arranged in an ordered list. When a job arrives, the first processor in line attempts it; if it is unsuccessful, then the next in line tries it; if it too is unsuccessful, then the next in line tries it, and so on. When the job is successfully processed or after all ...
A Markov chain is said to be a tree process if (i) P_ij > 0 whenever P_ji > 0, and (ii) for every pair of states i and j, i != j, there is a unique sequence of distinct states i = i_0, i_1, ..., i_{n-1}, i_n = j such that P(i_k, i_{k+1}) > 0, k = 0, 1, ..., n - 1. In other words, a Markov chain is a tree process if for every pair of distinct states there is ...
On a chessboard compute the expected number of plays it takes a knight, starting in one of the four corners of the chessboard, to return to its initial position if we assume that at each play it is equally likely to choose any of its legal moves. (No other pieces are on the board.) Hint: Make use ...
In a Markov decision problem, another criterion often used, different than the expected average return per unit time, is that of the expected discounted return. In this criterion we choose a number alpha, 0 < alpha < 1, and try to choose a policy so as to maximize E[sum over n >= 0 of alpha^n R(X_n, a_n)]. (That is, rewards at time ...
(a) From the results of Section 3.6.3 we can conclude that there are C(n + m - 1, m - 1) nonnegative integer valued solutions of the equation x_1 + ... + x_m = n. Prove this directly. (b) How many positive integer valued solutions of x_1 + ... + x_m = n are there? Hint: Let y_i = x_i - 1. (c) For the Bose-Einstein distribution, compute ...
Consider the random graph of Section 3.6.2 when n = 5. Compute the probability distribution of the number of components and verify your solution by using it to compute E[C] and then comparing your solution with the formula E[C] = sum over k = 1 to n of (k - 1)! ...
In the list problem, when the P_i are known, show that the best ordering (best in the sense of minimizing the expected position of the element requested) is to place the elements in decreasing order of their probabilities. That is, if P_1 > P_2 > ... > P_n, show that 1, 2, ..., n is the best ordering.
In the list example of Section 3.6.1 suppose that the initial ordering at time 0 is determined completely at random; that is, initially all n! permutations are equally likely. Following the front of the line rule, compute the expected position of the element requested at time t. Hint: To compute ...
An urn contains n balls, with ball i having weight w_i, i = 1, ..., n. The balls are withdrawn from the urn one at a time according to the following scheme: when S is the set of balls that remains, ball i, i in S, is the next ball withdrawn with probability w_i / (sum over j in S of w_j). Find the expected number of balls ...
A coin that comes up heads with probability p is flipped n consecutive times. What is the probability that starting with the first flip there are always more heads than tails that have appeared?
An urn contains n white and m black balls which are removed one at a time. If n > m, show that the probability that there are always more white than black balls in the urn (until, of course, the urn is empty) equals (n - m)/(n + m). Explain why this probability is equal to the probability that the set of ...
In the ballot problem (Example 3.23), compute P{A is never behind}.