Probability An Introduction 2nd Edition Geoffrey Grimmett, Dominic Welsh - Solutions
Example 3.22 Suppose that X has distribution given by P(X = −1) = P(X = 0) = P(X = 1) = 1/3, and Y is given by Y = 0 if X = 0, and Y = 1 if X ≠ 0.
Exercise 3.9 The pair of discrete random variables (X, Y) has joint mass function P(X = i, Y = j) = θ^(i+j+1) if i, j = 0, 1, 2, and 0 otherwise, for some value of θ. Show that θ satisfies the equation θ + 2θ^2 + 3θ^3 + 2θ^4 + θ^5 = 1.
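The normalization equation in Exercise 3.9 can be checked numerically. This is a sketch of mine, not part of the book: it finds the root of θ + 2θ^2 + 3θ^3 + 2θ^4 + θ^5 = 1 in (0, 1) by bisection and confirms that the joint mass function sums to 1 at that θ.

```python
# Numerical check of the normalization condition in Exercise 3.9.
# The joint mass function P(X=i, Y=j) = theta^(i+j+1) over i, j in {0, 1, 2}
# must sum to 1, giving theta + 2t^2 + 3t^3 + 2t^4 + t^5 = 1.

def f(t):
    # LHS minus RHS of the normalization equation
    return t + 2*t**2 + 3*t**3 + 2*t**4 + t**5 - 1

# Bisection on (0, 1): f(0) = -1 < 0 and f(1) = 8 > 0, so a root exists.
lo, hi = 0.0, 1.0
for _ in range(100):
    mid = (lo + hi) / 2
    if f(mid) < 0:
        lo = mid
    else:
        hi = mid
theta = (lo + hi) / 2

# Verify that the joint mass function sums to 1 at this theta.
total = sum(theta**(i + j + 1) for i in range(3) for j in range(3))
```
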
Exercise 3.8 Two cards are drawn at random from a deck of 52 cards. If X denotes the number of aces drawn and Y denotes the number of kings, display the joint mass function of X and Y in the tabular form of Table 3.1.
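For Exercise 3.8, the joint mass function can be tabulated by direct counting. A minimal sketch (the hypergeometric-style counting argument is standard; the variable names are mine): choose i of the 4 aces, j of the 4 kings, and the rest of the 2-card hand from the other 44 cards.

```python
# Joint mass function of X (aces) and Y (kings) when 2 cards are drawn
# at random from a 52-card deck.
from math import comb

pmf = {}
for i in range(3):          # number of aces drawn
    for j in range(3 - i):  # number of kings drawn; i + j <= 2
        # i of 4 aces, j of 4 kings, remaining cards from the other 44
        pmf[(i, j)] = comb(4, i) * comb(4, j) * comb(44, 2 - i - j) / comb(52, 2)

total = sum(pmf.values())
```
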
Similar ideas apply to families X = (X_1, X_2, . . . , X_n) of discrete random variables on a probability space. For example, the joint mass function of X is the function p_X defined by p_X(x) = P(X_1 = x_1, X_2 = x_2, . . . , X_n = x_n) for x = (x_1, x_2, . . . , x_n) ∈ R^n.
Example 3.7 Suppose that X and Y are random variables each taking the values 1, 2, or 3, and that the probability that the pair (X, Y) equals (x, y) is given in Table 3.1 for all relevant values of x and y. Then, for example, P(X = 3) = P(X = 3, Y = 1) + P(X = 3, Y = 2) + P(X = 3, Y = 3) = 1/6 + 5/…
where m ≤ n ≤ N − a + m. Hence, show that (a/N) C(a − 1, m − 1) [(N − a)!/(N − 1)!] Σ_{n=m}^{N−a+m} (n − 1)! (N − n)! / [(n − m)! (N − a + m − n)!] = 1, and that the expectation of n is m(N + 1)/(a + 1). (Oxford 1972M)
10. A population of N animals has had a certain number a of its members captured, marked, and then released. Show that the probability P_n that it is necessary to capture n animals in order to obtain m which have been marked is P_n = (a/N) C(a − 1, m − 1) C(N − a, n − m) / C(N − 1, n − 1),
9. The probability of obtaining a head when a certain coin is tossed is p. The coin is tossed repeatedly until n heads occur in a row. Let X be the total number of tosses required for this to happen. Find the expected value of X.
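A numerical sketch for problem 9. It uses the one-step recursion E_r = (E_{r−1} + 1)/p for the expected wait for a run of r heads (an assumption consistent with the conditioning argument of problem 2: after a run of r − 1, one more toss either completes the run or restarts it), and compares with the closed form (1 − p^n)/(q p^n).

```python
# Expected number of tosses until n heads occur in a row, coin with P(head) = p.

def expected_tosses(n, p):
    # recursion E_r = (E_{r-1} + 1) / p, starting from E_0 = 0
    m = 0.0
    for _ in range(n):
        m = (m + 1) / p
    return m

def closed_form(n, p):
    # candidate closed form: (1 - p^n) / (q * p^n), with q = 1 - p
    q = 1 - p
    return (1 - p**n) / (q * p**n)
```

For a fair coin the familiar values come out: 2 tosses for one head, 6 for two heads in a row, 14 for three.
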
* 8. An ambidextrous student has a left and a right pocket, each initially containing n humbugs. Each time he feels hungry, he puts a hand into one of his pockets and, if it is not empty, he takes a humbug from it and eats it. On each occasion, he is equally likely to choose either the left or right
7. Coupon-collecting problem. There are c different types of coupon, and each coupon obtained is equally likely to be any one of the c types. Find the probability that the first n coupons which you collect do not form a complete set, and deduce an expression for the mean number of coupons you will
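The coupon-collecting probability in problem 7 can be written by inclusion-exclusion, and the mean number of coupons to complete a set is c(1 + 1/2 + · · · + 1/c). This is a sketch check of both facts (function names are mine), with a brute-force enumeration for a small case.

```python
# Coupon collector: probability the first n coupons are incomplete, and the
# mean number of coupons needed for a full set of c types.
from itertools import product
from math import comb

def p_incomplete(c, n):
    # P(at least one type missing among n coupons), by inclusion-exclusion
    return sum((-1)**(j + 1) * comb(c, j) * (1 - j / c)**n
               for j in range(1, c + 1))

def mean_coupons(c):
    # mean number of coupons to complete a set: c * (1 + 1/2 + ... + 1/c)
    return c * sum(1 / i for i in range(1, c + 1))

# Brute-force check for c = 3, n = 4: enumerate all 3^4 equally likely sequences.
c, n = 3, 4
misses = sum(1 for seq in product(range(c), repeat=n) if len(set(seq)) < c)
brute = misses / c**n
```
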
A fair die having two faces coloured blue, two red, and two green is thrown repeatedly. Find the probability that not all colours occur in the first k throws. Deduce that, if N is the random variable which takes the value n if all three colours occur in the first n throws but only two of the colours … 2. (Oxford 1979M)
We say that X has the 'lack-of-memory property' since, if we are given that X − m > 0, then the distribution of X − m is the same as the original distribution of X. Show that the geometric distribution is the only distribution concentrated on the positive integers with the lack-of-memory property.
5. Lack-of-memory property. If X has the geometric distribution with parameter p, show that P(X > m + n | X > m) = P(X > n) for m, n = 0, 1, 2, . . . .
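An exact arithmetic check of the lack-of-memory property in problem 5, a sketch of mine using rationals: with P(X = k) = p q^(k−1), the tail is P(X > k) = q^k, so P(X > m + n)/P(X > m) = q^n = P(X > n).

```python
# Lack-of-memory property of the geometric distribution, checked exactly.
from fractions import Fraction

p = Fraction(1, 3)
q = 1 - p

def tail(k):
    # P(X > k) = 1 - sum_{j=1}^{k} p * q^(j-1), computed exactly
    return 1 - sum(p * q**(j - 1) for j in range(1, k + 1))

m, n = 4, 7
lhs = tail(m + n) / tail(m)   # P(X > m+n | X > m)
rhs = tail(n)                 # P(X > n)
```
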
4. For what values of c and α is the function p, defined by p(k) = c k^α for k = 1, 2, . . . , and 0 otherwise, a mass function?
3. If X is a discrete random variable and E(X^2) = 0, show that P(X = 0) = 1. Deduce that, if var(X) = 0, then P(X = μ) = 1, whenever μ = E(X) is finite.
2. Each toss of a coin results in heads with probability p (> 0). If m(r) is the mean number of tosses up to and including the r-th head, show that m(r) = p(1 + m(r − 1)) + (1 − p)(1 + m(r)) for r = 1, 2, . . . , with the convention that m(0) = 0. Solve this difference equation by the method
1. If X has the Poisson distribution with parameter λ, show that E(X(X − 1)(X − 2) · · · (X − k)) = λ^(k+1) for k = 0, 1, 2, . . . .
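A numerical sketch for problem 1: the product X(X − 1) · · · (X − k) has k + 1 factors, which is why the answer is λ^(k+1). The sum is truncated (an approximation, noted in the comments), and the Poisson probabilities are built iteratively to avoid large factorials.

```python
# Falling-factorial moments of the Poisson distribution:
# E[X(X-1)...(X-k)] should equal lambda^(k+1) (k+1 factors in the product).
from math import exp

def falling_factorial_moment(lam, k, terms=100):
    # truncated sum over x = 0, ..., terms-1; the tail is negligible
    # for moderate lam
    total = 0.0
    pmf = exp(-lam)               # P(X = 0)
    for x in range(terms):
        prod = 1.0
        for j in range(k + 1):    # factors x, x-1, ..., x-k
            prod *= (x - j)
        total += prod * pmf
        pmf *= lam / (x + 1)      # advance to P(X = x+1)
    return total
```
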
Exercise 2.39 Find E(X) and E(X^2) when X has the Poisson distribution with parameter λ, and hence show that the Poisson distribution has variance equal to its mean.
Exercise 2.38 Show that var(aX + b) = a^2 var(X) for a, b ∈ R.
Exercise 2.37 If X has the binomial distribution with parameters n and p = 1 − q, show that E(X) = np, E(X^2) = npq + n^2 p^2, and deduce the variance of X.
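The moments in Exercise 2.37 can be confirmed by summing directly over the binomial mass function. A sketch (function name is mine) for one choice of n and p:

```python
# Direct check of E(X) = np and E(X^2) = npq + n^2 p^2 for the binomial.
from math import comb

def binom_moments(n, p):
    q = 1 - p
    pmf = [comb(n, k) * p**k * q**(n - k) for k in range(n + 1)]
    ex = sum(k * pk for k, pk in enumerate(pmf))
    ex2 = sum(k * k * pk for k, pk in enumerate(pmf))
    return ex, ex2

n, p = 10, 0.3
ex, ex2 = binom_moments(n, p)
```

The variance then follows as E(X^2) − E(X)^2 = npq.
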
Example 2.36 If X has the geometric distribution with parameter p (= 1 − q), the mean of X is E(X) = Σ_{k=1}^∞ k p q^(k−1) = p/(1 − q)^2 = 1/p, and the variance of X is var(X) = Σ_{k=1}^∞ k^2 p q^(k−1) − 1/p^2 by (2.35). Now, Σ_{k=1}^∞ k^2 q^(k−1) = …
We note that, by Theorem 2.29, var(X) = Σ_{x∈Im X} (x − μ)^2 P(X = x), (2.34) where μ = E(X). A rough motivation for this definition is as follows. If the dispersion of X about its expectation is very small, then |X − μ| tends to be small, giving that var(X) = E(|X − μ|^2) is small also; on the
The expectation E(X) of a discrete random variable X is an indication of the 'centre' of the distribution of X. Another important quantity associated with X is the 'variance' of X, and this is a measure of the degree of dispersion of X about its expectation E(X). Definition 2.32 The variance
Example 2.31 Suppose that X is a random variable with the Poisson distribution, parameter λ, and we wish to find the expected value of Y = e^X. Without Theorem 2.29 we would have to find the mass function of Y. Actually this is not difficult, but it is even easier to apply the theorem to find that
Here is an example of Theorem 2.29 in action.
if the last sum converges absolutely. Two simple but useful properties of expectation are as follows. Theorem 2.30 Let X be a discrete random variable and let a, b ∈ R. (a) If P(X ≥ 0) = 1 and E(X) = 0, then P(X = 0) = 1. (b) We have that E(aX + b) = aE(X) + b. Proof (a) Suppose the assumptions
whenever this sum converges absolutely. Intuitively, this result is rather clear, since g(X) takes the value g(x) when X takes the value x, an event which has probability P(X = x). A more formal proof proceeds as follows. Proof Writing I for the image of X, we have that Y = g(X) has image g(I). Thus
If X is a discrete random variable (on some probability space) and g : R → R, then Y = g(X) is a discrete random variable also. According to the above definition, we need to know the mass function of Y before we can calculate its expectation. The following theorem provides a useful way of
and the expectation of X is often called the expected value or mean of X. The reason for requiring absolute convergence in (2.28) is that the image Im X may be an infinite set, and we … (Footnote: One should be careful to avoid ambiguity in the use (or not) of parentheses. For example, we shall sometimes
Equation (2.28) is often written E(X) = Σ_x x P(X = x) = Σ_x x p_X(x),
whenever this sum converges absolutely, in that Σ_x |x P(X = x)| < ∞.
which we call the mean value. This notion of mean value is easily extended to more general distributions as follows. Definition 2.27 If X is a discrete random variable, the expectation of X is denoted by E(X) and defined by E(X) = Σ_{x∈Im X} x P(X = x) (2.28)
Exercise 2.26 Let X be a discrete random variable having the Poisson distribution with parameter λ, and let Y = |sin(πX/2)|. Find the mass function of Y. 2.4 Expectation Consider a fair die. If it were thrown a large number of times, each of the possible outcomes 1, 2, . . . , 6 would appear
since there are only countably many non-zero contributions to this sum. Thus, if Y = aX + b with a ≠ 0, then P(Y = y) = P(aX + b = y) = P(X = a^(−1)(y − b)) for y ∈ R, while if Y = X^2, then P(Y = y) = P(X = √y) + P(X = −√y) if y > 0, P(X = 0) if y = 0, and 0 if y < 0.
Simple examples: if g(x) = ax + b then g(X) = aX + b; if g(x) = cx^2 then g(X) = cX^2. If Y = g(X), the mass function of Y is given by p_Y(y) = P(Y = y) = P(g(X) = y) = P(X ∈ g^(−1)(y)) = Σ_{x∈g^(−1)(y)} P(X = x), (2.25)
2.3 Functions of discrete random variables Let X be a discrete random variable on the probability space (Ω, F, P) and let g : R → R. It is easy to check that Y = g(X) is a discrete random variable on (Ω, F, P) also, defined by Y(ω) = g(X(ω)) for ω ∈ Ω.
Exercise 2.24 If X is a discrete random variable having the geometric distribution with parameter p, show that the probability that X is greater than k is (1 − p)^k.
Exercise 2.23 If X is a discrete random variable having the Poisson distribution with parameter λ, show that the probability that X is even is e^(−λ) cosh λ.
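A quick numerical sketch of Exercise 2.23: summing λ^k/k! over even k gives cosh λ, so P(X even) = e^(−λ) cosh λ. The Poisson probabilities are again built iteratively, and the sum is truncated where the tail is negligible.

```python
# Check P(X even) = exp(-lam) * cosh(lam) for X ~ Poisson(lam).
from math import exp, cosh

lam = 1.5
pmf = exp(-lam)          # P(X = 0)
p_even = 0.0
for k in range(200):     # truncated sum; tail is negligible for lam = 1.5
    if k % 2 == 0:
        p_even += pmf
    pmf *= lam / (k + 1)
```
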
2.22 If we carry on tossing the coin in the previous example until the nth head has turned up, then a similar argument shows that, if p ∈ (0, 1), the total number of tosses required has the negative binomial distribution with parameters n and p. △
Let Y be the total number of tosses in this experiment, so that Y(T^k H) = k + 1 for 0 ≤ k < ∞ and Y(T^∞) = ∞. If p > 0, then P(Y = k) = P(T^(k−1) H) = p q^(k−1) for k = 1, 2, . . . , showing that Y has the geometric distribution with parameter p. △
10^(−5). It may be easier (and not too inaccurate) to use (2.20) rather than (2.19) to calculate probabilities. In this case, λ = np = 10 and so, for example, P(S_n = 10) ≈ (1/10!)(10 e^(−1))^10 ≈ 0.125. △ Example 2.21 Suppose that we toss the coin of
This approximation may be useful in practice. For example, consider a single page of the Guardian newspaper containing, say, 10^6 characters, and suppose that the typesetter flips a coin before setting each character and then deliberately mis-sets this character whenever the coin comes up heads. If
If n is very large and p is very small but np is a 'reasonable size' (np = λ, say) then the distribution of S_n may be approximated by the Poisson distribution with parameter λ, as follows. For fixed k ≥ 0, write p = λ/n and suppose that n is large to find that P(S_n = k) = C(n, k) p^k (1 − …
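The Poisson approximation described here is easy to see numerically. A sketch of mine (the specific n, p, and cut-off are arbitrary choices): compare binomial and Poisson probabilities for n = 1000, p = 0.01, λ = 10.

```python
# Poisson approximation to the binomial: n large, p small, np = lambda.
from math import comb, exp, factorial

n, p = 1000, 0.01
lam = n * p
max_diff = 0.0
for k in range(40):   # probabilities beyond k = 40 are negligible here
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    poiss = exp(-lam) * lam**k / factorial(k)
    max_diff = max(max_diff, abs(binom - poiss))
```
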
C(n, k) p^k q^(n−k), (2.19) and so S_n has the binomial distribution with parameters n and p.
which is to say that S_n(ω) = X_1(ω) + X_2(ω) + · · · + X_n(ω). Clearly, S_n is the total number of heads which occur, and S_n takes values in {0, 1, . . . , n} since each X_i equals 0 or 1. Also, for k = 0, 1, . . . , n, we have that P(S_n = k) = P({ω ∈ Ω : h(ω) = k}) = Σ_{ω: h(ω)=k} P(ω) = C(n, k) …
Hence, each X_i has the Bernoulli distribution with parameter p. We have derived this fact in a cumbersome manner, but we believe these details to be instructive. Let S_n = X_1 + X_2 + · · · + X_n,
where ω_i is the i-th entry in ω. Thus P(X_i = 0) = Σ_{ω: ω_i=T} p^(h(ω)) q^(n−h(ω)) = Σ_{h=0}^{n−1} Σ_{ω: ω_i=T, h(ω)=h} p^h q^(n−h) = Σ_{h=0}^{n−1} C(n − 1, h) p^h q^(n−h) = q(p + q)^(n−1) = q and P(X_i = 1) = 1 − P(X_i = 0) = p.
P(ω) = p^(h(ω)) q^(t(ω)), where h(ω) is the number of heads in ω and t(ω) = n − h(ω) is the number of tails. Similarly, for any A ∈ F, P(A) = Σ_{ω∈A} P(ω). For i = 1, 2, . . . , n, we define the discrete random variable X_i by X_i(ω) = 1 if the i-th entry in ω is H, and 0 if the i-th entry in ω
Example 2.18 Here is an example of some of the above distributions in action. Suppose that a coin is tossed n times and there is probability p that heads appears on each toss. Representing heads by H and tails by T, the sample space is the set of all ordered sequences of length n containing the
As before, note that Σ_{k=n}^∞ C(k − 1, n − 1) p^n q^(k−n) = p^n Σ_{l=0}^∞ C(n + l − 1, l) q^l, where l = k − n, = p^n Σ_{l=0}^∞ C(−n, l) (−q)^l = p^n (1 − q)^(−n) = 1, using the binomial expansion of (1 − q)^(−n); see Theorem A.3.
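The fact that the negative binomial mass function sums to 1 can also be seen numerically. A sketch of mine (truncating the infinite sum, which is an approximation; the geometric tail makes the truncation error negligible):

```python
# Numerical check that the negative binomial mass function
# P(X = k) = C(k-1, n-1) * p^n * q^(k-n), k = n, n+1, ..., sums to 1.
from math import comb

def neg_binom_total(n, p, terms=2000):
    # truncated sum over k = n, ..., n + terms - 1
    q = 1 - p
    return sum(comb(k - 1, n - 1) * p**n * q**(k - n)
               for k in range(n, n + terms))

total = neg_binom_total(3, 0.4)
```
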
Negative binomial distribution. We say that X has the negative binomial distribution with parameters n and p ∈ (0, 1) if X takes values in {n, n + 1, n + 2, . . . } and P(X = k) = C(k − 1, n − 1) p^n q^(k−n) for k = n, n + 1, n + 2, . . . . (2.17)
Geometric distribution. We say that X has the geometric distribution with parameter p ∈ (0, 1) if X takes values in {1, 2, 3, . . . } and P(X = k) = p q^(k−1) for k = 1, 2, 3, . . . . (2.16) As before, note that Σ_{k=1}^∞ p q^(k−1) = p/(1 − q) = 1.
Poisson distribution. We say that X has the Poisson distribution with parameter λ (> 0) if X takes values in {0, 1, 2, . . . } and P(X = k) = (1/k!) λ^k e^(−λ) for k = 0, 1, 2, . . . . (2.15) Again, this gives rise to a mass function since Σ_{k=0}^∞ (1/k!) λ^k e^(−λ) = e^(−λ) Σ_{k=0}^∞ λ^k/k! = e^(−λ) e^λ = 1.
Note that (2.14) gives rise to a mass function satisfying (2.6) since, by the binomial theorem, Σ_{k=0}^n C(n, k) p^k q^(n−k) = (p + q)^n = 1.
Binomial distribution. We say that X has the binomial distribution with parameters n and p if X takes values in {0, 1, . . . , n} and P(X = k) = C(n, k) p^k q^(n−k) for k = 0, 1, 2, . . . , n. (2.14)
Coin tosses are the building blocks of probability theory. There is a sense in which the entire theory can be constructed from an infinite sequence of coin tosses.
p_X(0) = q, p_X(1) = p, p_X(x) = 0 if x ≠ 0, 1.
2.2 Examples Certain types of discrete random variables occur frequently, and we list some of these. Throughout this section, n is a positive integer, p is a number in [0, 1], and q = 1 − p. We never describe the underlying probability space. Bernoulli distribution. This is the simplest non-trivial
Exercise 2.12 For what value of c is the function p, defined by p(k) = c/(k(k + 1)) if k = 1, 2, . . . , and 0 otherwise, a mass function?
Exercise 2.11 Let (Ω, F, P) be a probability space in which Ω = {1, 2, 3, 4, 5, 6}, F = {∅, {2, 4, 6}, {1, 3, 5}, Ω}, and let U, V, W be functions on Ω defined by U(ω) = ω; V(ω) = 1 if ω is even, 0 if ω is odd; W(ω) = ω^2, for ω ∈ Ω. Determine which of U, V, W are discrete random variables on the probability space.
Exercise 2.10 If E is an event of the probability space (Ω, F, P), show that the indicator function of E, defined to be the function 1_E on Ω given by 1_E(ω) = 1 if ω ∈ E, and 0 if ω ∉ E, is a discrete random variable.
Exercise 2.9 Show that if F is the power set of Ω, then all functions which map Ω into a countable subset of R are discrete random variables.
Exercise 2.8 If X and Y are discrete random variables on the probability space (Ω, F, P), show that U and V are discrete random variables on this space also, where U(ω) = X(ω) + Y(ω), V(ω) = X(ω)Y(ω), for ω ∈ Ω.
19. There are n socks in a drawer, three of which are red and the rest black. John chooses his socks by selecting two at random from the drawer, and puts them on. He is three times more likely to wear socks of different colours than to wear matching red socks. Find n. For this value of n, what is
* 18. Show that the axiom that P is countably additive is equivalent to the axiom that P is finitely additive and continuous. That is to say, let Ω be a set and F an event space of subsets of Ω. If P is a mapping from F into [0, 1] satisfying (i) P(Ω) = 1, P(∅) = 0, (ii) if A, B ∈ F and A ∩ B = … P(∪_i A_i) = Σ_i P(A_i) for all sequences A_1, A_2, . . . of disjoint events.
17. A coin is tossed repeatedly; on each toss a head is shown with probability p, or a tail with probability 1 − p. The outcomes of the tosses are independent. Let E denote the event that the first run of r successive heads occurs earlier than the first run of s successive tails. Let A denote the
15. Two identical decks of cards, each containing N cards, are shuffled randomly. We say that a k-matching occurs if the two decks agree in exactly k places. Show that the probability that there is a k-matching is π_k = (1/k!)(1 − 1/1! + 1/2! − 1/3! + · · · + (−1)^(N−k)/(N − k)!).
The following morning, the porter was rebuked by the Bursar, so that in the evening she was careful to hang only one key on each hook. But she still only managed to hang them independently and at random. Find an expression for the probability that no key was then hung on its own hook. Find the
(b) One evening, a bemused lodge-porter tried to hang n keys on their n hooks, but only managed to hang them independently and at random. There was no limit to the number of keys which could be hung on any hook. Otherwise, or by using (a), find an expression for the probability that at least one
Find two other relations for s_n and m_n in terms of c_(n−1), s_(n−1), and m_(n−1), and hence find c_n, s_n, and m_n. (Oxford 1974M) 14. (a) Let P(A) denote the probability of the occurrence of an event A. Prove carefully, for events A_1, A_2, . . . , A_n, that P(∪_{i=1}^n A_i) = Σ_i P(A_i) − Σ_{i<j} …
Let c_n be the probability that a particular corner site is occupied after n such independent moves, and let the corresponding probabilities for an intermediate site at the side of the board and for a site in the middle of the board be s_n and m_n, respectively. Show that 4c_n + 8s_n + 4m_n = 1, n = 0,
13. A square board is divided into 16 equal squares by lines drawn parallel to its sides. A counter is placed at random on one of these squares and is then moved n times. At each of these moves, it can be transferred to any neighbouring square, horizontally, vertically, or diagonally, all such
Use this difference equation to show that u_8 = 208/256. * 12. Any number ω ∈ [0, 1] has a decimal expansion ω = 0.x_1 x_2 . . . , and we write f_k(ω, n) for the proportion of times that the integer k appears in the first n digits in this expansion. We call ω a normal number if f_k(ω, n) → 1/10
11. Show that if u_n is the probability that n tosses of a fair coin contain no run of 4 heads, then for n ≥ 4, u_n = (1/2)u_(n−1) + (1/4)u_(n−2) + (1/8)u_(n−3) + (1/16)u_(n−4).
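The recurrence in problem 11 can be iterated exactly with rationals. A sketch: the initial values u_0 = u_1 = u_2 = u_3 = 1 are an assumption of mine (fewer than 4 tosses cannot contain a run of 4 heads), and the iteration reproduces the value u_8 = 208/256 quoted in problem 12's lead-in.

```python
# Iterate u_n = (1/2)u_{n-1} + (1/4)u_{n-2} + (1/8)u_{n-3} + (1/16)u_{n-4}
# exactly, starting from u_0 = ... = u_3 = 1.
from fractions import Fraction

u = [Fraction(1)] * 4                      # u_0 .. u_3
for n in range(4, 9):
    u.append(Fraction(1, 2) * u[n - 1] + Fraction(1, 4) * u[n - 2]
             + Fraction(1, 8) * u[n - 3] + Fraction(1, 16) * u[n - 4])
```
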
10. In the circuits in Figure 1.2, each switch is closed with probability p, independently of all other switches. For each circuit, find the probability that a flow of current is possible between A and B.
Fig. 1.2 Two electrical circuits incorporating switches.
9. Two people toss a fair coin n times each. Show that the probability they throw equal numbers of heads is C(2n, n) (1/2)^(2n).
8. A fair coin is tossed 3n times. Find the probability that the number of heads is twice the number of tails. Expand your answer using Stirling’s formula.
7. A single card is removed at random from a deck of 52 cards. From the remainder we draw two cards at random and find that they are both spades. What is the probability that the first card removed was also a spade?
6. Urn I contains 4 white and 3 black balls, and Urn II contains 3 white and 7 black balls. An urn is selected at random, and a ball is picked from it. What is the probability that this ball is black? If this ball is white, what is the probability that Urn I was selected?
5. Two fair dice are thrown. Let A be the event that the first shows an odd number, B be the event that the second shows an even number, and C be the event that either both are odd or both are even. Show that A, B, C are pairwise independent but not independent.
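Problem 5 can be verified by brute force over the 36 equally likely outcomes. A sketch with exact arithmetic (the event definitions are read straight from the problem statement):

```python
# A: first die odd; B: second die even; C: both odd or both even.
# Pairwise independent, but not independent.
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    return Fraction(sum(1 for w in outcomes if event(w)), len(outcomes))

A = lambda w: w[0] % 2 == 1
B = lambda w: w[1] % 2 == 0
C = lambda w: w[0] % 2 == w[1] % 2

pA, pB, pC = prob(A), prob(B), prob(C)
pAB = prob(lambda w: A(w) and B(w))
pAC = prob(lambda w: A(w) and C(w))
pBC = prob(lambda w: B(w) and C(w))
pABC = prob(lambda w: A(w) and B(w) and C(w))
```

Note that A, B, and C cannot all occur at once: first odd and second even contradicts "same parity", which is what breaks full independence.
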
This is sometimes called Bonferroni’s inequality, but the term is not recommended since it has multiple uses.
1. A fair die is thrown n times. Show that the probability that there are an even number of sixes is (1/2)(1 + (2/3)^n). For the purpose of this question, 0 is an even number. 2. Does there exist an event space containing just six events? 3. Prove Boole's inequality: P(∪_{i=1}^n A_i) ≤ Σ_{i=1}^n …
The following comparison of surgical procedures is taken from Charig et al. (1986). Two treatments are considered for kidney stones, namely open surgery (abbreviated to OS) and percutaneous nephrolithotomy (PN). It is reported that OS has a success rate of 78% (= 273/350) and PN a success rate of
A coin is tossed 2n times. What is the probability of exactly n heads? How does your answer behave for large n? Solution The sample space is the set of possible outcomes. It has 2^(2n) elements, each of which is equally likely. There are C(2n, n) ways to throw exactly n heads. Therefore, the answer is
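A numerical companion to this worked example, a sketch of mine: the answer is C(2n, n)/2^(2n), and by Stirling's formula this behaves like 1/√(πn) for large n.

```python
# P(exactly n heads in 2n tosses) = C(2n, n) / 4^n, compared with the
# Stirling-formula asymptotic 1 / sqrt(pi * n).
from math import comb, pi, sqrt

def p_equal(n):
    return comb(2 * n, n) / 4**n

n = 100
exact = p_equal(n)
stirling = 1 / sqrt(pi * n)
```
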
(e) If you choose first, what is the probability that you survive, given that your sister survives?
(d) Is it in your best interests to persuade your sister to choose first?
(c) If you choose first and die, what is the probability that your sister survives?
(b) If you choose first and survive, what is the probability that your sister survives?
(a) If you choose before your sister, what is the probability that you will survive?
You are travelling on a train with your sister. Neither of you has a valid ticket, and the inspector has caught you both. He is authorized to administer a special punishment for this offence. He holds a box containing nine apparently identical chocolates, three of which are contaminated with a
A biased coin shows heads with probability p = 1 − q whenever it is tossed. Let u_n be the probability that, in n tosses, no two heads occur successively. Show that, for n ≥ 1, u_(n+2) = q u_(n+1) + pq u_n, and find u_n by the usual method (described in Appendix B) when p = 2/3.
Here are two routine problems about balls in urns. You are presented with two urns. Urn I contains 3 white and 4 black balls, and Urn II contains 2 white and 6 black balls. (a) You pick a ball randomly from Urn I and place it in Urn II. Next you pick a ball randomly from Urn II. What is the …
… 1/2, and you pick a ball at random from the chosen urn. Given the ball is black, what is the probability you picked Urn I?