Probability: An Introduction, 2nd Edition, Geoffrey Grimmett and Dominic Welsh - Solutions
Exercise 6.54 Let X and Y be independent random variables, each having the normal distribution with mean μ and variance σ². Find the joint density function of U = X − Y and V = X + Y. Are U and V independent?
Suppose that X and Y have joint density function f(x, y) = c·e^{−x−y} if 0 < x < y, and 0 otherwise. Find the value of the constant c and the joint distribution function of X and Y.
Exercise 6.26 Random variables X and Y have joint density function f(x, y) = e^{−x−y} if x, y > 0, and 0 otherwise.
Exercise 6.25 Random variables X and Y have joint density function f(x, y) = c(x² + ½xy) if 0 < x < 1 and 0 < y < 2, and 0 otherwise.
14. The random variable X is uniformly distributed on the interval [0, 1]. Find the distribution and probability density function of Y, where Y = 3X/(1 − X). (Cambridge)
* 13. A unit stick is broken at n random places, each uniform on [0, 1], and different breaks are chosen independently. Show that the resulting n + 1 substicks can form a closed polygon with probability 1 − (n + 1)/2^n.
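The claimed probability can be checked by simulation. The sketch below uses the standard reduction (assumed here, not stated in the problem) that the n + 1 pieces close into a polygon exactly when the longest piece is shorter than 1/2:

```python
import random

def can_form_polygon(n, rng):
    # Break the unit stick at n uniform points; the n + 1 pieces
    # close into a polygon iff the longest piece is shorter than 1/2.
    cuts = sorted(rng.random() for _ in range(n))
    points = [0.0] + cuts + [1.0]
    pieces = [b - a for a, b in zip(points, points[1:])]
    return max(pieces) < 0.5

rng = random.Random(0)
n, trials = 2, 200_000
p_hat = sum(can_form_polygon(n, rng) for _ in range(trials)) / trials
p_theory = 1 - (n + 1) / 2 ** n   # 0.25 for n = 2, the classic triangle case
```

For n = 2 this recovers the well-known result that three random pieces form a triangle with probability 1/4.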
* 12. Buffon–Laplace needle. Let a, b > 0. The Cartesian plane is ruled with two sets of parallel lines of the form x = ma and y = nb for integers m and n. A needle of length ℓ (< min{a, b}) is dropped at random. Show that the probability it intersects some line is ℓ(2a + 2b − ℓ)/(πab).
11. William Tell is a very bad shot. In practice, he places a small green apple on top of a straight wall which stretches to infinity in both directions. He then takes up position at a distance of one perch from the apple, so that his line of sight to the target is perpendicular to the wall. He now
10. Let X have the exponential distribution with parameter 1. Find the density function of Y = (X − 2)/(X + 1).
9. The random variable X′ is said to be obtained from the random variable X by truncation at the point a if X′(ω) = X(ω) when X(ω) ≤ a, and X′(ω) = a when X(ω) > a. Express the distribution function of X′ in terms of the distribution function of X.
8. Use the result of Problem 5.8.7 to show that E(g(X)) = ∫_{−∞}^{∞} g(x) f_X(x) dx whenever X and g(X) are continuous random variables and g : R → [0, ∞).
* 6. Let F be a distribution function, and let X be a random variable which is uniformly distributed on the interval (0, 1). Let F^{−1} be the inverse function of F, defined by F^{−1}(y) = inf{x : F(x) ≥ y}. Show that the random variable Y = F^{−1}(X) has distribution function F.
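The principle behind this problem — that F^{−1}(X) has distribution F when X is uniform on (0, 1) — is the basis of inverse transform sampling. A numerical sketch, using the exponential distribution with an assumed rate lam = 2 as the target F:

```python
import math
import random

# Inverse transform sampling: if X is uniform on (0, 1) and
# F^{-1}(y) = inf{x : F(x) >= y}, then F^{-1}(X) has distribution F.
# For the exponential distribution, F(x) = 1 - e^{-lam*x}, so
# F^{-1}(y) = -log(1 - y) / lam.

def exp_inverse_cdf(y, lam):
    return -math.log(1.0 - y) / lam

rng = random.Random(1)
lam = 2.0
samples = [exp_inverse_cdf(rng.random(), lam) for _ in range(100_000)]
mean = sum(samples) / len(samples)        # exponential mean is 1/lam = 0.5
below_one = sum(s <= 1.0 for s in samples) / len(samples)  # F(1) = 1 - e^{-2}
```

The sample mean should sit near 1/lam and the empirical CDF at 1 near 1 − e^{−2}, as the problem's result predicts.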
5. Let X be a random variable whose distribution function F is a continuous function. Show that the random variable Y, defined by Y = F(X), is uniformly distributed on the interval (0, 1).
4. If X has the normal distribution with mean 0 and variance 1, find the density function of Y = |X|, and find the mean and variance of Y.
3. The random variable X has density function proportional to g(x), where g is a function satisfying g(x) = |x|^{−n} if |x| ≥ 1, and 0 otherwise, and n (≥ 2) is an integer. Find and sketch the density function of X, and determine the values of n for which both the mean and variance of X exist.
2. Let X be a random variable with the Poisson distribution, parameter λ. Show that, for w = 1, 2, 3, . . . , P(X ≥ w) = P(Y ≤ λ), where Y is a random variable having the gamma distribution with parameters w and 1.
1. The bilateral (or double) exponential distribution has density function f(x) = ½·c·e^{−c|x|} for x ∈ R, where c (> 0) is a parameter of the distribution. Show that the mean and variance of this distribution are 0 and 2c^{−2}, respectively.
Exercise 5.68 The random variable X has density function f(x) = c·x(1 − x) for 0 ≤ x ≤ 1. Determine c,
Distribution functions and density functions
Exercise 5.67 Show that a random variable with density function f(x) = 1/(π√(x(1 − x))) if 0 < x < 1, and 0 otherwise, has mean ½.
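The mean here can be verified numerically. Substituting x = sin²θ turns E(X) = ∫₀¹ x/(π√(x(1 − x))) dx into the smooth integral ∫₀^{π/2} (2/π) sin²θ dθ, which a simple midpoint rule handles:

```python
import math

# Mean of the arcsine density f(x) = 1/(pi*sqrt(x(1-x))) on (0, 1).
# The substitution x = sin^2(theta) removes the endpoint singularities:
# E(X) = integral over (0, pi/2) of (2/pi) * sin(theta)^2 d(theta) = 1/2.
def arcsine_mean(steps=10_000):
    h = (math.pi / 2) / steps
    return sum((2 / math.pi) * math.sin((i + 0.5) * h) ** 2 * h
               for i in range(steps))
```

The numerical value agrees with the exercise's claim of ½ to high precision.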
∫_{−M}^{N} x/(π(1 + x²)) dx = (1/(2π)) log((1 + N²)/(1 + M²)) = l(M, N), say, and the limit of l(M, N) as M, N → ∞ depends on the way in which M and N approach ∞. If M → ∞ and N → ∞ in that order, then l(M, N) → −∞, while if the limit is taken in the other order, then l(M, N) → ∞. Hence the Cauchy distribution does not have a mean.
Exercise 5.54 Let X be a random variable with the exponential distribution, parameter λ. Find the density function of (a) A = 2X + 5, (b) B = e^X, (c) C = (1 + X)^{−1}, (d) D = (1 + X)^{−2}.
Exercise 5.55 Show that if X has the normal distribution with parameters 0 and 1, then Y = X² has the χ²₁ distribution.
f_Y(y) = −f_X(g^{−1}(y)) · (d/dy)[g^{−1}(y)] for y ∈ R. (5.52)
Formulae (5.51) and (5.52) rely heavily on the monotonicity of g. Other cases are best treated on their own merits, and actually there is a lot to be said for always using the method of the next example rather than taking recourse in the general results.
5.5 Functions of random variables
Let X be a random variable on (Ω, F, P) and suppose that g : R → R. Then Y = g(X) is a mapping from Ω into R, defined by Y(ω) = g[X(ω)] for ω ∈ Ω. Actually, Y is not generally a random variable, since it need not satisfy condition (5.1). It turns out,
Exercise 5.48 Show that the density function f(x) = 1/(π√(x(1 − x))) if 0 < x < 1, and 0 otherwise, has distribution function of the form F(x) = c·sin^{−1}(√x) if 0 < x < 1, and find the constant c.
Exercise 5.46 Show that the gamma function Γ(w) satisfies Γ(w) = (w − 1)Γ(w − 1) for w > 1, and deduce that Γ(n) = (n − 1)! for n = 1, 2, 3, . . . .
Exercise 5.47 Let I = ∫_{−∞}^{∞} e^{−x²} dx. By changing variables to polar coordinates, show that I² = ∫∫_{R²} e^{−x²−y²} dx dy = ∫_{θ=0}^{2π} ∫_{r=0}^{∞} r·e^{−r²} dr dθ = π.
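The conclusion of Exercise 5.47, I = √π, is easy to sanity-check numerically; a midpoint rule on a truncated range suffices, since the tail beyond |x| = 10 is negligible:

```python
import math

# Numerical check that I = integral of e^{-x^2} over R equals sqrt(pi),
# using the midpoint rule on [-10, 10] (the tail contribution is ~e^{-100}).
def gaussian_integral(a=-10.0, b=10.0, steps=100_000):
    h = (b - a) / steps
    return sum(math.exp(-(a + (i + 0.5) * h) ** 2) * h for i in range(steps))
```

The result matches math.sqrt(math.pi) to better than single-float rounding in this setting.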
Exercise 5.45 For what values of its parameters is the gamma distribution also an exponential distribution?
An outline of the proof of this may be found in Exercise 5.47. The constant terms in (5.36)–(5.44) have been chosen solely so that the resulting functions integrate to 1. For example, it is clear that the function g(x) = 1/(1 + x²) for x ∈ R is not a density function, since ∫_{−∞}^{∞} g(x) dx = π,
It is not always a trivial task to show that these functions are actually density functions. The condition (5.34) of non-negativity is no problem, but some care is required in checking that the functions integrate to 1. For example, to check this for the function given in (5.38) we require the
A comparison of (5.44) with (5.40) shows that the χ²_n distribution is the same as the gamma distribution with parameters ½n and ½, but we list the distribution separately here because of its common occurrence in statistics.
The above list is a dull compendium of some of the commoner density functions, and we do not expect it to inspire the reader in this form. It is difficult
The Cauchy distribution has density function f(x) = 1/(π(1 + x²)) for x ∈ R. (5.39)
The gamma distribution with parameters w (> 0) and λ (> 0) has density function f(x) = (1/Γ(w))·λ^w·x^{w−1}·e^{−λx} if x > 0, and 0 if x ≤ 0, (5.40) where Γ(w) is the gamma function, defined by Γ(w) = ∫_0^∞ u^{w−1}·e^{−u} du.
The normal (or Gaussian) distribution with parameters μ and σ², sometimes written as N(μ, σ²), has density function f(x) = (1/√(2πσ²))·exp(−(x − μ)²/(2σ²)) for x ∈ R. (5.38)
The exponential distribution with parameter λ > 0 has density function f(x) = λe^{−λx} if x > 0, and 0 if x ≤ 0. (5.37)
Exercise 5.33 Find the distribution function of the so-called 'extreme value' density function f(x) = exp(−x − e^{−x}) for x ∈ R.
5.4 Some common density functions
It is fairly clear that any function f which satisfies f(x) ≥ 0 for x ∈ R (5.34) and ∫_{−∞}^{∞} f(x) dx = 1 (5.35) is the
Exercise 5.32 If X has distribution function F(x) = 1/(2(1 + x²)) for −∞ < x ≤ 0, and F(x) = (1 + 2x²)/(2(1 + x²)) for 0 < x < ∞, show that X is continuous and find its density function.
Exercise 5.31 If X has density function f(x) = ½·e^{−|x|} for x ∈ R, find the distribution function of X. This is called the bilateral (or double) exponential distribution.
There are many random variables which are neither discrete nor continuous, and we shall come across some of these later.
Exercise 5.30 A random variable X has density function f(x) = 2x if 0 < x < 1, and 0 otherwise. Find the distribution function of X.
To recap, all random variables have a distribution function. In addition, discrete random variables have a mass function, and continuous random variables have a density function.
Theorem 5.27 If X is continuous with density function f_X, then
P(X = x) = 0 for x ∈ R, (5.28)
P(a ≤ X ≤ b) = ∫_a^b f_X(u) du for a, b ∈ R with a ≤ b. (5.29)
Proof We argue as follows:
P(X = x) = lim_{ε↓0} P(x − ε < X ≤ x) = lim_{ε↓0} [F_X(x) − F_X(x − ε)] by (5.10) = lim_{ε↓0} ∫_{x−ε}^{x} f_X(u) du by (5.21) = 0.
The first equality here cannot be justified without an appeal to the continuity of P, Theorem 1.54. For the second part of the theorem, if a ≤ b, then P(a ≤ X ≤ b) = P(a < X ≤ b) by (5.28) = F_X(b) − F_X(a) by (5.10) = ∫_a^b f_X(u) du. □
Provided that X is a continuous random variable and F_X is well behaved in (5.21), we can take f_X(x) = (d/dx)F_X(x) if this derivative exists at x, and 0 otherwise, (5.23) as the density function of X. We shall normally do this, although we should point out that there are some difficulties in general; (5.23) is adequate, and the reader of a text at this level should seldom get into trouble if he or she uses (5.23) when finding density functions of continuous random variables.
Density functions serve continuous random variables in very much the same way as mass functions serve discrete random variables, and it is not surprising that the general properties of density functions and mass functions are very similar. For example, it is clear that the density function f_X of X satisfies f_X(x) ≥ 0 for x ∈ R (cf. (2.5)) and ∫_{−∞}^{∞} f_X(x) dx = 1 (cf. (2.6)), where the parentheses contain the corresponding properties of a mass function p_Y. However, this analogy can be dangerous, since f_X(x) is not a probability and may well even exceed 1 in value. On the other hand, f_X(x) is indeed a 'measure' of probability in the following sense. If δx is small and positive, then, roughly speaking, the probability that X is 'near' x satisfies
P(x < X ≤ x + δx) = F(x + δx) − F(x) by (5.10) = ∫_x^{x+δx} f_X(u) du by (5.21) ≈ f_X(x)·δx for small δx. (5.26)
So the true analogy is not between a density function f_X(x) and a mass function p_Y(x) but instead between f_X(x)·δx and p_Y(x). This is borne out by comparing (5.25) with (2.6): values of the mass function are replaced by f_X(x)·δx, and the summation (since, for discrete random variables, only
By working with the function H(s) = G_N(1 − s) or otherwise, deduce that N has a Poisson distribution. You may assume that (1 + x/n + o(n^{−1}))^n → e^x as n → ∞. (Cambridge 2002)
Example 5.22 If X has the exponential distribution with parameter λ, then F_X(x) = 0 if x ≤ 0, and 1 − e^{−λx} if x > 0.
(c) Let p = ½ and suppose F and S are independent. [You are given nothing about the distribution of N.] Show that G_N(s) = G_N(½(1 + s))².
(b) Suppose N has the Poisson distribution with parameter μ. Show that F has the Poisson distribution with parameter μp, and that F and S are independent.
(a) Show that G_F(s) = G_N(ps + 1 − p). [You should present a clear statement of any general result used.]
11. There is a random number N of foreign objects in my soup, with mean μ and finite variance. Each object is a fly with probability p, and otherwise a spider; different objects have independent types. Let F be the number of flies and S the number of spiders.
10. Define the mean value of a discrete random variable and the probability generating function φ. Show that the mean value is φ′(1). If φ(s) has the form p(s)/q(s), show that the mean value is (p′(1) − q′(1))/q(1). Two duellists, A and B, fire at each other in turn until one hits the other.
9. Coupon-collecting problem. Each packet of a certain breakfast cereal contains one token, coloured either red, blue, or green. The coloured tokens are distributed randomly among the packets, each colour being equally likely. Let X be the random variable which takes the value j when I find my first full set, of at least one token of each colour, on opening my jth packet. More generally, suppose that there are tokens of m different colours, all equally likely. Let Y be the random variable which takes the value j when I first obtain a full set, of at least one token of each colour, when I open my jth packet. Find the generating function of Y, and show that its
* 8. A fair coin is tossed a random number N of times, giving a total of X heads and Y tails. You showed in Problem 3.6.14 that X and Y are independent if N has the Poisson distribution. Use generating functions to show that the converse is valid too: if X and Y are independent and the generating
7. Let X and Y be independent random variables having Poisson distributions with parameters λ and μ, respectively. Prove that X + Y has a Poisson distribution and that var(X + Y) = var(X) + var(Y). Find the conditional probability P(X = k | X + Y = n) for 0 ≤ k ≤ n, and hence show that the
6. An unfair coin is tossed n times, each outcome is independent of all the others, and on each toss a head is shown with probability p. The total number of heads shown is X. Use the probability generating function of X to find
(a) the mean and variance of X,
(b) the probability that X is even,
(c) the probability that X is divisible by 3. (Oxford 1980M)
5. Each year a tree of a particular type flowers once, and the probability that it has n flowers is (1 − p)p^n, n = 0, 1, 2, . . . , where 0 < p < 1. Each flower has probability ½ of producing a ripe fruit, independently of all other flowers. Find the probability that in a given year
(a) the tree produces r ripe fruits,
(b) the tree had n flowers, given that it produces r ripe fruits. (Oxford 1982M)
4. A player undertakes trials, and the probability of success at each trial is p. A turn consists of a sequence of trials up to the first failure. Obtain the probability generating function for the total number of successes in N turns. Show that the mean of this distribution is Np(1 − p)^{−1} and
3. Three players, Alan, Bob, and Cindy, throw a perfect die in turn independently in the order A, B, C, A, . . . until one wins by throwing a 5 or a 6. Show that the probability generating function F(s) for the random variable X, which takes the value r if the game ends on the rth throw, can be
2. A symmetrical die is thrown independently seven times. What is the probability that the total number of points obtained is 14? (Oxford 1974M)
1. Let X have probability generating function G_X(s) and let u_n = P(X > n). Show that the generating function U(s) of the sequence u_0, u_1, . . . satisfies (1 − s)U(s) = 1 − G_X(s), whenever the series defining these generating functions converge.
16. I throw two dice and record the scores S1 and S2. Let X be the sum S1 + S2 and Y the difference S1 − S2.
(a) Suppose the dice are fair, so that the values 1, 2, . . . , 6 are equally likely. Calculate the mean and variance of both X and Y. Find all the values of x and y at which the
15. Let (Zn : 1 ≤ n < ∞) be a sequence of independent, identically distributed random variables with P(Zn = 0) = q and P(Zn = 1) = p, where p + q = 1. Let Ai be the event that Zi = 0 and Zi−1 = 1. If Un is the number of times Ai occurs for 2 ≤ i ≤ n, prove that E(Un) = (n − 1)pq, and find the variance of Un. (Oxford 1977F)
14. Each time you flip a certain coin, heads appears with probability p. Suppose that you flip the coin a random number N of times, where N has the Poisson distribution with parameter λ and is independent of the outcomes of the flips. Find the distributions of the numbers X and Y of resulting
13. In Problem 3.6.12 above, find the expected number of different types of coupon in the first n coupons received.
12. Coupon-collecting problem. There are c different types of coupon, and each coupon obtained is equally likely to be any one of the c types. Let Yi be the additional number of coupons collected, after obtaining i distinct types, before a new type is collected. Show that Yi has the geometric distribution with parameter (c − i)/c.
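If, as the truncated statement suggests, each Yi is geometric with success probability (c − i)/c (an assumption made explicit here), then E(Yi) = c/(c − i), and summing over i = 0, . . . , c − 1 gives the expected total number of coupons c·H_c. A small exact computation under that parametrisation:

```python
from fractions import Fraction

# Assuming Y_i is geometric with success probability (c - i)/c,
# its mean is c/(c - i); summing over i = 0, ..., c-1 gives c * H_c,
# the expected number of coupons needed to collect all c types.
def expected_coupons(c):
    return sum(Fraction(c, c - i) for i in range(c))
```

For c = 3 colours this gives 3(1 + 1/2 + 1/3) = 11/2 packets on average.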
9. The random variables U and V each take the values ±1. Their joint distribution is given by P(U = +1) = P(U = −1) = ½, P(V = +1 | U = 1) = ⅓ = P(V = −1 | U = −1), P(V = −1 | U = 1) = ⅔ = P(V = +1 | U = −1).
(a) Find the probability that x² + Ux + V = 0 has at least one real root.
(b) Find the expected value of the larger root, given that there is at least one real root.
(c) Find the probability that x² + (U + V)x + U + V = 0 has at least one real root. (Oxford 1980M)
10. A number N of balls are thrown at random into M boxes, with multiple occupancy permitted. Show that the expected number of empty boxes is (M − 1)^N / M^{N−1}.
11. We are provided with a coin which comes up heads with probability p at each toss. Let v1, v2, . . . , vn be n distinct points on a unit circle. We examine each unordered pair vi, vj in turn and toss the coin; if it comes up heads, we join vi and vj by an edge. Show that
(a) the expected number of edges in the random graph is ½·n(n − 1)p,
(b) the expected number of triangles (triples of points each pair of which is joined by an edge) is (1/6)·n(n − 1)(n − 2)p³.
8. Let X1, X2, . . . be independent, identically distributed random variables, and Sn = X1 + X2 + · · · + Xn. Show that E(Sm/Sn) = m/n if m ≤ n, and E(Sm/Sn) = 1 + (m − n)μE(1/Sn) if m > n, where μ = E(X1). You may assume that all the expectations are finite.
7. Let X1, X2, . . . be discrete random variables, each having mean μ, and let N be a random variable which takes values in the non-negative integers and which is independent of the Xi. By conditioning on the value of N, show that E(X1 + X2 + · · · + XN) = μE(N).
6. Hugo’s bowl of spaghetti contains n strands. He selects two ends at random and joins them. He does this until no ends are left. What is the expected number of spaghetti hoops in his bowl?
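One way to see the answer to the spaghetti problem: when k strands remain there are 2k free ends, and the second end chosen completes a hoop exactly when it is the other end of the first's own strand, i.e. with probability 1/(2k − 1); each join reduces the strand count by one, so E(hoops) = Σ_{k=1}^{n} 1/(2k − 1). A simulation sketch of that step process:

```python
import random

# Each join: with k strands left there are 2k free ends, and the second
# end chosen matches the first's own strand with probability 1/(2k - 1),
# closing a hoop; either way, one strand disappears.
def spaghetti_hoops(n, rng):
    hoops = 0
    for k in range(n, 0, -1):
        if rng.random() < 1.0 / (2 * k - 1):
            hoops += 1
    return hoops

rng = random.Random(2)
n, trials = 5, 100_000
avg = sum(spaghetti_hoops(n, rng) for _ in range(trials)) / trials
exact = sum(1.0 / (2 * k - 1) for k in range(1, n + 1))   # about 1.787 for n = 5
```

The simulated average agrees with the harmonic-type sum, which grows only like ½ log n even for very large bowls.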
5. Let X and Y be independent discrete random variables, X having the geometric distribution with parameter p and Y having the geometric distribution with parameter r . Show that U = min{X, Y } has the geometric distribution with parameter p + r − pr .
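The result about min{X, Y} follows from tail probabilities: P(min{X, Y} > k) = P(X > k)P(Y > k) = ((1 − p)(1 − r))^k, which is the tail of a geometric distribution with parameter 1 − (1 − p)(1 − r) = p + r − pr. A deterministic check of that identity:

```python
# Tail of a geometric distribution (number of trials up to and including
# the first success) with parameter p: P(X > k) = (1 - p)^k.
def geom_tail(p, k):
    return (1 - p) ** k

p, r = 0.3, 0.5
for k in range(6):
    lhs = geom_tail(p, k) * geom_tail(r, k)   # P(X > k) * P(Y > k)
    rhs = geom_tail(p + r - p * r, k)         # geometric(p + r - pr) tail
    assert abs(lhs - rhs) < 1e-12
```

Matching tails for all k pins down the distribution, since a geometric law is determined by its tail probabilities.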
4. Let X1, X2, . . . , Xn be independent discrete random variables, each having mass function P(Xi = k) = 1/N for k = 1, 2, . . . , N. Find the mass functions of Un and Vn, given by Un = min{X1, X2, . . . , Xn} and Vn = max{X1, X2, . . . , Xn}.
3. If X and Y are discrete random variables, each taking only two distinct values, prove that X and Y are independent if and only if E(XY ) = E(X)E(Y ).
2. Independent random variables U and V each take the values −1 or 1 only, and P(U = 1) = a, P(V = 1) = b, where 0
P(Ai). This is, however, harder to prove. See the footnote on p. 40.
Example 3.37 The 2n seats around a circular table are numbered clockwise. The guests at dinner form n king/queen pairs. The queens sit at random in the odd-numbered seats, with the kings at random between them. Let N be the number of queens sitting next to their king. Find the mean and variance of N.
N = Σ_i 1_{Ai}. (3.39) Now 1²_{Ai} = 1_{Ai}, since an indicator function takes only the values 0 and 1, and also 1_{Ai}·1_{Aj} = 1_{Ai∩Aj}. Therefore, by symmetry, E(N²) = Σ_i E(1_{Ai}) + 2 Σ_{i<j} E(1_{Ai}·1_{Aj}), where the two terms correspond to whether or not the second queen sits next to the first couple. By (3.39)–(3.41), E(N²) = 2 + n(n − 1)·2(2n − 3)/(n(n − 1)²), and hence var(N) = E(N²) − E(N)² = 2(n − 2)/(n − 1). △
Exercise 3.42 Let N be the number of the events A1, A2, . . . , An which
Exercise 3.25 If X and Y are independent discrete random variables, show that the two random variables g(X) and h(Y ) are independent also, for any functions g and h which map R into R.
Exercise 3.23 Let X and Y be independent discrete random variables. Prove that P(X ≥ x and Y ≥ y) = P(X ≥ x)P(Y ≥ y) for all x, y ∈ R.
Exercise 3.24 The indicator function of an event A is the function 1_A defined by 1_A(ω) = 1 if ω ∈ A, and 0 if ω ∉ A. Show that two events A and B are independent if and only if their indicator functions are independent random variables.
It is easy to find a probability space (Ω, F, P), together with two random variables having these distributions. For example, take Ω = {−1, 0, 1}, F the set of all subsets of Ω, P given by P({−1}) = P({0}) = P({1}) = 1/3, and X(ω) = ω, Y(ω) = |ω|. Then X and Y are dependent since P(X = 0, Y = 1)