Probability: An Introduction, 2nd Edition, Geoffrey Grimmett and Dominic Welsh - Solutions
24. (a) Let X be an exponential random variable with parameter λ. Find the moment generating function of X, and hence find E(X³).
(b) Let X1 and X2 be independent random variables with moment generating functions M1(t) and M2(t). Find random variables with the following moment generating functions: (i) e^{bt}M1(at), …
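As a sketch of the computation behind part (a) (added here for orientation, not part of the book's text):

```latex
M_X(t) = \int_0^\infty e^{tx}\,\lambda e^{-\lambda x}\,dx = \frac{\lambda}{\lambda-t}
\quad (t<\lambda), \qquad
E(X^3) = M_X'''(0) = \left.\frac{6\lambda}{(\lambda-t)^4}\right|_{t=0} = \frac{6}{\lambda^3}.
```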
23. Define the moment generating function of a random variable X. If X and Y are independent random variables with moment generating functions MX(t) and MY(t), respectively, find the moment generating function of X + Y. For n = 1, 2, . . . , let Xn have probability density function fn(x) = (1/n)… Find the moment generating function of Xn. Let Y1, Y2, . . . , Yn be independent random variables, each having the same distribution as X1. Find the moment generating function of Σ_{i=1}^n Yi, and deduce its distribution. (Oxford 2005)
22. (a) Suppose that f(x) = x^m, where m is a positive integer, and X is a random variable taking values x1, x2, . . . , xN ≥ 0 with equal probabilities, and where the sum x1 + x2 + · · · + xN = 1. Deduce from Jensen's inequality that Σ_{i=1}^N f(xi) ≥ N f(1/N).
(b) There are N horses that compete in m races. The results of different races are independent. The probability of horse i winning any given race is pi ≥ 0, with p1 + p2 + · · · + pN = 1. Let Q be the probability that the same horse wins all m races. Express Q as a polynomial of degree m in the variables p1, p2, . . . , pN. …
21. Let X1, X2, . . . , Xn be independent random variables, each with characteristic function φ(t). Obtain the characteristic function of Yn = an + bn(X1 + X2 + · · · + Xn), where an and bn are arbitrary real numbers. Suppose that φ(t) = e^{−|t|^α}, where 0 < α ≤ 2. Determine an and bn such that Yn has the same distribution as X1 for n = 1, 2, . . . . Find the probability density functions of X1 when α = 1 and when α = 2. (Oxford 1980F)
20. Let X be a random variable whose moment generating function M(t) exists for |t| < h, where h > 0. Let N be a random variable taking positive integer values such that P(N = k) > 0 for k = 1, 2, . . . . Show that M(t) = Σ_{k=1}^∞ P(N = k) E(e^{tX} | N = k) for |t| < h.
Let X = max{U1, U2, . . . , UN}, where the Ui are independent random variables uniformly distributed on (0, 1) and N is an independent random variable whose distribution is given by P(N = k) = 1/((e − 1)k!) for k = 1, 2, . . . .
Obtain the moment generating function of X and hence show that if R is another independent random variable with P(R = r) = (e − 1)e^{−r} for r = 1, 2, . . . , then R − X is exponentially distributed. (Oxford 1981F)
19. Show that φ(t) = exp(−|t|^α) can be the characteristic function of a distribution with finite variance if and only if α = 2.
18. Let X1, X2, . . . be independent random variables each having the Cauchy distribution, and let An = (1/n)(X1 + X2 + · · · + Xn). Show that An has the Cauchy distribution regardless of the value of n.
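A hedged Monte Carlo sketch of what problem 18 asserts (numpy assumed; not from the book): the empirical quartiles of An should stay near ±1, the quartiles of the standard Cauchy, rather than shrinking as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)
for n in (1, 10, 100):
    # An = (X1 + ... + Xn)/n for standard Cauchy samples
    A = rng.standard_cauchy((20_000, n)).mean(axis=1)
    q25, q75 = np.quantile(A, [0.25, 0.75])
    print(f"n={n:3d}  empirical quartiles ~ ({q25:+.2f}, {q75:+.2f})")  # ~ (-1, +1)
```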
17. Find the characteristic function of a random variable with density function f(x) = ½e^{−|x|} for x ∈ R.
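For orientation, the computation behind problem 17 runs as follows (a sketch added here, not part of the original text). By the inversion theorem, the same calculation shows that the Cauchy density has characteristic function e^{−|t|}, which connects problems 17, 18, and 19.

```latex
\phi(t) = \int_{-\infty}^{\infty} e^{itx}\,\tfrac12 e^{-|x|}\,dx
        = \tfrac12\left(\frac{1}{1-it}+\frac{1}{1+it}\right)
        = \frac{1}{1+t^2}.
```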
16. Show that X and −X have the same distribution if and only if φX is a purely real-valued function.
15. Prove that if φ1 and φ2 are characteristic functions, then so is φ = αφ1 + (1 − α)φ2 for any α ∈ R satisfying 0 ≤ α ≤ 1.
14. Coupon-collecting problem. There are c different types of coupon, and each coupon obtained is equally likely to be any one of the c types. Find the moment generating function of the total number N of coupons which you must collect in order to obtain a complete set.
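A hedged simulation sketch for problem 14 (numpy assumed; not from the book): N decomposes as a sum of independent geometric waiting times with success probabilities (c − k)/c, so its moment generating function is the product of the corresponding geometric mgfs, and E(N) = c(1 + 1/2 + · · · + 1/c).

```python
import numpy as np

rng = np.random.default_rng(1)
c, trials = 10, 50_000
# waiting time for the (k+1)-st new type is Geometric((c - k)/c)
draws = rng.geometric(p=[(c - k) / c for k in range(c)], size=(trials, c))
print(draws.sum(axis=1).mean())                 # simulated E(N) ~ 29.3
print(c * sum(1 / k for k in range(1, c + 1)))  # exact c*H_c = 29.289...
```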
13. Let X have moment generating function M(t).
(a) Show that M(t)M(−t) is the moment generating function of X − Y, where Y is independent of X but has the same distribution.
(b) In a similar way, describe random variables which have moment generating functions 1/(2 − M(t)) and ∫_0^∞ M(ut)e^{−u} du.
11. The joint moment generating function of two random variables X and Y is defined to be the function M(s, t) of two real variables given by M(s, t) = E(e^{sX+tY}) for all values of s and t for which this expectation exists. Show that the joint moment generating function of a pair of independent random variables factorizes as M(s, t) = MX(s)MY(t). Deduce the joint moment generating function of a pair of random variables having the bivariate normal distribution (6.76) with parameters μ1, μ2, σ1, σ2, ρ.
* 12. Let X and Y be independent random variables, each having mean 0, variance 1, and finite moment generating function M(t). If X + Y …
10. Prove that if X = X1 + · · · + Xn and Y = Y1 + · · · + Yn, where Xi and Yj are independent whenever i ≠ j, then cov(X, Y) = Σ_{i=1}^n cov(Xi, Yi). (Assume that all series involved are absolutely convergent.)
Two players A and B play a series of independent games. The probability that A wins any particular game is p², that B wins is q², and that the game is a draw is 2pq, where p + q = 1. The winner of a game scores 2 points, the loser none; if a game is drawn, each player scores 1 point. Let X and Y be the number of points scored by A and B, respectively, in a series of n games. Prove that cov(X, Y) = −2npq. (Oxford 1982M)
9. Random variables X1, X2, . . . , XN have zero expectations, and E(Xm Xn) = vmn for m, n = 1, 2, . . . , N. Calculate the variance of the random variable Z = Σ_{n=1}^N an Xn, and deduce that the symmetric matrix V = (vmn) is non-negative definite. It is desired to find an N × N matrix A such that the random variables Yn = Σ_{r=1}^N Anr Xr are uncorrelated and have unit variance. Show that this will be the case if and only if AVA′ = I, and show that A can be chosen to satisfy this equation if and only if V is non-singular. (Any standard results from matrix theory may, if clearly stated, be used without proof. A′ denotes the transpose of A.)
8. Let X1, X2, . . . be independent, identically distributed random variables and let N be a random variable which takes values in the positive integers and is independent of the Xi. Find the moment generating function of S = X1 + X2 + · · · + XN in terms of the moment generating functions of N and X1.
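The standard answer is M_S(t) = E[(MX(t))^N] = MN(log MX(t)). A hedged numeric check follows; the Poisson and exponential choices are illustrative assumptions, not part of the problem (numpy assumed).

```python
import numpy as np

rng = np.random.default_rng(2)
mu, t, trials = 3.0, 0.3, 200_000
N = rng.poisson(mu, trials)
S = np.array([rng.exponential(1.0, n).sum() for n in N])  # S = X1 + ... + XN
print(np.exp(t * S).mean())            # Monte Carlo E(e^{tS})
# E[(M_X(t))^N] for Poisson N is exp(mu*(M_X(t) - 1)), with M_X(t) = 1/(1 - t)
print(np.exp(mu * (1 / (1 - t) - 1)))
```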
7. Show from the result of Problem 7.7.5 that the χ² distribution with n degrees of freedom has moment generating function M(t) = (1 − 2t)^{−n/2} if t < ½. Deduce that, if X1, X2, . . . , Xn are independent random variables having the normal distribution with mean 0 and variance 1, then Z = X1² + X2² + · · · + Xn² has the χ² distribution with n degrees of freedom. Hence or otherwise show that the sum of two independent random variables having the χ² distribution with m and n degrees of freedom, respectively, has the χ² distribution with m + n degrees of freedom.
6. Let X1, X2, . . . , Xn be independent random variables with the exponential distribution, parameter λ. Show that X1 + X2 + · · · + Xn has the gamma distribution with parameters n and λ.
5. Let X and Y be independent random variables, X having the gamma distribution with parameters s and λ, and Y having the gamma distribution with parameters t and λ. Use moment generating functions to show that X + Y has the gamma distribution with parameters s + t and λ.
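A hedged simulation sketch for problems 5 and 6 (numpy assumed; not from the book), comparing the first two moments of a sum of exponentials with those of the gamma distribution:

```python
import numpy as np

rng = np.random.default_rng(3)
n, lam, trials = 5, 2.0, 200_000
s = rng.exponential(1 / lam, (trials, n)).sum(axis=1)  # sum of n Exp(lambda)
print(s.mean(), s.var())    # simulated moments
print(n / lam, n / lam**2)  # Gamma(n, lambda): mean 2.5, variance 1.25
```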
4. Show that every distribution function has only a countable set of points of discontinuity.
3. Let X1, X2, . . . be identically distributed, independent random variables and let Sn = X1 + X2 + · · · + Xn. Show that …
2. Let X1, X2, . . . be uncorrelated random variables, each having mean μ and variance σ². If X̄ = n^{−1}(X1 + X2 + · · · + Xn), show that E[(1/(n − 1)) Σ_{i=1}^n (Xi − X̄)²] = σ². This fact is of importance in statistics and is used when estimating the population variance from knowledge of a random sample.
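A hedged simulation sketch of the fact asserted in problem 2 (numpy assumed; not from the book): dividing by n − 1 rather than n makes the sample variance unbiased.

```python
import numpy as np

rng = np.random.default_rng(4)
n, trials, sigma2 = 8, 100_000, 4.0
x = rng.normal(0.0, np.sqrt(sigma2), (trials, n))
print(x.var(axis=1, ddof=1).mean())  # ~ sigma2 = 4.0  (divides by n-1)
print(x.var(axis=1, ddof=0).mean())  # ~ sigma2*(n-1)/n = 3.5  (divides by n)
```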
1. Let X and Y be random variables with equal variance. Show that U = X − Y and V = X + Y are uncorrelated. Give an example to show that U and V need not be independent even if, further, X and Y are independent.
If X has the normal distribution with mean 0 and variance 1, then
MX(t) = ∫_{−∞}^∞ e^{tx} (1/√(2π)) e^{−x²/2} dx = e^{t²/2} ∫_{−∞}^∞ (1/√(2π)) e^{−(x−t)²/2} dx = e^{t²/2}, (7.44)
since the integrand in the latter integral is the density function of the normal distribution with mean t and variance 1, and thus has integral 1. The moment generating function MX(t) exists for all t ∈ R. △
Example 7.45 If X has the exponential distribution with parameter λ, then MX(t) …
Exercise 7.36 Let X1, X2, . . . be a sequence of uncorrelated random variables, each having variance σ². If Sn = X1 + X2 + · · · + Xn, show that cov(Sm, Sn) = var(Sm) = mσ² if m < n.
Exercise 7.37 Show that cov(X, Y) = 1 in the case when X and Y have joint density function f(x, y) = (1/y)e^{−y−x/y} if x, y > 0, and 0 otherwise.
Exercise 7.35 If X and Y have the bivariate normal distribution with parameters μ1, μ2, σ1, σ2, ρ (see (6.76)), show that cov(X, Y) = ρσ1σ2 and ρ(X, Y) = ρ.
(a) if X and Y are independent, then ρ(X, Y) = 0,
(b) Y is a linear increasing function of X if and only if ρ(X, Y) = 1,
(c) Y is a linear decreasing function of X if and only if ρ(X, Y) = −1.
If ρ(X, Y) = 0, we say that X and Y are uncorrelated.
Exercise 7.12 If X has the χ² distribution with n degrees of freedom, show that E(X^k) = 2^k Γ(k + n/2)/Γ(n/2) for k = 1, 2, . . . .
Exercise 7.11 If X has the gamma distribution with parameters w and λ, show that E(X^k) = Γ(w + k)/(λ^k Γ(w)) for k = 1, 2, . . . .
Exercise 7.10 If X is uniformly distributed on (a, b), show that E(X^k) = (b^{k+1} − a^{k+1})/((b − a)(k + 1)) for k = 1, 2, . . . .
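A hedged numeric check of the Exercise 7.10 formula (numpy assumed; the values of a, b, k are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)
a, b, k = 1.0, 3.0, 4
x = rng.uniform(a, b, 1_000_000)
print((x**k).mean())                                    # Monte Carlo E(X^k)
print((b**(k + 1) - a**(k + 1)) / ((b - a) * (k + 1)))  # exact: 242/10 = 24.2
```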
If X has the Cauchy distribution, then E(X^k) = ∫_{−∞}^∞ x^k/(π(1 + x²)) dx …
… (a) fa is a density function,
(b) f has finite moments of all orders,
(c) fa and f have equal moments of all orders, in that ∫_{−∞}^∞ x^k f(x) dx = ∫_{−∞}^∞ x^k fa(x) dx for k = 1, 2, . . . .
Thus, {fa : −1 ≤ a ≤ 1} is a collection of distinct density functions having the same moments. △
28. (a) Define the distribution function F of a random variable, and also its density function f, assuming F is differentiable. Show that f(x) = −(d/dx)P(X > x).
(b) Let U, V be independent random variables, each with the uniform distribution on [0, 1]. Show that P(V² > U > x) = 1/3 − x + (2/3)x^{3/2} for x ∈ (0, 1).
(c) What is the probability that the random quadratic equation x² + 2Vx + U = 0 has real roots?
(d) Given that the two roots R1, R2 of the above quadratic are real, what is the probability that both |R1| ≤ 1 and |R2| ≤ 1? (Cambridge 2012)
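For part (c), the discriminant condition is V² ≥ U, and P(V² ≥ U) = ∫₀¹ v² dv = 1/3, consistent with part (b) at x = 0. A hedged Monte Carlo sketch (numpy assumed; not from the book):

```python
import numpy as np

rng = np.random.default_rng(6)
u, v = rng.uniform(size=(2, 1_000_000))
print((v**2 >= u).mean())  # ~ 1/3: real roots iff 4V^2 - 4U >= 0
```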
27. (a) Suppose that the continuous random variables X and Y are independent with probability density functions f and g, both of which are symmetric about zero.
(i) Find the joint probability density function of (U, V), where U = X and V = Y/X.
(ii) Show that the marginal density function of V is fV …
(iii) Let X and Y be independent normal random variables, each with mean 0, and with non-zero variances a² and b², respectively. Show that V = Y/X has probability density function fV(v) = c/(π(c² + v²)) for −∞ < v < ∞, where c = b/a. Hence find P(|Y| < |X|).
(b) Now let X and Y be independent …
… Show that the joint probability density function of U = ½(X − Y) and V = Y is fU,V(u, v) = ½e^{−u−v} if (u, v) ∈ A, and 0 otherwise, where A is a region of the (u, v) plane to be determined. Deduce that U has probability density function fU(u) = ½e^{−|u|}, −∞ < u < ∞. (Oxford 2008)
25. Let X and Y have the bivariate normal density function f(x, y) = (1/(2π√(1 − ρ²))) exp(−(x² − 2ρxy + y²)/(2(1 − ρ²))) for x, y ∈ R, for fixed ρ ∈ (−1, 1). Let Z = (Y − ρX)/√(1 − ρ²). Show …
24. Let X and Y be independent non-negative random variables with densities f and g, respectively. Find the joint density function of U = X and V = X + aY, where a is a positive constant. Let X and Y be independent and exponentially distributed random variables, each with density f(x) = λe^{−λx} for x ≥ 0. Find the density of X + ½Y. Is it the same as the density of max{X, Y}? (Cambridge 2007)
* 23. Zog continued. This time, n members of Dr Who's crew are transported to Zog, their positions being independent and uniformly distributed on the surface. In addition, Dr Who is required to choose a …
* 22. Three crew members of Dr Who's spacecraft Tardis are teleported to the surface of the spherical planet Zog. Their positions X, Y, Z are independent and uniformly distributed on the surface. Find the probability density function of the angle ∠XCY, where C is the centre of Zog. Two people …
(b) Find the probability that Z is in direct radio communication with both X and Y, conditional on the event that φ > ½π. Deduce that the probability that all three crew members can keep in touch is (π + 2)/(4π).
21. Let X and Y be independent random variables, each uniformly distributed on [0, 1]. Let U = min{X, Y} and V = max{X, Y}. Show that E(U) = 1/3, and hence find the covariance of U and V. (Cambridge 2007)
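A hedged simulation sketch for problem 21 (numpy assumed; not from the book). Since E(UV) = E(XY) = 1/4 and E(V) = 2/3, the covariance should come out as 1/4 − (1/3)(2/3) = 1/36.

```python
import numpy as np

rng = np.random.default_rng(7)
x, y = rng.uniform(size=(2, 1_000_000))
u, v = np.minimum(x, y), np.maximum(x, y)
print(u.mean())                              # ~ 1/3
print((u * v).mean() - u.mean() * v.mean())  # ~ 1/36 = 0.0278
```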
20. Let X and Y be random variables with the vector (X, Y) uniformly distributed on the region R = {(x, y) : 0 < y < x < 1}. Write down the joint probability density function of (X, Y). Find P(X + Y < 1). Find the probability density function fX(x) of X, and find also E(X). Find the conditional probability density function fY|X(y | x) of Y given that X = x, and find also E(Y | X = x). (Oxford 2005)
19. Let X1, X2, X3 be independent χ² random variables with r1, r2, r3 degrees of freedom.
(a) Show that Y1 = X1/X2 and Y2 = X1 + X2 are independent and that Y2 is a χ² random variable with r1 + r2 degrees of freedom.
(b) Deduce that the following random variables are independent: (X1/r1)/(X2/r2) and (X3/r3)/((X1 + X2)/(r1 + r2)). (Oxford 1982F)
18. Let a, b > 0. Independent positive random variables X and Y have probability densities (1/Γ(a))x^{a−1}e^{−x} and (1/Γ(b))y^{b−1}e^{−y} for x, y ≥ 0, respectively, and U and V are defined by U = X + Y and V = X/(X + Y). Prove that U and V are independent, and find their distributions. Deduce that E(X/(X + Y)) = E(X)/(E(X) + E(Y)). (Oxford 1971F)
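A hedged simulation sketch for problem 18 (numpy assumed; not from the book): with these gamma densities, U = X + Y is Gamma(a + b) and V = X/(X + Y) is Beta(a, b), independent of U, whence E(X/(X + Y)) = a/(a + b).

```python
import numpy as np

rng = np.random.default_rng(8)
a, b, trials = 2.0, 3.0, 500_000
x, y = rng.gamma(a, 1.0, trials), rng.gamma(b, 1.0, trials)
v = x / (x + y)
print(v.mean(), a / (a + b))        # both ~ 0.4
print(np.corrcoef(x + y, v)[0, 1])  # ~ 0, consistent with independence
```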
17. In a sequence of dependent Bernoulli trials, the conditional probability of success at the ith trial, given that all preceding trials have resulted in failure, is pi (i = 1, 2, . . . ). Give an expression in terms of the pi for the probability that the first success occurs at the nth trial. …
15. Let X and Y be independent random variables, X having the normal distribution with mean 0 and variance 1, and Y having the χ² distribution with n degrees of freedom. Show that T = X/√(Y/n) has density function f(t) = (Γ(½(n + 1))/(√(πn) Γ(½n)))(1 + t²/n)^{−(n+1)/2} for t ∈ R. T is said to have the t distribution with n degrees of freedom.
14. The independent random variables X and Y are normally distributed with mean 0 and variance 1.
(a) Show that W = 2X − Y is normally distributed, and find its mean and variance.
(b) Find the mean of Z = X²/(X² + Y²).
(c) Find the mean of V/U, where U = max{|X|, |Y|} and V = min{|X|, |Y|}. (Oxford 1985M)
13. The independent random variables X and Y are both exponentially distributed with parameter λ, that is, each has density function f(t) = λe^{−λt} if t > 0, and 0 otherwise.
(a) Find the (cumulative) distribution and density functions of the random variables 1 − e^{−λX}, min{X, Y}, and X − Y.
(b) Find the probability that max{X, Y} ≤ aX, where a is a real constant. (Oxford 1982M)
12. X and Y are independent random variables normally distributed with mean zero and variance σ². Find the expectation of √(X² + Y²). Find the probabilities of the following events, where a, b, c, and α are positive constants such that b < c and α < ½π:
(a) √(X² + Y²) < a,
(b) 0 < tan⁻¹(Y/X) < α and Y > 0.
… (Consider various cases depending on the relative sizes of a, b, and c.) (Oxford 1981M)
11. An aeroplane drops medical supplies to two duellists. With respect to Cartesian coordinates whose origin is at the target point, both the x and y coordinates of the landing point of the supplies have normal distributions which are independent. These two distributions have the same mean 0 and …
9. Let X and Y have joint density function f(x, y) = ¼(x + 3y)e^{−(x+y)} if x, y ≥ 0, and 0 otherwise. Find the marginal density function of Y. Show that P(Y > X) = 5/8.
10. Let Sn be the sum of n independent, identically distributed random variables having the exponential distribution with …
8. Show that there exists a constant c such that the function f(x, y) = c/(1 + x² + y²)^{3/2} for x, y ∈ R is a joint density function. Show that both marginal density functions of f are the density function of the Cauchy distribution.
7. Let X1, X2, . . . be independent, identically distributed, continuous random variables. Define N as the index such that X1 ≥ X2 ≥ · · · ≥ XN−1 and XN−1 < XN. Prove that P(N = k) = (k − 1)/k! and that E(N) = e.
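A hedged simulation sketch for problem 7 (numpy assumed; not from the book): N is the index at which the sequence first rises, and its sample mean should be close to e.

```python
import numpy as np

rng = np.random.default_rng(9)

def first_rise() -> int:
    # return the first n with X_{n-1} < X_n (the decreasing run is broken)
    prev, n = rng.random(), 1
    while True:
        n += 1
        cur = rng.random()
        if cur > prev:
            return n
        prev = cur

samples = [first_rise() for _ in range(100_000)]
print(np.mean(samples), np.e)  # both ~ 2.718
```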
6. Let X1, X2, . . . , Xn be independent random variables, each having distribution function F and density function f. Find the distribution function of U and the density functions of U and V, where U = min{X1, X2, . . . , Xn} and V = max{X1, X2, . . . , Xn}. Show that the joint density function …
5. Lack-of-memory property. If X has the exponential distribution, show that P(X > u + v | X > u) = P(X > v) for u, v > 0. This is called the 'lack of memory' property, since it says that, if we are given that X > u, then the distribution of X − u is the same as the original distribution of X.
4. Show that if X and Y are independent random variables having the exponential distribution with parameters λ and μ, respectively, then min{X, Y} has the exponential distribution with parameter λ + μ.
3. Let (X, Y, Z) be a point chosen uniformly at random in the unit cube (0, 1)³. Find the probability that the quadratic equation Xt² + Yt + Z = 0 has two distinct real roots.
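A hedged Monte Carlo sketch for problem 3 (numpy assumed; not from the book). Two distinct real roots require Y² > 4XZ; a direct computation (not in the text) gives P(Y² > 4XZ) = (5 + 6 ln 2)/36 ≈ 0.254, which the simulation should reproduce.

```python
import numpy as np

rng = np.random.default_rng(10)
x, y, z = rng.uniform(size=(3, 1_000_000))
print((y**2 > 4 * x * z).mean())  # ~ 0.254
print((5 + 6 * np.log(2)) / 36)   # 0.2544...
```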
1. If X and Y are independent random variables with density functions fX and fY, respectively, show that U = XY and V = X/Y have density functions
fU(u) = ∫_{−∞}^∞ fX(x) fY(u/x) (1/|x|) dx,   fV(v) = ∫_{−∞}^∞ fX(vy) fY(y) |y| dy.
2. Is the function G, defined by G(x, y) = 1 if x + y ≥ 0 and 0 otherwise, the joint distribution function of some pair of random variables? Justify your answer.
Exercise 6.80 Let the pair (X, Y) have the bivariate normal distribution of (6.76), and let a, b ∈ R. Show that aX + bY has a univariate normal distribution, possibly with zero variance.
Exercise 6.79 Let the pair (X, Y) have the bivariate normal density function of (6.76), and let U and V be given by (6.78). Show that U and V have the standard bivariate normal distribution. Hence or otherwise show that E(XY) − E(X)E(Y) = ρσ1σ2, and that E(Y | X = x) = μ2 + ρσ2(x − μ1)/σ1.
Exercise 6.55 Let X and Y be random variables with joint density function f(x, y) = ¼e^{−(x+y)/2} if x, y > 0, and 0 otherwise. Show that the joint density function of U = ½(X − Y) and V = Y is fU,V(u, v) = ½e^{−u−v} if (u, v) ∈ A, and 0 otherwise, where A is a region of the (u, v) plane to be determined. Deduce that U has the bilateral exponential distribution with density function fU(u) = ½e^{−|u|} for u ∈ R.
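A hedged simulation sketch for Exercise 6.55 (numpy assumed; not from the book): the joint density ¼e^{−(x+y)/2} is that of independent exponentials with mean 2, so U = ½(X − Y) should be bilateral exponential with density ½e^{−|u|}, i.e. mean 0 and variance 2.

```python
import numpy as np

rng = np.random.default_rng(11)
x, y = rng.exponential(2.0, (2, 1_000_000))
u = (x - y) / 2
print(u.mean(), u.var())  # ~ 0 and ~ 2, matching the density (1/2)e^{-|u|}
```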
Exercise 6.54 Let X and Y be independent random variables, each having the normal distribution with mean μ and variance σ². Find the joint density function of U = X − Y and V = X + Y. Are U and V independent?