Probability And Stochastic Processes 1st Edition Ionut Florescu - Solutions
7.5 Let $(X_n)_{n\ge 1}$ be i.i.d. random variables distributed as normal with mean 2 and variance 1. a) Show that the sequence $Y_n := \frac{1}{n}\sum_{i=1}^{n} X_i e^{X_i}$ converges almost surely and in distribution, and find the limiting distribution. b) Answer the same question for the sequence $Z_n := \frac{X_1 + \cdots + X_n}{X_1^2 + \cdots + X_n^2}$.
7.4 Let $X_1, X_2, \dots$ be i.i.d. with mean $\mu$ and variance $\sigma^2$ and $E[X_1^4] < \infty$. Let $\mu^c_k = E[(X_1 - \mu)^k]$ denote the $k$th central moment. Then show that $\sqrt{n}\begin{pmatrix} \bar X_n - \mu \\ S_n^2 - \sigma^2 \end{pmatrix} \xrightarrow{D} \mathrm{MVN}_2\left( \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \begin{pmatrix} \sigma^2 & \mu^c_3 \\ \mu^c_3 & \mu^c_4 - \sigma^4 \end{pmatrix} \right)$, where $S_n^2$ is the sample variance. Note that the exercise says that …
7.3 Write a statement explaining why Skorohod's Theorem 7.32 on page 230 does not contradict our earlier statement that convergence in distribution does not imply convergence a.s.
7.2 Show that if $f : \mathbb{R} \to \mathbb{R}$ is a continuous function and $X_n \xrightarrow{a.s.} X$, then $f(X_n) \xrightarrow{a.s.} f(X)$ as well.
7.1 Show that, if $X_n \xrightarrow{L^p} X$, then $E|X_n|^p \to E|X|^p$. HINT: $\|\cdot\|_p$ is a proper norm (recall the properties of a norm).
6.24 Determine the distribution with characteristic function $\varphi(t) = \frac{t + \sin t}{2t}$ for every $t \in \mathbb{R}$.
6.23 Determine the formula for the distribution with characteristic function $\varphi(t) = \cos t$ for every $t \in \mathbb{R}$.
6.22 Let $X_i$, $1 \le i \le 4$, be independent identically distributed $N(0,1)$ random variables. Denote by $D = \begin{vmatrix} X_1 & X_2 \\ X_3 & X_4 \end{vmatrix}$ the determinant of the matrix. a) Show that the characteristic function of the random variable $X_1 X_2$ is $\varphi(t) = \frac{1}{\sqrt{1+t^2}}$. b) Calculate the characteristic function of $D$ and state the …
6.21 Let $Y$ be a random variable with density $f(y) = C e^{-|y|}$. Calculate $C$ and show that the characteristic function of $Y$ is $\varphi_Y(t) = \frac{1}{1+t^2}$. State the name of the distribution with this characteristic function and density.
6.20 Use Problem 6.19 to calculate P(X = k) in the case when X has a binomial distribution B(n, p).
6.19 Let $X$ be an r.v. with characteristic function $\varphi$. Then for every $x \in \mathbb{R}$ we have $P(X = x) = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} e^{-itx}\varphi(t)\,dt$. Deduce that if $\varphi(t) \to 0$ as $|t| \to \infty$, then $P(X = x) = 0$ for all $x \in \mathbb{R}$.
6.18 Let $X, Y$ be two independent random variables. Denote by $\varphi_X, \varphi_Y$ their characteristic functions, respectively. Show that $\varphi_{XY}(t) = E[\varphi_X(tY)] = E[\varphi_Y(tX)]$ for every $t \in \mathbb{R}$.
6.17 Let $(X, Y)$ be a random vector with joint density $f(x, y) = \frac{1}{4}\bigl(1 + xy(x^2 - y^2)\bigr)\,\mathbf{1}_{\{|x| < 1\}}\mathbf{1}_{\{|y| < 1\}}$ …
6.16 Let $X, Y$ be two independent random variables with uniform law $U[-1, 1]$. a) Calculate the characteristic function of $X + Y$. b) Identify the density of $X + Y$. c) Show that the function $f(x) = \frac{1}{\pi}\left(\frac{\sin x}{x}\right)^2$ is a density.
6.15 Compute the first four moments of a standard normally distributed random variable using the characteristic function.
6.14 Let $Y$ be a uniform random variable on $\{-1, 1\}$. Show that its characteristic function is $\varphi_Y(t) = \cos(t)$ for every $t \in \mathbb{R}$.
6.13 In this problem we shall derive the characteristic function of a normal using a differential equation approach. a) Let $f(x) = e^{-x^2/2}$. Show that the function $f$ satisfies $\frac{d}{dx}f + xf = 0$. b) Show that if $Z \sim N(0, 1)$ then its characteristic function $\varphi_Z$ satisfies the same equation. c) Deduce …
6.11 The gamma function $\Gamma : [0, \infty) \to [0, \infty)$ is defined as $\Gamma(\alpha) = \int_0^{\infty} t^{\alpha-1} e^{-t}\,dt$, and is one of the most useful mathematical functions. In statistics it is the basis of the gamma and beta distributions (see the following problems). The gamma function has several interesting properties …
6.10 Repeat the problem above with three exponentially distributed random variables, i.e., $X_i \sim \mathrm{Exp}(\lambda_i)$, $i = 1, 2, 3$. As a note, the distribution of a sum of $n$ such independent exponentials is called the Erlang distribution and is heavily used in queuing theory.
6.9 Suppose that $X_1 \sim \mathrm{Exp}(\lambda_1)$ and $X_2 \sim \mathrm{Exp}(\lambda_2)$ are independent. Given that the inverse Laplace transform of the function $\frac{1}{c+t}$ is $e^{-cx}$ and that the inverse Laplace transform is linear, calculate the pdf of $X_1 + X_2$.
6.8 Let $f^*$ denote the Laplace transform of the function $f$. Prove the following properties: a) $\mathcal{L}\left\{\int_0^x f(y)\,dy\right\}(t) = \frac{f^*(t)}{t}$; b) $\mathcal{L}\{f'(x)\}(t) = t f^*(t) - f(0^+)$, where $f(0^+)$ denotes the right limit of $f$ at 0; c) $\mathcal{L}\{f^{(n)}(x)\}(t) = t^n f^*(t) - t^{n-1} f(0^+) - t^{n-2} f'(0^+) - \cdots - f^{(n-1)}(0^+)$.
6.7 Prove the following facts about the Laplace transform: a) $\mathcal{L}\{e^{-\lambda x}\}(t) = \frac{1}{\lambda + t}$; b) $\mathcal{L}\{x^{n-1} e^{-\lambda x}\}(t) = \frac{\Gamma(n)}{(\lambda + t)^n}$, where $\Gamma(n)$ denotes the Gamma function evaluated at $n$.
6.6 Let $X \sim \mathrm{Exp}(\lambda)$. Calculate $EX^3$ and $EX^4$. Give a general formula for $EX^n$.
6.5 Suppose $X$ has the density $f(x) = \frac{a}{x^{a+1}}\mathbf{1}_{(x>1)}$, where $a > 0$. This distribution is called a Pareto distribution or a power law distribution. a) Show that $M_X(t) = \infty$ for every $t > 0$. b) Show that $EX^n < \infty$ if and only if $a > n$.
6.4 Suppose that $X$ admits a moment generating function $M_X$. Prove that $P(X \ge x) \le e^{-tx} M_X(t)$ for every $t > 0$, and $P(X \le x) \le e^{-tx} M_X(t)$ for every $t < 0$. These are known as the Chernoff bounds on the probability.
6.3 Prove Proposition 6.2 on page 182.
6.2 Suppose $X$ has uniform distribution $U[a, b]$ with $a < b$ …
6.1 Suppose that $X$ has a discrete uniform distribution on $\{1, 2, \dots, n\}$, that is, $P(X = i) = \frac{1}{n}$ for every $i = 1, \dots, n$. a) Compute the moment generating function of $X$. b) Deduce that $EX = \frac{n+1}{2}$ and $EX^2 = \frac{(n+1)(2n+1)}{6}$.
5.19 Let X, Y and Z be i.i.d. exponentially distributed random variables with parameter λ. Calculate P(X
5.18 Each child in Romania is equally likely to be a boy or a girl, independent of any other children. a) Suppose we know that a family has $n$ children. Show that the expected number of boys is equal to the expected number of girls. Did you need the assumption of independence? b) Now suppose that, in …
5.17 Let the discrete random variables $X$ and $Y$ have density $P(X = i, Y = j) = \frac{1}{i(i+1)}$ for $i = j \in \{1, 2, \dots\}$, and 0 everywhere else. Show that $E[Y] = \infty$ while $E[Y \mid X] < \infty$.
5.16 Let the vector $(X, Y)$ have the joint distribution $f(x, y) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}} \exp\left(-\frac{1}{2(1-\rho^2)}\left[\left(\frac{x-\mu_1}{\sigma_1}\right)^2 + \left(\frac{y-\mu_2}{\sigma_2}\right)^2 - 2\rho\,\frac{(x-\mu_1)(y-\mu_2)}{\sigma_1\sigma_2}\right]\right)$ (the general bivariate normal density). Find $E[Y \mid X]$.
5.15 Let X be uniformly distributed on [−1, 1] and let Y = X2. Find E[X | Y] and E[Y | X].
5.14 Let $X_1, X_2, \dots, X_{1000}$ be i.i.d., each taking values 0 or 1 with probability $\frac{1}{2}$. Put $S_n = X_1 + \cdots + X_n$. Find $E[(S_{1000} - S_{300}) \mid \mathbf{1}_{\{S_{700} = 400\}}]$ and $E[(S_{1000} - S_{300})^2 \mid \mathbf{1}_{\{S_{700} = 400\}}]$.
5.13 Suppose you pick two numbers $X$ and $Y$ independently at random from $[0, 1]$. Given that their sum is in the interval $[0, 1]$, find the probability that $X^2 + Y^2 < 1/4$.
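In 5.13 both the conditioning event and the target region are simple areas in the unit square, so a quick Monte Carlo sketch can sanity-check a computed answer (the sample size is an arbitrary choice):

```python
import random

random.seed(0)

hits = total = 0
for _ in range(200_000):
    x, y = random.random(), random.random()
    if x + y <= 1:                  # condition on the sum landing in [0, 1]
        total += 1
        if x * x + y * y < 0.25:    # target event: inside the disk of radius 1/2
            hits += 1

p = hits / total
print(round(p, 3))  # close to pi/8, about 0.393
```

The quarter-disk of radius 1/2 lies entirely inside the triangle $x + y \le 1$, which is what makes the ratio of areas come out to $\pi/8$.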
5.12 A circular dartboard has a radius of 1 ft. Thom throws three darts at the board until all three are sticking in the board. The locations of the three darts are independent and uniformly distributed on the surface of the board. Let T1, T2, and T3 be the distances from the center to the closest
5.11 Let $X, Y, Z$ be three random variables with joint distribution $P(X = k, Y = m, Z = n) = p^3 q^{n-3}$ for integers $k, m, n$ satisfying $1 \le k < m < n$, where $0 < p < 1$ and $q = 1 - p$.
5.10 Let $X$ be a random variable on the probability space $(\Omega, \mathcal{F}, P)$. Let $A \in \mathcal{F}$ be a set with $P(A) = 0$, and denote by $\sigma(A)$ the sigma-algebra generated by the set. What is $E[X \mid \sigma(A)]$? Let $\mathbf{1}_A$ denote the indicator of $A$. What is $E[X \mid \mathbf{1}_A]$? Comment: I have been given this question in an industry job interview.
5.9 For each of the following joint distributions of $X$ and $Y$, calculate the density of $Y \mid X$ and $E[Y \mid X]$: a) $f(x, y) = \lambda^2 e^{-\lambda(x+y)}$, $x, y > 0$; b) $f(x, y) = \lambda^2 e^{-\lambda y}$, $y > x > 0$; c) $f(x, y) = x e^{-x(y+1)}$, $x, y > 0$.
5.8 Let $X$ have a Beta distribution with parameters $a$ and $b$. Let $Y$ be distributed as Binomial$(n, X)$. What is $E[Y \mid X]$? Describe the distribution of $Y$ and give $E[Y]$ and $V[Y]$. What is the distribution of $Y$ in the special case when $X$ is uniform?
5.7 For random variables $X$ and $Y$, show that $V(Y) = E[V(Y \mid X)] + V(E[Y \mid X])$. Here $V(Y \mid X)$ is the variance of the random variable $Y \mid X$, and $E[Y \mid X]$ is itself a random variable.
5.6 Let $X, Y$ be two random variables. Show that for any measurable function $\varphi$ for which the expressions below exist, we have $E[\varphi(X) E[Y \mid X]] = E[Y \varphi(X)]$.
5.5 Using the Theorem-Definition 5.10 on page 166, prove the seven properties of the conditional expectation in Proposition 5.12.
5.4 Prove Fubini’s Theorem 5.1 on page 158.
5.3 My dad has a car which breaks down all the time. In fact, he knows that the number of days until breaking down is a random variable with density …
5.2 Let $X$ and $Y$ be independent, both with mean 0. Explain the error in the following derivation: $E[X \mid X - Y = 5] = E[X \mid X = Y + 5] = E[Y + 5] = 5$.
5.1 An assembly line produces microprocessors each of which is fully functional with probability p. A tester is applied to each such processor. If the microprocessor is faulty, then the test detects that there is a problem with the processor with probability q. If the processor is fully functional,
4.14 A robot arm solders a component on a motherboard. The arm makes tiny errors when locating the correct place on the board. This exercise tries to determine the magnitude of the error, so that we know the physical limitations for the size of the component connections. Let us say that the …
4.13 The king of Probabilonia has sentenced a criminal to the following punishment. A box initially contains 999,999 black balls and one white ball. On the day of sentencing, the criminal draws a ball at random. If the ball is white, the punishment is over and the criminal goes free. If the ball …
4.12 A man goes to Atlantic City, and as part of the trip he receives a special $20 coupon. This coupon is special in the sense that only the gains may be withdrawn from the slot machine. So the man decides to play a game of guessing red/black. For simplicity, let us assume that every time he plays …
4.11 Ann and Bob each attempt 100 basketball free throws. Ann has probability 0.60 of success on each attempt and Bob has probability 0.50 of success on each attempt. The 200 attempts are independent. What is the approximate numerical probability that Ann and Bob make exactly the same number of free throws?
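Problem 4.11 asks for an approximation, but the exact value is also a short sum over the common number of makes; a sketch (assuming, as the problem states, independence across all 200 attempts):

```python
from math import comb

# Exact P(Ann's makes == Bob's makes): sum over the common count k.
p_equal = sum(
    comb(100, k) * 0.6**k * 0.4**(100 - k)   # P(Ann makes exactly k)
    * comb(100, k) * 0.5**100                # P(Bob makes exactly k)
    for k in range(101)
)
print(round(p_equal, 4))
```

A normal approximation to the difference of the two binomials (mean 10, variance $100 \cdot 0.24 + 100 \cdot 0.25 = 49$) puts the answer near 0.02, which the exact sum should confirm.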
4.10 Show, using the Borel–Cantelli lemma, that when you roll a die the outcome $\{1\}$ will appear infinitely often. Also show, using the DeMoivre–Laplace theorem, that eventually the average of all rolls up to roll $n$ will be within $\varepsilon$ of 3.5, where $\varepsilon > 0$ is any arbitrary real number.
4.9 Urn A and Urn B initially contain four marbles, of which two are white and two are black.A machine simultaneously chooses one marble from each urn (“at random,” and independently for the two urns) and exchanges them. What is the expected number of exchanges until all the white marbles are
4.8 Let $\nu$ be the total number of spots obtained in 1000 independent rolls of an unbiased die. 1. Find $E[\nu]$. 2. Estimate the probability $P(3450 < …$
4.7 A box contains three balls, numbered 1, 2, 3. Ann randomly chooses a ball and writes down its number (without returning the ball to the box). Call the number X1. Then Bob randomly chooses a ball and writes down its number, which we’ll call Y1. The drawn balls are returned to the box. This
4.6 Suppose an event $A$ has probability 0.3. How many independent trials must be performed to assert with probability 0.9 that the relative frequency of $A$ differs from 0.3 by no more than 0.1?
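The standard weak-law approach to 4.6 is the Chebyshev bound $P(|S_n/n - p| \ge \varepsilon) \le p(1-p)/(n\varepsilon^2)$; a sketch using exact rational arithmetic to avoid floating-point trouble right at the boundary:

```python
from fractions import Fraction
from math import ceil

p, eps, alpha = Fraction(3, 10), Fraction(1, 10), Fraction(1, 10)
# Chebyshev: P(|S_n/n - p| >= eps) <= p(1 - p) / (n * eps**2).
# Requiring this bound to be at most alpha = 1 - 0.9 gives the smallest n.
n = ceil(p * (1 - p) / (alpha * eps**2))
print(n)  # 210
```

Here $0.21 / (n \cdot 0.01) \le 0.1$ gives $n \ge 210$ exactly, which is why `Fraction` is worth the two extra lines.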
4.5 Prove the properties (i)–(v) of the expectation in Proposition 4.24 on page 144.
4.4 Give an example of two variables X and Y which are uncorrelated but not independent.
4.3 Prove the four assertions in Exercise 2 on page 125.
4.2 Show that any simple function $f$ can be written as $\sum_i b_i \mathbf{1}_{B_i}$ with $B_i$ disjoint sets (i.e., $B_i \cap B_j = \emptyset$ if $i \ne j$).
4.1 It is well known that 23 "random" people have a probability of about 1/2 of having at least one shared birthday. There are $365 \times 24 \times 60 = 525{,}600$ minutes in a year. (We'll ignore leap days.) Suppose each person is labeled by the minute in which he or she was born, so that there are 525,600 …
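The shared-birthday probability in 4.1 has the same product form for any number of equally likely "days", so one small function covers both the classic 365-day case and the 525,600-minute variant:

```python
def shared_prob(n, days):
    """P(at least two of n people share one of `days` equally likely labels)."""
    q = 1.0
    for i in range(n):
        q *= (days - i) / days   # probability that all n labels are distinct
    return 1.0 - q

print(round(shared_prob(23, 365), 3))       # the classic result, about 0.507
print(round(shared_prob(23, 525_600), 5))   # sharing a birth *minute* is rare
```

With minutes instead of days the same 23 people almost never collide, which is the point of the comparison.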
3.8 You want to design an experiment where you simulate bacteria living in a certain medium. To this end, you know that the lifetime of one bacterium is a random variable $X$ (in hours) distributed with exponential density $\frac{1}{2}e^{-x/2}$. However, you also know that all of these peculiar bacteria live …
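The stated density $\frac{1}{2}e^{-x/2}$ is exponential with mean 2 hours, so lifetimes for the simulation in 3.8 can be drawn by inverse transform; a sketch (the sample size is arbitrary):

```python
import math
import random

random.seed(42)

def bacterium_lifetime():
    """Inverse-CDF draw from the density (1/2) e^{-x/2}, x > 0 (mean 2 hours)."""
    return -2.0 * math.log(1.0 - random.random())

lifetimes = [bacterium_lifetime() for _ in range(100_000)]
mean = sum(lifetimes) / len(lifetimes)
print(round(mean, 2))  # sample mean close to the theoretical mean of 2
```

Using `1.0 - random.random()` keeps the argument of the logarithm in $(0, 1]$, so the draw never hits $\log 0$.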
3.7 Design a scheme to generalize the Box–Muller scheme to generate four-dimensional random normals with mean vector 0 and covariance matrix the identity matrix.
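One natural answer to 3.7 is to run the two-dimensional Box–Muller step twice on independent uniforms and concatenate the pairs; a sketch with an empirical check of the mean vector:

```python
import math
import random

random.seed(1)

def box_muller_pair():
    """One Box-Muller step: two independent N(0,1) draws from two uniforms."""
    u1, u2 = 1.0 - random.random(), random.random()   # u1 in (0, 1], safe for log
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

def normal4():
    """Four i.i.d. N(0,1) coordinates: mean vector 0, identity covariance."""
    z1, z2 = box_muller_pair()
    z3, z4 = box_muller_pair()
    return (z1, z2, z3, z4)

sample = [normal4() for _ in range(50_000)]
means = [sum(v[i] for v in sample) / len(sample) for i in range(4)]
print([round(m, 2) for m in means])  # each coordinate mean close to 0
```

Because the two pairs come from disjoint sets of uniforms, all four coordinates are independent, which is exactly the identity-covariance requirement.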
3.6 Let $X$ be a random variable with a Beta distribution with parameters $a, b$: $f(x) = \frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)} x^{a-1}(1-x)^{b-1}$, where $0 < x < 1$ …
3.5 The pdf of a logistic random variable with parameter $\lambda > 0$ is $f(x) = \frac{\lambda e^{-\lambda x}}{(1 + e^{-\lambda x})^2}$, for $x > 0$. a) Calculate the cdf of the distribution. b) Calculate the inverse cdf $F^{-1}$. c) Write a code to generate random numbers with the logistic distribution. d) Calculate the expectation $EX = \int_0^{\infty} …$
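For parts b)–c) of 3.5: on the whole real line this density has cdf $F(x) = 1/(1 + e^{-\lambda x})$ with inverse $F^{-1}(u) = \frac{1}{\lambda}\log\frac{u}{1-u}$. (Note that restricted to $x > 0$, as printed, the density integrates to only 1/2; the sketch below assumes the usual full-line support, and $\lambda = 1.5$ is an arbitrary example value.)

```python
import math
import random

random.seed(7)

lam = 1.5  # example value of lambda, not from the problem

def logistic_cdf(x):
    return 1.0 / (1.0 + math.exp(-lam * x))

def logistic_inv_cdf(u):
    return math.log(u / (1.0 - u)) / lam

# Inverse-transform sampling: F^{-1}(U) has the logistic distribution.
draws = [logistic_inv_cdf(random.random()) for _ in range(100_000)]
mean = sum(draws) / len(draws)
print(round(mean, 2))  # close to 0: the full-line logistic is symmetric
```

The round-trip `logistic_cdf(logistic_inv_cdf(u)) == u` is a cheap correctness check on the algebra in part b).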
3.4 Consider the random variable $X$ with the density $f(x) = C\cos(x)\sin(x)$, $x \in [0, \pi/2]$. a) Calculate the constant $C$ which makes the above a probability density. b) Sketch this density. c) Implement the importance sampling method to generate random numbers from the density. Create a histogram by …
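For 3.4, $C = 2$ (since $\int_0^{\pi/2}\cos x \sin x\,dx = 1/2$) and the cdf is $F(x) = \sin^2 x$, so exact inverse-transform draws are available as a cross-check for whichever sampling method the exercise has you implement:

```python
import math
import random

random.seed(3)

# f(x) = 2 cos(x) sin(x) on [0, pi/2] has cdf F(x) = sin(x)**2,
# so F^{-1}(u) = arcsin(sqrt(u)).
def draw():
    return math.asin(math.sqrt(random.random()))

xs = [draw() for _ in range(100_000)]
mean = sum(xs) / len(xs)
print(round(mean, 2))  # close to pi/4, about 0.785
```

The comparison target $E[X] = \int_0^{\pi/2} x \sin 2x\,dx = \pi/4$ follows from one integration by parts.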
3.3 Consider the following normal mixture density: $f(x) = 0.7\,\frac{1}{\sqrt{2\pi \cdot 9}}\,e^{-\frac{(x-2)^2}{18}} + 0.3\,\frac{1}{\sqrt{2\pi \cdot 4}}\,e^{-\frac{(x+1)^2}{8}}$ …
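Sampling from a mixture like the one in 3.3 just means flipping a 0.7/0.3 coin for the component and then drawing from that normal; a sketch (assuming the second component's exponent continues as $-(x+1)^2/8$, i.e. the two standard deviations are 3 and 2):

```python
import random

random.seed(5)

def mixture_draw():
    """0.7 * N(mean 2, sd 3) + 0.3 * N(mean -1, sd 2)."""
    if random.random() < 0.7:
        return random.gauss(2.0, 3.0)
    return random.gauss(-1.0, 2.0)

xs = [mixture_draw() for _ in range(100_000)]
mean = sum(xs) / len(xs)
print(round(mean, 2))  # close to 0.7*2 + 0.3*(-1) = 1.1
```

The sample mean converging to the weighted average of the component means is a quick self-check on the component probabilities.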
3.2 Look at the polar rejection method. Show that the two variables given by this algorithm are independent.
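For reference when working 3.2, a minimal implementation of the polar rejection (Marsaglia) method, with an empirical check that the sample mean of $XY$ is near 0, as independence requires:

```python
import math
import random

random.seed(11)

def polar_pair():
    """Marsaglia's polar (rejection) method: two independent N(0,1) draws."""
    while True:
        u = 2.0 * random.random() - 1.0
        v = 2.0 * random.random() - 1.0
        s = u * u + v * v
        if 0.0 < s < 1.0:                       # accept only points inside the unit disk
            factor = math.sqrt(-2.0 * math.log(s) / s)
            return u * factor, v * factor

pairs = [polar_pair() for _ in range(50_000)]
mean_xy = sum(x * y for x, y in pairs) / len(pairs)
print(round(mean_xy, 2))  # sample E[XY] near 0, consistent with independence
```

A near-zero $E[XY]$ is of course only a necessary condition; the exercise itself asks for the actual proof of independence.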
3.1 Look at the Box–Muller method and the two resulting variables $X$ and $Y$. Calculate the joint and marginal distributions of these variables and show that they are independent.
2.29 Depending on the weather conditions, the probability that an egg hatches is a random variable $P$ distributed according to a Beta distribution with parameters $a$ and $b$. A hen deposits 20 eggs and it is reasonable to assume that the total number of eggs that hatch is a random variable $X$, which has …
2.28 Let $X$ be a random variable with a normal distribution with parameters $\mu$ and $\sigma$. Show that $Y = e^X$ has a log-normal distribution.
2.27 Let $Y$ be a LogN(0,1) random variable, that is, $f_Y(y) = \frac{1}{y\sqrt{2\pi}} e^{-\frac{(\log y)^2}{2}}$, $y > 0$. Show that $X$ …
2.26 Let $X_1$ and $X_2$ be independent, unit exponential random variables (so the common density is $f(x) = e^{-x}$, $x > 0$). Define $Y_1 = X_1 - X_2$ and $Y_2 = X_1/(X_1 - X_2)$. Find the joint density of $Y_1$ and $Y_2$.
2.25 You have two opponents A and B with whom you alternately play games.Whenever you play A, you win with probability pA; whenever you play B, you win with probability pB, where pB > pA. If your objective is to minimize the number of games you need to play to win two in a row, should you start
2.24 We generate a point in the plane according to the following algorithm: Step 1: Generate $R^2$, which is $\chi^2_2$ (chi squared with 2 degrees of freedom). Step 2: Generate $\theta$ according to the uniform distribution on the interval $(0, 2\pi)$. Step 3: Let the coordinates of the point be $(X, Y) = (R\cos\theta, R\sin\theta)$ …
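The three steps of 2.24 can be coded directly ($R^2 \sim \chi^2_2$ is exponential with mean 2, so $-2\ln U$ works for Step 1); empirically the resulting coordinates behave like independent standard normals:

```python
import math
import random

random.seed(2)

def chi2_point():
    """Step 1: R^2 ~ chi-squared(2) via -2 ln U.  Step 2: theta ~ U(0, 2 pi).
    Step 3: return (X, Y) = (R cos theta, R sin theta)."""
    r = math.sqrt(-2.0 * math.log(1.0 - random.random()))
    theta = 2.0 * math.pi * random.random()
    return r * math.cos(theta), r * math.sin(theta)

pts = [chi2_point() for _ in range(50_000)]
mx = sum(p[0] for p in pts) / len(pts)
vx = sum(p[0] ** 2 for p in pts) / len(pts)
print(round(mx, 2), round(vx, 2))  # close to 0 and 1: X looks like N(0, 1)
```

This is, in effect, Box–Muller read backwards, which is the punchline the exercise is driving at.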
2.23 Let $X, Y$ be independent $N(0, 1)$. a) Calculate the distribution of $\frac{X}{X+Y}$. This is called the Cauchy distribution. b) Find the distribution of $\frac{X}{|Y|}$.
2.22 Let $X$ be a unit exponential random variable (with density $f(x) = e^{-x}$, $x > 0$) and let $Y$ be an independent $U[0, 1]$ random variable. Find the density of $T = Y/X$.
2.21 Find a density function f(x, y) such that if (X, Y) has density f then X2 + Y2 is uniformly distributed on (0,10).
2.20 All children in Bulgaria are given IQ tests at ages 8 and 16. Let X be the IQ score at age 8 and let Y be the IQ score for a randomly chosen Bulgarian 16-year-old.The joint distribution of X and Y can be described as follows. X is normal with mean 100 and standard deviation 15. Given that X =
2.19 A bacterial solution contains two types of antiviral cell: type A and type B.Let X denote the lifetime of cell type A and Y denote the lifetime of cell type B.Whenever they are grown in the same culture, the death of one type will cause the death of the other type. Therefore, it is not
2.18 Let $(X, Y)$ have the joint density $f(x, y)$. Let $U = aX + b$ and $V = cY + d$, where the constants $a, b, c, d$ are fixed and $a > 0$, $c > 0$. Show that the joint density of $U$ and $V$ is $f_{U,V}(u, v) = \frac{1}{ac} f\left(\frac{u-b}{a}, \frac{v-d}{c}\right)$.
2.17 Let $X$ and $Y$ be independent $N(0, 1)$ random variables. a) Verify that $X^2$ is distributed as a $\chi^2_1$ random variable. b) Find $P(X^2 < 1)$. c) Find the distribution of $X^2 + Y^2$. d) Find $P(X^2 + Y^2 < 1)$.
2.16 Give a proof of Lemma 2.23 on page 76.
2.15 The random variable whose probability density function is given by $f(x) = \frac{1}{2}\lambda e^{\lambda x}$ if $x \le 0$, and $f(x) = \frac{1}{2}\lambda e^{-\lambda x}$ if $x > 0$, is said to have a Laplace, sometimes called a double exponential, distribution. a) Verify that the density above defines a proper probability distribution. b) Find the …
2.14 Let $X_1, X_2, \dots, X_n$ be independent $U(0, 1)$ random variables. Let $M = \max_{1 \le i \le n} X_i$. Calculate the distribution function of $M$.
2.13 Two friends decide to meet at the Castle Gate of Stevens Institute. They each arrive at that spot at some random time between $a$ and $a + T$. Each waits for 15 minutes and then leaves if the other has not appeared. What is the probability that they meet?
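A Monte Carlo check for 2.13, using $T = 60$ minutes as an example value (the answer $1 - ((T-15)/T)^2$ holds for any $T > 15$, and the location shift $a$ is irrelevant):

```python
import random

random.seed(9)

T = 60.0      # example window length in minutes; the problem leaves T general
WAIT = 15.0   # each friend waits 15 minutes

N = 200_000
meet = sum(
    abs(random.uniform(0, T) - random.uniform(0, T)) <= WAIT
    for _ in range(N)
)
p = meet / N
print(round(p, 3))  # close to 1 - ((T - 15)/T)**2 = 0.4375 for T = 60
```

Geometrically, the meeting region is the band $|x - y| \le 15$ inside the $T \times T$ square, which is where the $((T-15)/T)^2$ term comes from.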
2.12 Choose a point $A$ at random in the interval $[0, 1]$. Let $L_1$ (respectively, $L_2$) be the length of the bigger (respectively, smaller) segment determined by $A$ on $[0, 1]$. Calculate a) $P(L_1 \le x)$ for $x \in \mathbb{R}$; b) $P(L_2 \le x)$ for $x \in \mathbb{R}$.
2.11 Every morning, John leaves for work between 7:00 and 7:30. The trip always takes between 40 and 50 min. Let X denote the time of departure and let Y denote the travel time. Assume that both variables are independent and both are uniformly distributed on the respective intervals. Find the
2.10 Suppose we toss a fair coin repeatedly. Let $X$ denote the number of trials to get the first head and let $Y$ be the number needed to get two heads in repeated tosses. Are the two variables independent?
2.9 The random variables $X$ and $Y$ have the joint distribution
        X=1    X=2    X=3
Y=2    1/12    1/6   1/12
Y=3     1/3      0      0
Y=4     1/9    1/9    1/9
a) Calculate the marginal distributions of $X$ and $Y$. b) Show that the random variables $X$ and $Y$ are dependent. c) Find two random variables $U$ and $V$ which have the same marginal distributions as $X$ …
2.8 A density function is defined as f(x, y) = K(x + 2y) if 0
2.7 We know that the random variables $X$ and $Y$ have joint density $f(x, y)$. Assume that $P(Y = 0) = 0$. Find the densities of the following variables: a) $X + Y$; b) $X - Y$; c) $XY$; d) $X/Y$.
2.6 What is the probability that two randomly chosen numbers between 0 and 1 will have a sum no greater than 1 and a product no greater than $\frac{15}{64}$?
2.5 A random variable $X$ has distribution function $F(x) = a + b \arctan\frac{x}{2}$, $-\infty < x < \infty$ …
2.4 Buffon's needle problem. Suppose that a needle is tossed at random onto a plane ruled with parallel lines a distance $L$ apart, where by a "needle" we mean a line segment of length $l \le L$. What is the probability of the needle intersecting one of the parallel lines? Hint: Consider the angle …
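A simulation sketch for Buffon's needle with the example values $l = L = 1$: by symmetry it suffices to track the distance from the needle's center to the nearest line and the acute angle the needle makes with the lines, and the hit frequency should approach $2l/(\pi L)$:

```python
import math
import random

random.seed(4)

L_SPACING = 1.0   # distance L between the parallel lines
NEEDLE = 1.0      # needle length l <= L

N = 200_000
hits = 0
for _ in range(N):
    center = random.uniform(0, L_SPACING / 2)   # center's distance to nearest line
    angle = random.uniform(0, math.pi / 2)      # acute angle with the lines
    if center <= (NEEDLE / 2) * math.sin(angle):
        hits += 1

p = hits / N
print(round(p, 2))  # close to 2 l / (pi L), about 0.637 for l = L = 1
```

Averaging the crossing condition over the angle reproduces the classical answer, so the simulation doubles as a check on the integral in the analytic solution.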
2.3 Give an example of two distinct random variables with the same distribution function.
2.2 Show that any piecewise constant function is Borel measurable. (See the description of piecewise constant functions in Definition 2.9.)
2.1 Prove Proposition 2.6. That is, prove that the function F in Definition 2.5 is increasing, right continuous, and taking values in the interval [0, 1], using only Proposition 1.20 on page 21.
1.23 Ali Baba is caught by the sultan while stealing his daughter. The sultan is being gentle with him and offers Ali Baba a chance to regain his liberty. There are two urns and $m$ white balls and $n$ black balls. Ali Baba has to put the balls in the two urns; however, the only condition …
1.22 My friend Andrei has designed a system to win at the roulette. He likes to bet on red, but he waits until there have been six previous black spins and only then bets on red. He reasons that the chance of winning is quite large since the probability of seven consecutive black spins is quite …
1.21 Andre Agassi and Pete Sampras decide to play a number of games together. They play nonstop and at the end it turns out that Sampras won $n$ games while Agassi won $m$, where $n > m$. Assume that in fact any possible sequence of games was possible to reach this result. Let $P_{n,m}$ denote the probability that …
1.20 At the end of a well-known course, the final grade is decided with the help of an oral examination. There are a total of $m$ possible subjects listed on some pieces of paper. Of them, $n$ are generally considered "easy" …