Questions and Answers of Introduction to Probability Statistics
Linearity of Expectation: For two discrete random variables X and Y, show that E[X + Y] = EX + EY.
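A quick numerical sketch of this identity (not part of the original problem set): linearity of expectation needs no independence, so the check below deliberately makes Y depend on X. The die-roll setup and all names are illustrative assumptions, not from the text.

```python
import random

# Empirical check of E[X + Y] = E[X] + E[Y] for dependent X, Y.
random.seed(0)
n = 100_000
xs = [random.randint(1, 6) for _ in range(n)]   # X: a fair die roll
ys = [x + random.randint(0, 1) for x in xs]     # Y: depends on X
ex = sum(xs) / n
ey = sum(ys) / n
e_sum = sum(x + y for x, y in zip(xs, ys)) / n  # empirical E[X + Y]
```

Because averaging is itself linear, `e_sum` equals `ex + ey` up to floating-point error, for any joint distribution.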
Let X and Y be two independent Geometric(p) random variables. Also let Z = X −Y. Find the PMF of Z.
Let X = aY + b. Then E[X|Y = y] = E[aY + b|Y = y] = ay + b. Here, we have g(y) = ay + b, and therefore E[X|Y] = aY + b, which is a function of the random variable Y.
Consider the set of points in the set C: C = {(x, y) | x, y ∈ Z, x² + |y| ≤ 2}. Suppose that we pick a point (X, Y) from this set completely at random. Thus, each point has a probability of 1/11 of being chosen. a. Find the joint and marginal PMFs of X and Y.
The number of cars being repaired at a small repair shop has the following PMF: Each car that is being repaired is a four-door car with probability 3/4 and a two-door car with probability 1/4,
Let X and Y be two independent random variables with PMFs PX(k) = PY(k) = 1/5 for k = 1, 2, 3, 4, 5, and 0 otherwise. Define Z = X − Y. Find the PMF of Z.
Let X and Y be two random variables and g and h be two functions. Show that E[g(X)h(Y)|X] = g(X)E[h(Y)|X].
Consider two random variables X and Y with joint PMF given in Table 5.5. Define the random variable Z as Z = E[X|Y]. a. Find the marginal PMFs of X and Y. b. Find the conditional PMF of X, given Y = 0
Let X, Y, and Z = E[X|Y] be as in Problem 13. Define the random variable V as V = Var(X|Y). a. Find the PMF of V. b. Find EV. c. Check that Var(X) = E[V] + Var(Z). Problem 13: Consider two random variables
Let N be the number of phone calls made by the customers of a phone company in a given hour. Suppose that N ∼ Poisson(β), where β > 0 is known. Let Xi be the length of the ith phone call, for i
Let X and Y be two jointly continuous random variables with joint PDF fXY(x, y). a. Find the constant c. b. Find P(0 ≤ X ≤ 1, 0 ≤ Y ≤ 1/2). c. Find P(0 ≤ X ≤ 1).
In Example 5.15 find the marginal PDFs fX(x) and fY(y). Example 5.15: Let X and Y be two jointly continuous random variables with joint PDF fXY(x, y) = x + cy² for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
Let X and Y be two jointly continuous random variables with joint PDF fXY(x, y). a. Find the marginal PDFs fX(x) and fY(y). b. Write an integral to compute P(0 ≤ Y ≤ 1, 1 ≤ X ≤ √e).
Let X and Y be two jointly continuous random variables with joint PDF fXY(x, y). a. Find RXY and show it in the x-y plane. b. Find the constant c. c. Find the marginal PDFs fX(x) and fY(y). d. Find P(Y ≤ X/2). e.
Let X and Y be two jointly continuous random variables with joint PDF fXY(x, y). a. Find the marginal PDFs fX(x) and fY(y). b. Find P(X > 0, Y …). c. Find P(X > 0 or Y …). d. Find P(X > 0 | Y …). e. Find P(X + Y > …).
Let X and Y be two jointly continuous random variables with joint CDF FXY(x, y) = 1 − e^(−x) − e^(−2y) + e^(−(x+2y)) for x, y > 0, and 0 otherwise. a. Find the joint PDF fXY(x, y). b. Find P(X …). c. Are X and Y independent?
Find the joint CDF for X and Y in Example 5.15. Example 5.15: Let X and Y be two jointly continuous random variables with joint PDF fXY(x, y) = x + cy² for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
Let X ∼ N(0, 1).a. Find the conditional PDF and CDF of X given X > 0.b. Find E[X|X > 0].c. Find Var(X|X > 0).
Let X ∼ Exponential(1).a. Find the conditional PDF and CDF of X given X > 1.b. Find E[X|X > 1].c. Find Var(X|X > 1).
Let X and Y be two jointly continuous random variables with joint PDF fXY(x, y). For 0 ≤ y ≤ 1, find the following: a. The conditional PDF of X given Y = y. b. P(X > 0 | Y = y). Does this value depend on y? c.
Let X and Y be two jointly continuous random variables with joint PDF fXY(x, y), defined for 0 ≤ x ≤ 1, 0 ≤ y ≤ 2, and 0 otherwise. For 0 ≤ y ≤ 2, find: a. The conditional PDF of X given Y = y. b. P(X …).
Let X and Y be two jointly continuous random variables with joint PDF fXY(x, y), defined for −1 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise. Find E[Y|X = 0] and Var(Y|X = 0).
Let X and Y be two independent Uniform(0, 1) random variables. Find P(X3 + Y > 1).
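A Monte Carlo sketch for this problem (not the textbook's analytic solution): conditioning on X = x gives P(Y > 1 − x³) = x³, so the exact answer is ∫₀¹ x³ dx = 1/4, which the simulation should approximate.

```python
import random

# Estimate P(X^3 + Y > 1) for independent X, Y ~ Uniform(0, 1).
random.seed(1)
n = 200_000
hits = sum(1 for _ in range(n)
           if random.random() ** 3 + random.random() > 1)
est = hits / n   # exact value: 1/4
```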
Suppose X ∼ Exponential(1) and given X = x, Y is a uniform random variable in [0, x], i.e., Y|X = x ∼ Uniform(0, x), or equivalently Y|X ∼ Uniform(0, X). a. Find EY. b. Find Var(Y).
Consider the unit disc D = {(x, y) | x² + y² ≤ 1}. Suppose that we choose a point (X, Y) uniformly at random in D. That is, the joint PDF of X and Y is given by fXY(x, y) = c for (x, y) ∈ D, and 0 otherwise. a. Find the constant c. b. Find the marginal PDFs fX(x) and fY(y).
Let X and Y be two independent Uniform(0, 2) random variables. Find P(XY < 1).
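A simulation sketch for this problem (an illustrative check, not the book's derivation): the region {xy < 1} inside the square [0, 2]² has area 1 + 2·ln 2, and the joint density is 1/4, so the exact answer is (1 + 2 ln 2)/4 ≈ 0.5966.

```python
import math
import random

# Estimate P(XY < 1) for independent X, Y ~ Uniform(0, 2).
random.seed(2)
n = 200_000
hits = sum(1 for _ in range(n)
           if (2 * random.random()) * (2 * random.random()) < 1)
est = hits / n
exact = (1 + 2 * math.log(2)) / 4
```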
Determine whether X and Y are independent: a. fXY(x, y) = 2e^(−(x+2y)) for x, y > 0, and 0 otherwise. b. fXY(x, y) = 8xy for 0 < x < y < 1, and 0 otherwise.
Consider the set E. Suppose that we choose a point (X, Y) uniformly at random in E. That is, the joint PDF of X and Y is constant on E. a. Find the constant c. b. Find the marginal PDFs fX(x) and fY(y). c. Find
Let X and Y be as in Example 5.21 (two jointly continuous random variables with joint PDF fXY(x, y) defined for 0 ≤ y ≤ 2). Find E[X|Y = 1] and Var(X|Y = 1).
Let X and Y be two jointly continuous random variables with joint PDF fXY(x, y) = x + cy² for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise. a. Find the constant c. b. Find P(0 ≤ X ≤ 1/2, 0 ≤ Y ≤ 1/2).
Let X and Y be two independent Uniform(0, 1) random variables. Find: a. E[XY]. b. E[e^(X+Y)]. c. E[X² + Y² + XY]. d. E[Y·e^(XY)].
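A sketch verifying two of these expectations by simulation (my own check, not the text's solution): independence lets each factorize, giving E[XY] = E[X]E[Y] = 1/4 and E[e^(X+Y)] = E[e^X]E[e^Y] = (e − 1)².

```python
import math
import random

# Monte Carlo estimates of E[XY] and E[e^(X+Y)] for X, Y ~ Uniform(0, 1).
random.seed(3)
n = 200_000
s_xy = s_exp = 0.0
for _ in range(n):
    x, y = random.random(), random.random()
    s_xy += x * y
    s_exp += math.exp(x + y)
e_xy = s_xy / n    # expect 1/4
e_exp = s_exp / n  # expect (e - 1)^2 ~ 2.9525
```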
Suppose X ∼ Uniform(1, 2) and given X = x, Y is an exponential random variable with parameter λ = x, so we can write Y|X = x ∼ Exponential(x). We sometimes write this as Y|X ∼ Exponential(X).
Let X and Y be two independent Uniform(0, 1) random variables, and Z = X/Y. Find the CDF and PDF of Z.
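A numerical sketch of the answer one should obtain (assuming the standard result F_Z(z) = z/2 for 0 ≤ z ≤ 1 and 1 − 1/(2z) for z > 1): the code checks one CDF value. Comparing X ≤ z·Y avoids dividing by a (rare) zero sample.

```python
import random

# Estimate F_Z(z) = P(X/Y <= z) = P(X <= z*Y) at z = 0.5.
random.seed(4)
n = 200_000
z = 0.5
hits = sum(1 for _ in range(n)
           if random.random() <= z * random.random())
est = hits / n   # F_Z(0.5) = 0.5/2 = 0.25
```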
Let X and Y be two jointly continuous random variables with joint PDF fXY(x, y) = x + y for 0 ≤ x, y ≤ 1, and 0 otherwise. Find E[XY²].
Let X and Y be two independent N(0, 1) random variables, and U = X + Y. a. Find the conditional PDF of U given X = x, fU|X(u|x). b. Find the PDF of U, fU(u). c. Find the conditional PDF of X given U = u.
Let X and Y be two independent Uniform(0, 1) random variables, and Z = XY. Find the CDF and PDF of Z.
Let X and Y be two independent standard normal random variables. Let also Z = 2X − Y and W = −X + Y. Find fZW(z, w).
Consider two random variables X and Y with joint PMF given in Table 5.6 (Joint PMF of X and Y in Problem 31). Find Cov(X, Y) and ρ(X, Y).
Let X and Y be two independent standard normal random variables, and let Z = X +Y. Find the PDF of Z.
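A simulation sketch of the expected conclusion (Z should be N(0, 2), e.g. by convolution or by multiplying MGFs): the code checks the mean and variance of the simulated sum.

```python
import random

# Z = X + Y for independent standard normals; check mean 0, variance 2.
random.seed(5)
n = 100_000
zs = [random.gauss(0, 1) + random.gauss(0, 1) for _ in range(n)]
mean = sum(zs) / n
var = sum((z - mean) ** 2 for z in zs) / n
```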
Let X and Y be two independent N(0, 1) random variables, and let Z = 11 − X + X²Y and W = 3 − Y. Find Cov(Z, W).
Suppose X ∼ Uniform(1, 2), and given X = x, Y is exponential with parameter λ = x. Find Cov(X, Y).
Let X and Y be jointly normal random variables with parameters μX = 1, σ²X = 4, μY = 1, σ²Y = 1, and ρ = 0. a. Find P(X + 2Y > 4). b. Find E[X²Y²].
Let Z1 and Z2 be two independent N(0, 1) random variables. Define X = Z1 and Y = ρZ1 + √(1 − ρ²)·Z2, where ρ is a real number in (−1, 1). a. Show that X and Y are bivariate normal. b. Find the joint PDF of X and Y. c. Find ρ(X, Y).
Let X and Y be jointly normal random variables with parameters μX = −1, σ²X = 4, μY = 1, σ²Y = 1, and ρ = 1/2. a. Find P(X + 2Y ≤ 3). b. Find Cov(X − Y, X + 2Y).
Let X ∼ N(0, 1) and W ∼ Bernoulli(1/2) be independent random variables. Define the random variable Y as a function of X and W: Y = h(X, W) = X if W = 0, and −X if W = 1. Find the PDF of Y and of X + Y.
Let X and Y be two independent N(0, 1) random variables, and let Z = 7 + X + Y and W = 1 + Y. Find ρ(Z, W).
Let X ∼ Uniform(1, 3) and Y |X ∼ Exponential(X). Find Cov(X, Y).
Let X and Y be two independent N(0, 1) random variables, and let Z = 1 + X + XY² and W = 1 + X. Find Cov(Z, W).
Let X and Y be two random variables. Suppose that σ²X = 4 and σ²Y = 9. If we know that the two random variables Z = 2X − Y and W = X + Y are independent, find Cov(X, Y) and ρ(X, Y).
Let X1, X2, ⋯, Xn be i.i.d. random variables, where Xi ∼ Bernoulli(p). Define Y1 = X1X2, Y2 = X2X3, ⋯, Yn−1 = Xn−1Xn, Yn = XnX1. If Y = Y1 + Y2 + ⋯ + Yn, find 1. E[Y], 2. Var(Y).
Let X and Y be jointly normal random variables with parameters μX = 2, σ²X = 4, μY = 1, σ²Y = 9, and ρ = −1/2. a. Find E[Y|X = 3]. b. Find Var(Y|X = 2). c. Find P(X + 2Y ≤ 5 | X + Y = 3).
Let X and Y be jointly normal random variables with parameters μX, σ²X, μY, σ²Y, and ρ. Find the conditional distribution of Y given X = x.
Let X ∼ Exponential(λ). Find the MGF of X, MX(s), and all of its moments, E[X^k].
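A sketch checking the expected answer numerically (the standard result is MX(s) = λ/(λ − s) for s < λ, hence E[X^k] = k!/λ^k): the choice λ = 2 is arbitrary, not from the text.

```python
import random

# For X ~ Exponential(lam), E[X^k] = k!/lam^k; check k = 1 and k = 2.
random.seed(6)
lam = 2.0
n = 400_000
samples = [random.expovariate(lam) for _ in range(n)]
m1 = sum(samples) / n                 # expect 1!/2   = 0.5
m2 = sum(x * x for x in samples) / n  # expect 2!/2^2 = 0.5
```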
In this problem, our goal is to find the variance of the hypergeometric distribution. Let's remember the random experiment behind the hypergeometric distribution. You have a bag that contains b blue
Let X ∼ Poisson(λ). Find the MGF of X, MX(s).
If X ∼ Geometric(p), find the MGF of X.
For a random variable X, we know that MX(s) = 2/(2 − s) for s ∈ (−2, 2). Find the distribution of X.
If MX(s) = 1/3 + (1/3)e^s + (1/3)e^(2s), find EX and Var(X).
Using MGFs show that if X ∼ N(μX, σ²X) and Y ∼ N(μY, σ²Y) are independent, then X + Y ∼ N(μX + μY, σ²X + σ²Y).
Using MGFs prove that if X ∼ Binomial(m, p) and Y ∼ Binomial(n, p) are independent, then X + Y ∼ Binomial(m + n, p).
Let X be a continuous random variable with the following PDF: fX(x) = (λ/2)e^(−λ|x|). Find the MGF of X, MX(s).
If X ∼ Exponential(λ), show that the characteristic function is φX(ω) = λ/(λ − jω).
Let X, Y, and Z be three independent N(1, 1) random variables. Find E[XY |Y +Z = 1].
For each of the following random variables, find the MGF. a. X is a discrete random variable with PMF PX(k) supported on k = 1, 2. b. Y is a Uniform(0, 1) random variable.
Suppose that X, Y, and Z are three independent random variables. If X, Y ∼ N(0, 1) and Z ∼ Exponential(1), find 1. E[XY | Z = 1], 2. E[X²Y²Z² | Z = 1].
Let X, Y, and Z be three jointly continuous random variables with joint PDF fXYZ(x, y, z). 1. Find the joint PDF of X and Y. 2. Find the marginal PDF of X. 3. Find the conditional PDF fXY|Z(x, y|z). 4. Are X and Y independent?
Let X, Y, and Z be three jointly continuous random variables with joint PDF fXYZ(x, y, z) = c(x + 2y + 3z) for 0 ≤ x, y, z ≤ 1, and 0 otherwise. 1. Find the constant c. 2. Find the marginal PDF of X.
Let X and Y be jointly normal random variables with parameters μX = 1, σ²X = 1, μY = 0, σ²Y = 4, and correlation coefficient ρXY. a. Find P(2X + Y ≤ 3). b. Find Cov(X + Y, 2X − Y). c. Find P(Y > 1 | X = 2).
Remember that a continuous random variable X is said to have a Gamma distribution with parameters α > 0 and λ > 0, shown as X ∼ Gamma(α, λ), if its PDF is given by fX(x) = λ^α x^(α−1) e^(−λx) / Γ(α) for x > 0, and 0 otherwise. If X ∼ Gamma(α, λ),
Using MGFs, show that if Y = X1 + X2 + ⋯ + Xn, where the Xi's are independent Exponential(λ) random variables, then Y ∼ Gamma(n, λ).
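A simulation sketch of the claim (multiplying the n identical exponential MGFs gives (λ/(λ − s))^n, the Gamma(n, λ) MGF): the code checks the Gamma mean n/λ and variance n/λ². The values n = 4, λ = 1.5 are arbitrary illustrative choices.

```python
import random

# Sum of 4 i.i.d. Exponential(1.5) variables should be Gamma(4, 1.5).
random.seed(7)
lam, k, trials = 1.5, 4, 100_000
sums = [sum(random.expovariate(lam) for _ in range(k))
        for _ in range(trials)]
mean = sum(sums) / trials                          # expect 4/1.5
var = sum((s - mean) ** 2 for s in sums) / trials  # expect 4/1.5^2
```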
For a random vector X, show that CX = RX − E[X]E[X]^T.
A sensor network consists of n sensors that are distributed randomly on the unit square. Each node's location is uniform over the unit square and is independent of the locations of the other nodes. A
Let Bn be the event that a graph randomly generated according to the G(n, p) model has at least one isolated node. Show that P(Bn) ≤ n(1 − p)^(n−1), and conclude that for any fixed p ∈ (0, 1), P(Bn) → 0 as n → ∞.
A system consists of 4 components in a series, so the system works properly if all of the components are functional. In other words, the system fails if and only if at least one of its components
Let X = [X1, X2, X3]^T be a normal random vector with the following mean and covariance. Find the MGF of X.
Let X and Y be two jointly normal random variables with X ∼ N(μX, σ²X), Y ∼ N(μY, σ²Y), and ρ(X, Y) = ρ. Show that the above PDF formula for the PDF of [X Y]^T is the same as fX,Y(x, y) given in
Let X be a normal random vector with the following mean and covariance matrices. Let also Y be as defined. 1. Find P(X2 > 0). 2. Find the expected value vector of Y, mY = E[Y]. 3. Find the covariance matrix of Y, CY. 4. Find
Let X be an n-dimensional random vector. Let A be a fixed (non-random) invertible n by n matrix, and b be a fixed n-dimensional vector. Define the random vector Y as Y = AX + b. Find the PDF of Y in terms of the PDF of X.
Let X ∼ Uniform(0, 1). Suppose that given X = x, Y and Z are independent, with Y|X = x ∼ Uniform(0, x) and Z|X = x ∼ Uniform(0, 2x). Define the random vector U = [Y, Z]^T. 1. Find the PDFs of Y and Z. 2.
Let X be an n-dimensional random vector and the random vector Y be defined as Y = AX + b, where A is a fixed m by n matrix and b is a fixed m-dimensional vector. Show that E[Y] = A·E[X] + b and CY = A·CX·A^T.
Let X ∼ Geometric (p). Using Markov's inequality find an upper bound for P(X ≥ a), for a positive integer a. Compare the upper bound with the real value of P(X ≥ a).
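A worked numeric sketch of the comparison this problem asks for: with the Geometric(p) convention on {1, 2, …}, P(X ≥ a) = (1 − p)^(a−1) and E[X] = 1/p, so Markov gives P(X ≥ a) ≤ 1/(pa). The values p = 0.3, a = 5 are arbitrary illustrative choices.

```python
# Compare the exact tail with Markov's bound for Geometric(p).
p, a = 0.3, 5
exact = (1 - p) ** (a - 1)  # 0.7^4 = 0.2401
markov = 1 / (p * a)        # 1/1.5 ~ 0.667, much looser
```

The bound holds but is loose here; Markov only uses the mean, so the gap widens as a grows while the exact tail decays geometrically.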
A bank teller serves customers standing in the queue one by one. Suppose that the service time Xi for customer i has mean EXi = 2 (minutes) and Var(Xi) = 1. We assume that service times for different
The number of accidents in a certain city is modeled by a Poisson random variable with an average rate of 10 accidents per day. Suppose that the number of accidents on different days are independent.
In a communication system each data packet consists of 1000 bits. Due to the noise, each bit may be received in error with probability 0.1. It is assumed bit errors occur independently. Find the
Consider the following random experiment: A fair coin is tossed repeatedly forever. Here, the sample space S consists of all possible sequences of heads and tails. We define the sequence of random
In a communication system, each codeword consists of 1000 bits. Due to the noise, each bit may be received in error with probability 0.1. It is assumed bit errors occur independently. Since error
If X1, X2, X3, ⋯ is a sequence of i.i.d. random variables with CDF FX(x), then Xn converges in distribution to X. This is because FXn(x) = FX(x) for all x.
The amount of time needed for a certain machine to process a job is a random variable with mean EXi = 10 minutes and Var(Xi) = 2 minutes². The times needed for different jobs are independent from
You have a fair coin. You toss the coin n times. Let X be the portion of times that you observe heads. How large n has to be so that you are 95% sure that 0.45 ≤ X ≤ 0.55? In other words, how
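A sketch of the sample-size arithmetic this problem leads to (under the usual two approaches, which may differ from the text's intended one): with X the fraction of heads, Var(X) = 1/(4n). The CLT requires 0.05 ≥ 1.96·0.5/√n, while Chebyshev requires (1/(4n))/0.05² ≤ 0.05.

```python
import math

# Sample size for P(0.45 <= X <= 0.55) >= 0.95, two ways.
n_clt = math.ceil((1.96 * 0.5 / 0.05) ** 2)    # CLT approximation
n_cheb = math.ceil(0.25 / (0.05 ** 2 * 0.05))  # Chebyshev (guaranteed)
```

The CLT gives roughly n ≥ 385, while the distribution-free Chebyshev bound demands n ≥ 2000; the gap is the usual price of not assuming approximate normality.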
Let X2, X3, X4, ⋯ be a sequence of random variables such that FXn(x) = 1 − (1 − 1/n)^(nx) for x > 0, and 0 otherwise. Show that Xn converges in distribution to Exponential(1).
An engineer is measuring a quantity q. It is assumed that there is a random error in each measurement, so the engineer will take n measurements and reports the average of the measurements as the
Let X1, X2, X3, ⋯ be a sequence of random variables such that Xn ∼ Binomial(n, λ/n) for n ∈ ℕ, n > λ, where λ > 0 is a constant. Show that Xn converges in distribution to Poisson(λ).
Let X2, X3, X4, ⋯ be a sequence of random variables such that FXn(x) = e^(n(x−1)) / (1 + e^(n(x−1))) for x > 0, and 0 otherwise. Show that Xn converges in distribution to X = 1.
Let X2, X3, X4, ⋯ be a sequence of random variables such that FXn(x) = 0 for x < 0, FXn(x) = (e^(nx) + x·e^n)/(e^(nx) + e^n) for 0 ≤ x < 1, and FXn(x) = 1 for x ≥ 1. Show that Xn converges in distribution to Uniform(0, 1).
Let Xn ∼ Exponential(n). Show that Xn →p 0. That is, the sequence X1, X2, X3, ⋯ converges in probability to the zero random variable X.
Let X be a random variable, and Xn = X + Yn, where E[Yn] = 1/n and Var(Yn) = σ²/n, where σ > 0 is a constant. Show that Xn →p X.
Consider a sequence {Xn, n = 1, 2, 3, ⋯} such that Xn = n with probability 1/n², and 0 with probability 1 − 1/n². Show that Xn converges to 0 almost surely.
Let Xn ∼ Uniform(0, 1/n). Show that Xn →Lr 0, for any r ≥ 1.
Consider a sequence {Xn, n = 1, 2, 3, ⋯} such that Xn = n² with probability 1/n, and 0 with probability 1 − 1/n. Show that: a. Xn →p 0. b. Xn does not converge in the rth mean for any r ≥ 1.
Showing 1300 - 1400 of 1482