Probability and Stochastic Processes, 1st Edition, Ionut Florescu - Solutions
12.10 Consider the gene mutation example (Example 12.8 on page 378). Show that the process forms a Markov chain, and write its probability transition matrix. Classify the states in the chain as recurrent or transient. Does the chain have a stationary distribution? Is it unique?
12.9 Consider the gambler’s ruin problem where the probability of winning is p = 1/2 and the initial wealth is $100. Suppose the gambler stops if he reaches $200 or if he goes bankrupt. Let X_n denote the total wealth after n games. Show that X_n is a Markov chain, and write its probability transition matrix.
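A minimal sketch of the transition matrix this exercise asks for, assuming $1 bets as in the classical gambler's ruin (the bet size is not visible in the statement above); the code is hypothetical, not from the text:

```python
import numpy as np

# Gambler's-ruin chain on states {0, 1, ..., 200} with p = 1/2:
# 0 (bankruptcy) and 200 (target wealth) are absorbing.
p = 0.5
N = 200
P = np.zeros((N + 1, N + 1))
P[0, 0] = 1.0          # bankrupt: stays bankrupt
P[N, N] = 1.0          # reached $200: stops playing
for i in range(1, N):
    P[i, i + 1] = p        # win one game: wealth increases by $1
    P[i, i - 1] = 1 - p    # lose one game: wealth decreases by $1

assert np.allclose(P.sum(axis=1), 1.0)   # every row is a probability vector
```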
12.8 Consider a Markov chain with the following transition probability matrix:
P = \begin{pmatrix} 0 & 0 & 1/2 & 1/2 \\ 0 & 0 & 1/2 & 1/2 \\ 1/2 & 1/2 & 0 & 0 \\ 1/2 & 1/2 & 0 & 0 \end{pmatrix}
Show that the chain is ergodic but not irreducible.
12.7 Using the matrix exemplified in Subsection 12.6.2, calculate the following: f_{68}, f_{76}, E[τ_4 | X_0 = 6], f_{81}, f_{74}.
12.6 Let X_n be a birth-and-catastrophe chain. Its transition matrix is
P = \begin{pmatrix} q_0 & p_0 & 0 & 0 & \cdots \\ q_1 & 0 & p_1 & 0 & \cdots \\ q_2 & 0 & 0 & p_2 & \cdots \\ \vdots & & & & \ddots \end{pmatrix}
Let X_0 = 0. Assume p_k > 0 for all k. Define T_b = inf{n : X_n = b} as the time of first entry in b. Calculate E_0[T_b] = E[T_b | X_0 = 0].
12.5 Give an example of a Markov chain with more than one stationary distribution. Draw its associated graph.
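One minimal example of the kind of chain the exercise asks for (a sketch, not taken from the text): a two-state chain in which both states are absorbing,
$$P = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad \pi P = \pi \ \text{ for every } \ \pi = (\alpha,\ 1 - \alpha),\ \alpha \in [0, 1],$$
so every distribution on the two states is stationary; its graph is just two vertices, each with a self-loop.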
12.4 Draw the “arrows diagram” of a communicating class with period d, for d = 2, 3, 4, 5. Generalize.
12.3 Show that Remark 12.4 is true.
12.2 Give the proof of the Markov property (Theorem 12.2 on page 372).
12.1 For each example in Section 12.1.2, write down the transition probability matrix.
11.17 A population begins with a single individual. In each generation, each individual dies with probability 1/2 or divides into two identical individuals with probability 1/2. Let N(n) denote the number of individuals in the population in the nth generation. Find the mean and variance of N(n).
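Standard branching-process formulas (offspring mean 1 and offspring variance 1) give E[N(n)] = 1 and Var[N(n)] = n; the Monte Carlo check below is a hypothetical sketch, not the book's solution:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_population(n_gen, n_runs=100_000):
    # Simulate N(n_gen) in n_runs independent runs, starting from N(0) = 1.
    pop = np.ones(n_runs, dtype=np.int64)
    for _ in range(n_gen):
        # each individual independently dies (0 offspring) or splits into 2,
        # each with probability 1/2
        pop = 2 * rng.binomial(pop, 0.5)
    return pop

for n in (1, 2, 3, 4):
    N = simulate_population(n)
    print(n, N.mean(), N.var())   # should be close to 1 and n, respectively
```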
11.16 Suppose a branching process starts with one individual. Suppose each individual has exactly three children, each of whom survives until reproductive age with
11.15 Suppose two independent renewal processes are denoted with N_1(t) and N_2(t). Assume that each inter-arrival time for both processes has the same distribution F(x) and that a density f(x) exists.
a) Is the process N_1(t) + N_2(t) a renewal process? Explain.
b) Let Y(t) denote the time until the first
11.14 Let N(t) be a renewal process and let m(t) = E[N(t)] be the renewal function. Suppose that the inter-arrival times are strictly positive, that is, P(X ≤ 0) = F_X(0) = 0.
a) For all x > 0 and t > x, show that E[N(t) | X_1 = x] = E[N(t − x)] + 1.
b) Use part a) to derive the renewal equation for m(t).
11.13 A particular driver commutes between Hoboken (H) and Atlantic City (AC). Every time the driver goes from H to AC, he drives using cruise control at a fixed speed, which is uniformly distributed between 55 and 65 miles per hour (mph). Every time he drives back from AC to H, he drives at a speed
11.12 Suppose a light bulb in a renewal process has a lifetime distributed with density f(x) = 3/(x + 1)^4, for x ≥ 0. Let Y(t) be the residual lifetime for some large t. Use the results in this chapter to calculate the expected value and variance for the residual life Y(t).
11.11 Suppose I want to cross a road with no stopping sign. On that road, cars pass according to a Poisson process with rate λ. I need at least a τ-second gap to cross the road safely. If I observe a gap of at least τ seconds, I start crossing immediately. Let T be the amount of time I need to wait
11.10 Consider an insurance company which receives claims according to a Poisson process with rate λ = 400/year. Suppose the sizes of the claims are random variables R_n, distributed exponentially with mean $1000. Calculate the expected total amount of the claims during a
11.9 Suppose a high-performance computer (HPC) processes jobs which arrive according to a Poisson process with rate λ = 6 per day. We assume that all jobs require some time to complete and this time is uniformly distributed between 2 and 3 hours. The HPC will process the jobs in the order they are
11.8 A certain component of a machine has a lifetime distribution F_1(x). When this component fails, there is a probability p that it is instantaneously replaced (0 < p < 1)
11.7 Suppose we have a type of light bulb with lifetime uniformly distributed on the interval (0, 10) days. You walk into a room that has this type of light bulb after 1 year. We know that maintenance replaced light bulbs the moment they burned out, and the light bulbs are distributed
11.6 Denote by L(t) = X_{N(t)+1} the random variable measuring the current lifetime at t. Suppose the renewal process has discrete lifetimes. Let n ≥ 1, and set z(n) = P(L_t = n). Show that z(n) satisfies the renewal equation (11.5) with
b_n = f(t) for n = 0, 1, 2, ..., t − 1, and b_n = 0 for n ≥ t.
11.5 Let X be an exponential random variable with mean 1. For any λ > 0, show that the random variable X/λ is an exponential random variable with rate λ.
11.4 Let the lifetimes X_i of a renewal process be i.i.d. Geometric(p). Give the distribution of the age of the current item in use at t, A(t); the residual lifetime of the item, Y(t); and the total age of the item in use at t, X_{N(t)+1}.
11.3 Circle all statements that are true, where N(t) is a renewal process and S_n is the time of the nth renewal:
a) N(t) < n if and only if S_n > t,
11.2 Show that for a renewal process S_{N(t)−1} ≤ t ≤ S_{N(t)}, as long as N(t) ≥ 1.
11.1 For a branching process, what is the probability that X(t) = 0 eventually (the population dies out)? Show the following:
For m < 1, P(population dies out) = 1.
If m = 1, then P(population dies out) = 1, except when the number of offspring is exactly 1.
If m > 1, P(population dies out) > 0. Note that these
10.19 Simulate the following inventory problem. Assume that for a year you administer a maritime platform, an oil facility off the coast of Nigeria. You have oil tankers arriving according to a Poisson process with rate 0.3/day. Each oil tanker has a storage capacity that varies depending on the
10.18 Using a software package, simulate a one-dimensional Poisson process with rate 2 events/min. Using your simulation, estimate the following probabilities: P{N[2,4] = 4}, P{S_3 ∈ [3, 5]}, where N[2,4] denotes the number of events in the time interval [2, 4] minutes, and S_3 is the time of the third event.
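A sketch of one way to do this simulation (hypothetical code, also usable for the similar Problem 10.14 below): generate exponential inter-arrival times with mean 1/2 minute, accumulate them into event times, and estimate the two probabilities by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(1)
rate, runs, n_gaps = 2.0, 100_000, 60     # 60 gaps comfortably cover [0, 5] min

gaps = rng.exponential(1.0 / rate, size=(runs, n_gaps))
times = np.cumsum(gaps, axis=1)           # event times S_1, S_2, ... per run

n_in_24 = np.sum((times >= 2) & (times <= 4), axis=1)   # N[2,4] per run
s3 = times[:, 2]                                         # S_3 per run

print("P{N[2,4] = 4}  ≈", np.mean(n_in_24 == 4))   # exact: e^{-4} 4^4 / 4! ≈ 0.195
print("P{S_3 in [3,5]} ≈", np.mean((s3 >= 3) & (s3 <= 5)))
```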
10.17 Using a software package, simulate a Poisson process on the plane suitable for the previous two problems. Use λ = 2. With the help of this simulation, answer the following questions:
a) Estimate the probability that the circle of radius 1 centered at the origin of the plane contains two
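A sketch of a planar simulation (hypothetical code): generate a Poisson(λ · area) number of points uniformly in a window that contains the unit disk, then count the points inside the disk, which is what the truncated part a) appears to ask about.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, half_side, runs = 2.0, 2.0, 100_000
area = (2 * half_side) ** 2               # window [-2, 2] x [-2, 2]

hits = np.empty(runs, dtype=int)
for r in range(runs):
    n = rng.poisson(lam * area)                            # points in the window
    pts = rng.uniform(-half_side, half_side, size=(n, 2))  # uniform given the count
    hits[r] = np.sum(pts[:, 0] ** 2 + pts[:, 1] ** 2 <= 1.0)

# the count in the unit disk is Poisson(lam * pi), so this estimate can be
# checked against the exact pmf
print("P(exactly 2 points in the unit circle) ≈", np.mean(hits == 2))
```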
10.16 In the same setting as in the previous problem, let R_i denote the distance from the origin to the ith closest event to it. Prove that Y_i = πR_i² − πR_{i−1}², with i ≥ 1, are independent random variables exponentially distributed with mean 1/λ.
10.15 A two-dimensional Poisson process is a general Poisson process with state space R², similar to the process presented in Example 10.3. More specifically, for any region of the plane A, the number of events in A, N(A), is a Poisson random variable with mean λ|A|, where |A| denotes the area of the
10.14 Using a software package, simulate a Poisson process with rate 2 events/min. Using your simulation, estimate the probabilities P{N[2,4] = 4} and P{S_3 ∈ [3, 5]}, where N[2,4] denotes the number of events in the time interval [2, 4] minutes, and S_3 is the time of the third event. Calculate what
10.13 Let X_t, t ≥ 0, be a Poisson process of rate λ. Let T_0 = 0, and let T_i be the time of the ith observation (jump of X), if i ≥ 1. Let N = inf{k ≥ 1 : T_k − T_{k−1} > 1}. Find E[T_N], E[N], and E(T_N | N = 8).
10.12 Suppose two individuals A and B both require a heart transplant. The remaining times to live, if they do not receive transplants, are exponentially distributed with means μ_A and μ_B, respectively. New hearts become available through a transplant program according to a Poisson process with rate λ.
10.11 A gas station has only one pump. The time to fill the tank is distributed as Exp(λ). However, after filling the tank, customers pay by walking inside the gas station while leaving the car at the pump, thus blocking all other cars from filling their tanks. The service time at the counter is
10.10 Suppose that during the thunderstorm in the previous problem we monitor the device using a computer connected to a power source. Suppose the rate is λ = 3 per hour. Each time lightning hits, it would damage the computer, so we have created a device to protect it. However, the
10.9 A device recording lightning intensity is attached to a pole during a thunderstorm. Assume that during that time the bolts of lightning occur at that pole according to a Poisson process with rate λ. However, each bolt registered by the machine renders the device inoperative for a fixed length
10.8 A critical component of the next space shuttle to Europa (the Jupiter satellite) has an operating lifetime that is exponentially distributed with mean 1 year (ship time). As soon as a component fails, it is replaced by a new one having statistically identical properties. It is known that the
10.7 Buses arrive at a certain stop according to a Poisson process with rate λ. If you take the bus from that stop, then it takes a time S, measured from the time you enter the bus, to reach home. If you walk from that bus stop, then it takes a time T to reach home. Suppose that the rule you decide
10.6 Suppose that buses arrive at a particular stop according to a Poisson process with rate 6 per hour. I start waiting for the bus to arrive at 10:00.
a) What is the probability that no bus arrives in the next 20 min?
b) How many buses are expected to arrive in the next 90 min?
c) What is the
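For the two parts visible above, the standard Poisson-process computations give (a sketch, not the book's solution):
$$P(\text{no bus in the next 20 min}) = e^{-6 \cdot \frac{1}{3}} = e^{-2} \approx 0.135, \qquad E[\text{buses in 90 min}] = 6 \cdot 1.5 = 9.$$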
10.5 Suppose vehicles arrive at an intersection according to a Poisson process with rate λ = 10 per minute.
a) Let T_i denote the time between vehicle i − 1 and vehicle i. What is the distribution of T_1?
b) Find the probability that the time between vehicles 5 and 6, T_6, is less than 30 s.
c) Given
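For the two parts visible above, a sketch of the standard answers: the inter-arrival times of a Poisson process are exponential, so
$$T_1 \sim \mathrm{Exp}(\lambda = 10 \text{ per min}), \qquad P(T_6 < 30\text{ s}) = 1 - e^{-10 \cdot 0.5} = 1 - e^{-5} \approx 0.993.$$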
10.4 Prove Corollary 10.7 on page 315.
10.3 Let N_t be a Poisson process with rate λ. Let S_n be the time of the nth event. Find:
a) E[S_4],
b) E[S_4 | N_1 = 2],
c) E[N_4 − N_2 | N_1 = 3].
10.2 Let N_t be a Poisson process. Let S_n be the time of the nth event. Show by double inclusion that {ω : S_n(ω) ≤ t} = {ω : N_t(ω) ≥ n}. Recall that ω is a path for stochastic processes.
10.1 Prove Exercise 8 on page 309. To this end, use Stirling’s approximation n! ≈ √(2πn) n^n e^{−n}.
9.14 Combine Problems 9.11 and 9.13 to conclude that L_n / log_2 n → 1 a.s. Therefore the length of the maximum sequence of Heads is approximately equal to log_2 n when n, the number of tosses, is large enough.
9.13 Apply the first Borel–Cantelli lemma to the events A_n defined in Problem 9.12, followed by ε ↓ 0, to conclude that lim inf_{n→∞} L_n / log_2 n ≥ 1 a.s.
9.12 Fix ε > 0. Let A_n = {L_n < k_n} for k_n = (1 − ε) log_2 n. Explain why A_n ⊆ ∩_{i=1}^{m_n} B_i^c, where m_n = [n/k_n] (integer part) and B_i = {X_{(i−1)k_n+1} = ... = X_{i k_n} = 1} are independent events. Deduce that P(A_n) ≤ P(B_i^c)^{m_n} ≤ exp(−n^ε/(2 log_2 n)), for all n large enough.
9.11 Apply the first Borel–Cantelli Lemma 1.28 to the events A_n = {l_n > (1 + ε) log_2 n}. Conclude that for each ε > 0, with probability 1, l_n ≤ (1 + ε) log_2 n for all n large enough. Take a countable sequence ε_k ↓ 0, and then conclude that lim sup_{n→∞} L_n / log_2 n ≤ 1 a.s.
9.10 Explain why P(l_m = i) = 2^{−(i+1)}, for i = 0, 1, 2, ... and any m.
9.9 Let S_n = Σ_{i=1}^n X_i, where the X_i's are i.i.d. Uniform(0, 1) random variables. Define the stochastic process N_t = sup{n ∈ N | S_n ≤ t}.
9.8 Let X_i be distributed as a normal with mean μ_i and variance σ², where μ_1
9.7 Let X_i be distributed as a normal with mean μ_i and variance σ², where μ_1
9.6 Prove part 9) of Proposition 9.9.
9.5 Prove parts 6) and 7) of Proposition 9.9 by applying the Central Limit Theorem.
9.4 Show the equality of sets in part 5) of Proposition 9.9 by double inclusion.
9.3 Give a general proof for parts 3) and 4) in Proposition 9.9 for any n, k ∈ N.
9.2 Using the notation in Section 9.2, give the distribution of N_3 and the joint distribution of (N_2, N_3) when p = 1/3. Furthermore, calculate E(N_3) and E(N_n) for some arbitrary n.
9.1 Prove that the Xi’s in Proposition 9.9 are in fact independent.
8.23 A pulsar is a highly magnetized, rotating neutron star that emits a beam of electromagnetic radiation. Such stars are extremely useful because they can be observed by any civilization, and they are distinct enough in space to provide spatial coordinates by referring to the relative distances
8.22 Let X_1, ..., X_n be a random sample from a normal distribution N(μ, σ²) where both parameters are unknown. Obtain the likelihood ratio for testing H_0 : σ² = σ_0² versus H_a : σ² ≠ σ_0².
8.21 Let X_1, ..., X_n be a random sample from a normal distribution with unknown variance σ². Obtain the likelihood ratio for testing H_0 : μ = μ_0 versus H_a : μ ≠ μ_0. Hint: Note that in this case the parameter space Θ = {(μ, σ²) | μ ∈ R, σ² > 0} is two-dimensional, while Θ_0 = {(μ, σ²) | μ =
8.20 Let X_1, ..., X_n be a random sample from a normal distribution with known variance σ². Obtain the likelihood ratio for testing H_0 : μ = μ_0 versus H_a : μ ≠ μ_0.
8.19 Let X_1, ..., X_n be a random sample from a Poisson distribution with parameter denoted by θ. Obtain the likelihood ratio for testing H_0 : θ = θ_0 versus H_a : θ ≠ θ_0.
8.18 Refer to the previous problem. Suppose λ = 1. Calculate the MLE in 20 repeated simulations and estimate its variance. Compare with the variance obtained by using Fisher information.
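A sketch of how this simulation might look (hypothetical code), assuming from Problem 8.17 that the data are i.i.d. double exponential (Laplace) with density (λ/2)e^{−λ|x|}, for which the MLE is λ̂ = n / Σ|x_i| and the Fisher information is I(λ) = 1/λ²:

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n, reps = 1.0, 30, 20

mles = []
for _ in range(reps):
    x = rng.laplace(loc=0.0, scale=1.0 / lam, size=n)   # Laplace scale = 1/lambda
    mles.append(n / np.sum(np.abs(x)))                   # MLE for this sample

print("empirical variance of the MLE:", np.var(mles, ddof=1))
print("Fisher-information approximation:", lam**2 / n)   # CRLB: lambda^2 / n
```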
8.17 Calculate the MLE for the parameter λ if X_1, ..., X_30 are i.i.d. double exponentials, i.e., they have the density f(x|λ) = (1/2)λ e^{−λx} if x ≥ 0, and (1/2)λ e^{λx} if x < 0.
8.16 Suppose X_i are independent Poisson(λi), for 1 ≤ i ≤ n. Calculate λ̂, the MLE of λ. What is the limiting distribution of λ̂?
8.15 Find the MLE for λ using the observations X_i, independent Poisson(λρ_i), where the ρ_i are known and 1 ≤ i ≤ n.
8.14 Two independent readers proofread this manuscript, which contains N ≥ 0 errors, with N unknown. Reader A found 50 errors, while reader B found 77 errors. Thirty-six of those errors were found by both readers. What is the MLE for N, the total number of errors in the manuscript?
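A sketch of the usual capture–recapture reasoning (not necessarily the book's derivation): if each reader finds errors independently, the expected overlap is N · (50/N) · (77/N) = 50 · 77 / N, and matching this to the observed overlap of 36 gives
$$\hat N \approx \frac{50 \cdot 77}{36} \approx 106.9,$$
so the MLE should be in the neighborhood of 106; the exact integer maximizer still has to be checked against the likelihood.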
8.13 Let X_1, ..., X_m be distributed as N(μ, σ_1²) and Y_1, ..., Y_n be distributed as N(μ, σ_2²), with all random variables independent. Calculate the MLE of (μ, σ_1², σ_2²).
8.12 Let X_1, ..., X_n be i.i.d. with density f(x|θ_1, θ_2) = C e^{−θ_1 x} if x ≥ 0, and C e^{θ_2 x} if x < 0.
a) Calculate the constant C so that the pdf is a proper pdf.
b) Calculate the MLEs of θ_1 and θ_2.
8.11 Let X_1, ..., X_n be i.i.d. U(0, θ). Calculate the MLE of θ and show that n(θ̂ − θ) converges in distribution to Exp(θ).
8.10 Let X_1, ..., X_n be i.i.d. N(μ, σ²), where we know that μ is an integer. Calculate the MLE of μ.
8.9 Let X_1, ..., X_n be i.i.d. N(μ, 1), where we know that μ ≥ 0. Calculate μ̂, the MLE of the parameter μ. Show that the limiting distribution of the MLE is
P(√n μ̂ ≤ x) → 0 if x < 0, and P(√n μ̂ ≤ x) → ∫_{−∞}^{x} (1/√(2π)) e^{−t²/2} dt if x > 0.
What should the limiting distribution be at x = 0 to be a proper
8.8 Let X_1, ..., X_n be exponential with parameter λ. Calculate the MLE of λ.
8.7 In this problem we refer to Example 8.6. Suppose the random variables (X_1, Y_1), (X_2, Y_2), ..., (X_n, Y_n) are all independent and that for each i we have X_i, Y_i distributed as N(μ_i, σ²).
a) Set ν = 1/σ² and express the likelihood function in terms of the parameters μ_1, ..., μ_n, and ν.
b) Show that
8.6 Suppose that for a sample x_1, ..., x_n coming from a distribution with density f(x|θ) we can construct the maximum likelihood function L(θ), with θ ∈ R. Suppose further that we can calculate θ̂, the MLE, that is, L(θ̂) ≥ L(θ) for all θ ∈ R. Let h : R → Λ, where Λ denotes the
8.5 In Formulas (8.1) we gave the extremum point for the likelihood of a normal sample. Show that this point is indeed a maximum, by calculating the Hessian matrix and showing that it is negative definite.
8.4 Suppose that in a trial we obtain the following numbers, representing a random sample of 10 bricks selected from a production line: x_1 = 6.161520, x_2 = 4.892259, x_3 = 6.342859, x_4 = 4.673576, x_5 = 4.936554, x_6 = 3.966134, x_7 = 4.351286, x_8 = 6.010079, x_9 = 4.944755, x_10 = 5.691312. We assume that
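Assuming the truncated sentence goes on to posit a normal model for these measurements (consistent with the MLE formulas in (8.1)), a sketch of the computation:

```python
import numpy as np

x = np.array([6.161520, 4.892259, 6.342859, 4.673576, 4.936554,
              3.966134, 4.351286, 6.010079, 4.944755, 5.691312])

mu_hat = x.mean()                          # MLE of mu: the sample mean
sigma2_hat = np.mean((x - mu_hat) ** 2)    # MLE of sigma^2: divides by n, not n - 1
print(mu_hat, sigma2_hat)
```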
8.3 Suppose we are given a sequence of i.i.d. random variables X_1, ..., X_n distributed as Bernoulli(p) random variables.
a) Calculate p̂_n, the MLE for p based on these n random variables.
b) Calculate the limiting distribution of p̂_n.
8.2 Let X_1, ..., X_n be a random sample from an N(μ, σ²) distribution.
a) Show that the random variable (X̄ − μ)/(σ/√n) is distributed as a N(0, 1).
b) Show that the random variable (1/σ²) S² = (1/σ²) · (1/(n − 1)) Σ_{i=1}^n (X_i − X̄)² has the same distribution as (1/(n − 1)) χ²_{n−1}, where χ²_{n−1} is a
8.1 Suppose Z_1, Z_2, ..., Z_n is a random sample from a Cauchy distribution with parameters 0 and 1 (denoted Cauchy(0, 1)): f(x | (0, 1)) = 1/(π(1 + x²)). Show that Σ_{i=1}^n Z_i has a Cauchy distribution with parameters 0 and n, and also that the sample average Z̄ = (1/n) Σ_{i=1}^n Z_i is Cauchy(0, 1). For completion,
7.22 In an opinion poll, it is assumed that an unknown proportion p of people are in favor of a proposed new law and a proportion 1 − p are against it. A sample of n people is taken to estimate p. The sample proportion p̂ of people in favor of the law is taken as an estimate of p. Using the
7.21 A fair coin is flipped 400 times. Determine the number x such that the probability that the number of heads is between 200 − x and 200 + x is approximately 0.80.
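A sketch via the normal approximation (not the book's solution): the number of heads H is approximately N(200, 100), so
$$P(200 - x \le H \le 200 + x) \approx 2\Phi(x/10) - 1 = 0.80 \;\Longrightarrow\; x/10 \approx z_{0.90} \approx 1.28 \;\Longrightarrow\; x \approx 13.$$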
7.20 Let X ∼ N(100, 15). Find four numbers x_1, x_2, x_3, x_4 such that P(X
7.19 Let X, X_1, ..., X_n, ... be i.i.d. with P(X = 2^k) = 1/2^k for k = 1, 2, ....
a) Show that EX = ∞.
b) Prove that
7.18 Let X_n be i.i.d. random variables with EX_1 = 0 and assume that the X_n are bounded. That is, there exists a C > 0 such that |X_1| ≤ C a.s. Show that for every ε > 0, P(|S_n/n| ≥ ε) ≤ 2 exp(−nε²/(2C²)). Deduce that S_n/n converges in probability to zero.
7.17 Let (X_n)_{n≥1} be i.i.d. random variables. Suppose that P(X_1 = 0) < 1.
a) Show that there exists α > 0 such that P(|X_1| ≥ α) > 0.
b) Show that P(lim sup_n {|X_n| ≥ α}) = 1. Derive that P(lim sup_n X_n = 0) = 0.
7.16 Let {X_i}_{i≥1} be i.i.d. random variables. Assume that the sums S_n = Σ_{i=1}^n X_i have the property S_n/n → 0 almost surely as n → ∞. Show that E[|X_1|] < ∞ and therefore E[X_1] = 0.
7.15 Suppose X_1, X_2, ... are Poisson(λ). Find the limiting distribution of e^{X̄} and (1 − 1/n)^{n X̄}. Also, find the limiting distribution of 1/(X̄ + X̄² + X̄³).
7.13 Let X_n, n ≥ 1, be i.i.d. random variables with some distribution with finite mean μ and variance σ². For every n ≥ 1, we denote Z_n := (∏_{i=1}^n e^{X_i})^{1/n}. Find the almost sure limit of the sequence {Z_n}_{n≥1}.
7.14 Let {X_n}_{n≥1} be a sequence of i.i.d. random variables with common distribution
7.12 Let X̄ and S denote the sample mean and standard deviation of the random variables X_1, X_2, .... Find the limiting distribution of X̄/S and S/X̄ in each of the following cases:
1. The random variables are uniform on (0, 1),
2. The random variables are exponential with parameter λ, i.e., their
7.11 Suppose X_i are i.i.d. Exp(1). Find the limiting distribution of √n(X̄² − X_(1) − 1), where X_(1) is the first order statistic.
7.10 Consider {Y_n}_{n≥1} a sequence of independent identically distributed random variables with common distribution U([0, 1]) (uniform on the interval). For every n ≥ 1, we define Y_n(1) = min(Y_1, ..., Y_n) and Y_n(n) = max(Y_1, ..., Y_n), the first and last order statistics of the sequence, respectively.
7.9 Refer to Example 7.7. Let U_i be independent identically distributed random variables uniformly distributed on the interval [0, 1], i = 1, ..., n. Define U_(n) = max{U_1, ..., U_n}. Show that n(1 − U_(n)) converges in distribution to Exp(1).
7.8 Suppose X_1, X_2, ... and Y_1, Y_2, ... are two sequences of random variables, each sequence i.i.d., and suppose that the random variables X_1 and Y_1 are correlated with correlation coefficient ρ, while X_i and Y_j are independent if i ≠ j. With the usual notations X̄_n, Ȳ_n, denote S_n(X) =
7.7 Consider a sequence of i.i.d. r.v.’s Bernoulli distributed on {−1, 1}. That is, P(X_1 = 1) = p and P(X_1 = −1) = 1 − p. Denote S_n = X_1 + ... + X_n.
7.6 Let X_1, ..., X_n be independent random variables with the same distribution given by P(X_i = 0) = P(X_i = 2) = 1/4 and P(X_i = 1) = 1/2. Let S_n = X_1 + ... + X_n.
a) Find E(S_n) and Var(S_n).
b) Give a necessary and sufficient condition on n ∈ N to have P(1/2 ≤ S_n/n ≤ 3/2) ≥ 0.999.
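As a quick sanity check for part b) (a sketch, not the exact answer): E(X_i) = 1 and Var(X_i) = 1/2, so E(S_n) = n and Var(S_n) = n/2. Chebyshev's inequality gives
$$P\!\left(\left|\tfrac{S_n}{n} - 1\right| \ge \tfrac{1}{2}\right) \le \frac{n/2}{n^2 \cdot (1/4)} = \frac{2}{n},$$
so n ≥ 2000 certainly suffices; the necessary and sufficient threshold asked for requires a sharper (exact or CLT-based) computation.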