Stochastics: Introduction to Probability and Statistics, 2nd Edition, Hans-Otto Georgii - Solutions
S Discrete uniform distribution model. A lottery drum contains N lots labelled with the numbers 1, 2, ..., N. Little Bill, who is curious about the total number N of lots, uses an unwatched moment to take a lot at random, read off its number, and return it to the drum. He repeats this n
Shifted uniform distributions. Consider the product model (Rⁿ, Bⁿ, U_θ^⊗n : θ ∈ R), where U_θ is the uniform distribution on the interval [θ − 1/2, θ + 1/2]. Show that M = (1/n) Σ_{i=1}^n X_i and T = (max_{1≤i≤n} X_i + min_{1≤i≤n} X_i)/2 are unbiased estimators of θ. Hint: Use the symmetry of U_θ for θ = 0.
Forest mushrooms are examined in order to determine their radiation burden. For this purpose, n independent samples are taken, and, for each, the number of decays in a time unit is registered by means of a Geiger counter. Set up a suitable statistical model and find an unbiased estimator of the
S Ergodic theorem for Markov chains in continuous time. In the situation of Problem 6.28, suppose that E is finite and G irreducible, in that for all x, y ∈ E there are k ∈ N and x_0, ..., x_k ∈ E such that x_0 = x, x_k = y and ∏_{i=1}^k G(x_{i−1}, x_i) ≠ 0. Show the following: (a) For all x, y ∈ E there
Explosion in finite time. Without the assumption (iii) in Problem 6.28, a Markov chain with generator G does not exist in general. For instance, suppose E = N and G(x, x+1) = −G(x, x) = x² for all x ∈ N. Determine the discrete jump chain (Z_n)_{n≥0} with starting point 0 and the jump times (T_n)_{n≥1} as
S Markov chains in continuous time. Let E be countable and G = (G(x, y))_{x,y∈E} a matrix satisfying the properties (i) G(x, y) ≥ 0 for x ≠ y, (ii) a(x) := −G(x, x) = Σ_{y≠x} G(x, y), (iii) a := sup_{x∈E} a(x) < ∞. We construct a Markov process (X_t)_{t≥0} that 'jumps with rate G(x, y) from x to y' by using the stochastic
Busy period of a queue viewed as a branching process. Recall Example (6.32) and the random variables Xn and Zn defined there. Interpret the queue as a population model, by interpreting the customers newly arriving at time n as the children of the customer waiting at the front of the queue; a
Excursions from a recurrent state. Consider a Markov chain with a countable state space E and transition matrix Π that starts in a recurrent state x ∈ E. Let T_0 = 0 and, for k ≥ 1, let T_k = inf{n > T_{k−1} : X_n = x} be the time of the kth return to x and L_k = T_k − T_{k−1} the length of the kth
Generalise Theorem (6.30) as follows. For x, y ∈ E, let F_1(x, y) = P_x(τ_y < ∞) be the probability that y can eventually be reached from x, N_y = Σ_{n≥1} 1_{X_n = y} the number of visits to y (from time 1 onwards), F_∞(x, y) = P_x(N_y = ∞) the probability of infinitely many visits, and G(x, y) = δ_{xy} + E_x(N_y)
A migration model. Consider the following simple model of an animal population in an open habitat. Each animal living there leaves the habitat, independently of all others, with probability 1 − p, and it stays with probability p. At the same time, a Poisson number (with parameter a > 0) of animals
S Birth-and-death processes. Let Π be a stochastic matrix on E = Z₊. Suppose that Π(x, y) > 0 if and only if either x ≥ 1 and |x − y| = 1, or x = 0 and y ≤ 1. Find a necessary and sufficient condition on Π under which Π has a stationary distribution α. If α exists, express it in terms of
Extinction or unlimited growth of a population. Consider a Galton–Watson process (X_n)_{n≥0} with supercritical offspring distribution ρ, i.e., suppose E(ρ) > 1. Show that all states k ≠ 0 are transient, and that P_k(X_n → 0 or X_n → ∞ for n → ∞) = 1.
S Irreducible classes. Let E be countable, Π a stochastic matrix on E, and E_rec the set of all recurrent states. Let us say a state y is accessible from x, written as x → y, if there exists some k ≥ 0 such that Π^k(x, y) > 0. Show the following: (a) On E_rec, the relation '→' is an equivalence relation
Let 0 < p < 1 and consider the stochastic matrix Π on E = Z₊ defined by Π(x, y) = B_{x,p}({y}), x, y ∈ Z₊. Find Πⁿ for arbitrary n ≥ 1. Can you imagine a possible application of this model?
(a) Random walk on a finite graph. Let E be a finite set and ∼ a symmetric relation on E. Here, E is interpreted as the vertex set of a graph, and the relation x ∼ y means that x and y are connected by an (undirected) edge. Suppose that each vertex is connected by an edge to at least one other
S Time reversal for renewals. In addition to the age process (X_n) in Example (6.29), consider the process Y_n = min{T_k − n : k ≥ 1, T_k ≥ n} that indicates the remaining life span of the appliance used at time n. (a) Show that (Y_n)_{n≥0} is also a Markov chain, find its transition matrix Π̃, and
A cycle condition for reversibility. Under the assumptions of the ergodic theorem (6.13), show that Π has a reversible distribution if and only if the probability of running through a cycle does not depend on the direction, in that Π(x_0, x_1) Π(x_1, x_2) ··· Π(x_{n−1}, x_0) = Π(x_0,
A variant of Pólya's urn model. Consider again an urn containing no more than N > 2 balls in the colours white and red, but at least one ball of each colour. If there are fewer than N balls, one of them is chosen at random and returned together with a further ball of the same colour (taken from an
Random replacement II. As in the previous problem, consider an urn holding at most N balls, but now they come in two colours, either white or red. If the urn is non-empty, a ball is picked at random and is or is not replaced according to the outcome of the flip of a fair coin. If the urn is empty,
Random replacement I. Consider an urn containing initially N balls. Let X_n be the number of balls in the urn after performing the following procedure n times. If the urn is non-empty, one of the balls is removed at random; by flipping a fair coin, it is then decided whether or not the ball is
Let E be finite and Π a stochastic matrix on E. Show that Π satisfies the assumptions of the ergodic theorem (6.13) if and only if Π is irreducible and aperiodic, in the sense that for one (and thus all) x ∈ E the greatest common divisor of the set {k ≥ 1 : Π^k(x, x) > 0} is 1.
S Branching process with migration and annihilation. Consider the following modification of the Galton–Watson process. Given N ∈ N, assume that at each site n ∈ {1, ..., N} there is a certain number of 'particles' that behave independently of each other as follows. During a time unit, a
Total size of a non-surviving family tree. Consider a Galton–Watson process (X_n)_{n≥0} with offspring distribution ρ, and suppose that E(ρ) ≤ 1 and X_0 = 1. Let T = Σ_{n≥0} X_n be the total number of descendants of the progenitor. (Note that T < ∞ almost surely.) Show that the generating function φ_T of T
Find the extinction probability for a Galton–Watson process with offspring distribution ρ in the cases (a) ρ(k) = 0 for all k > 2, (b) ρ(k) = b a^{k−1} for all k ≥ 1, with a, b ∈ ]0, 1[ and b ≤ 1 − a. (According to empirical studies by Lotka in the 1930s, for a = 0.5893 and b = 0.2126 this describes the
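For case (b), the extinction probability is the smallest fixed point of the offspring generating function φ(s) = ρ(0) + bs/(1 − as), which can be found by iterating q ← φ(q) from q = 0. A minimal sketch with Lotka's parameter values from the problem statement (the numeric value of the limit is our own computation, not taken from the book):

```python
a, b = 0.5893, 0.2126      # Lotka's geometric offspring parameters
rho0 = 1 - b / (1 - a)     # rho(0): mass not covered by rho(k) = b*a**(k-1), k >= 1

def phi(s):
    # generating function of the offspring distribution rho
    return rho0 + b * s / (1 - a * s)

q = 0.0
for _ in range(500):       # fixed-point iteration converges to the smallest fixed point
    q = phi(q)

print(q)                   # extinction probability, roughly 0.8185
```

Since E(ρ) = b/(1 − a)² ≈ 1.26 > 1, the process is supercritical and the extinction probability is strictly below 1.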
S The asymmetric ruin problem. A well-known dexterity game consists of a ball in a 'maze' of N concentric rings (numbered from the centre to the outside) that have an opening to the next ring on alternating sides. The aim of the game is to get the ball to the centre ('ring no. 0') by suitably
Let (X_n)_{n≥0} be a Markov chain with transition matrix Π on a countable set E, and suppose that P_x(τ_y < ∞) = 1 for all x, y ∈ E. Let h : E → [0, ∞[ be harmonic, in that Πh = h. Show that h must be constant.
S The Wright–Fisher model in population genetics. Consider a particular gene with the two alleles A and a in a population of constant size N. Suppose for simplicity that the individuals have a haploid set of chromosomes, so the gene occurs N times in each generation. Assume each generation is
Chinese restaurant process and random permutations. The Hoppe model from Problem 6.5 can be slightly refined by taking the family structure of the clans into consideration. The balls in Hoppe's urn are labelled in the order in which they arrived in the urn. The state of the urn after the nth draw
S Hoppe’s urn model and Ewens’ sampling formula. Imagine a gene in a population that can reproduce and mutate at discrete time points, and assume that every mutation leads to a new allele (this is the so-called infinite alleles model). If we consider the genealogical tree of n randomly chosen
Self-fertilisation. Suppose the gene of a plant can come in two 'versions', the alleles A and a. A classical procedure to grow pure-bred (i.e. homozygous) plants of genotype AA respectively aa is self-fertilisation. The transition graph describes the transition from one generation to the next.
Embedded jump chain. Let E be countable and (X_n)_{n≥0} a Markov chain on E with transition matrix Π. Let T_0 = 0 and T_k = inf{n > T_{k−1} : X_n ≠ X_{n−1}} be the time of the kth jump of (X_n)_{n≥0}. Show that the sequence X̃_k := X_{T_k}, k ≥ 0, is a Markov chain, and find its transition matrix. Show further that, conditional on
S Functions of Markov chains. Let (X_n)_{n≥0} be a Markov chain with countable state space E and transition matrix Π, and f : E → F a mapping from E to another countable set F. (a) Show by example that (f ∘ X_n)_{n≥0} is not necessarily a Markov chain. (b) Find a (non-trivial) condition on f and Π
Iterated random functions. Let E be a countable set, (F, 𝓕) an arbitrary event space, f : E × F → E a measurable function, and (U_i)_{i≥1} a sequence of i.i.d. random variables taking values in (F, 𝓕). Let (X_n)_{n≥0} be recursively defined by X_0 = x ∈ E and X_{n+1} = f(X_n, U_{n+1}) for n ≥ 0. Show that (X_n)_{n≥0}
Let (X_i)_{i≥1} be independent and Cauchy distributed with parameter a > 0 (cf. Problem 2.5), and define M_n = max(X_1, ..., X_n). Show that M_n/n converges in distribution to a random variable Y > 0, and that Y^{−1} has a Weibull distribution (which one?); see Problem 3.27. (The inverse Weibull distributions
Let (X_i)_{i≥1} be independent standard normal random variables, let M_n = max(X_1, ..., X_n), and let a_n > 0 be defined by 1 − Φ(a_n) = 1/n. Show that the sequence a_n M_n − a_n² converges in distribution to the probability measure Q on R with distribution function F_Q(c) = exp(−e^{−c}), c ∈ R. Q is known as the doubly exponential distribution or, after Emil J. Gumbel (1891–1966),
S The arcsine law. Consider the simple symmetric random walk (S_j)_{j≤2N} introduced in Problem 2.7, for fixed N ∈ N. Let L_{2N} = max{2n ≤ 2N : S_{2n} = 0} be the last time both candidates have the same number of votes before the end of the count. (In a more general context, one would speak of the last
Convergence in distribution of discrete random variables. Let X and X_n, n ≥ 1, be random variables on the same probability space that take values in Z. Prove that the following statements are equivalent: (a) X_n →^d X for n → ∞. (b) P(X_n = k) → P(X = k) for n → ∞ and every k ∈ Z. (c) Σ_{k∈Z} |P(X_n = k) − P(X = k)| → 0 for n → ∞.
Convergence in probability versus convergence in distribution. Let X and (X_n)_{n≥1} be real-valued random variables on the same probability space. Show the following: (a) X_n →^P X implies X_n →^d X. (b) The converse of (a) does not hold in general, but it does when X is almost surely constant.
S Brownian motion. A heavy particle is randomly hit by light particles, so that its velocity is randomly reversed at equidistant time points. That is, its spatial coordinate (in a given direction) at time t > 0 satisfies X_t = Σ_{i=1}^{⌊t⌋} V_i with independent velocities V_i, where P(V_i = ±v) = 1/2
Error propagation for transformed observations. Let (X_i)_{i≥1} be a sequence of i.i.d. random variables taking values in a (possibly unbounded) interval I ⊂ R, and suppose the variance v = V(X_i) > 0 exists. Let m = E(X_i) and f : I → R be twice continuously differentiable with f′(m) ≠ 0 and bounded
A company has issued a total of n D 1000 shares. At a fixed time, every shareholder decides for each share with probability 0 < p < 1 to sell it. These decisions are independent for all shares. The market can take in s D 50 shares without the price falling. What is the largest value of p such that
In a sales campaign, a mail order company offers their first 1000 customers a complimentary ladies’ respectively men’s watch with their order. Suppose that both sexes are equally attracted by the offer. How many ladies’ and how many men’s watches should the company keep in stock to ensure
Rounding errors. Estimate the error of a sum of rounded numbers as follows. The numbers R_1, ..., R_n ∈ R are rounded to the next integer, i.e., they can be represented as R_i = Z_i + U_i with Z_i ∈ Z and U_i ∈ [−1/2, 1/2[. The deviation of the sum of rounded numbers Σ_{i=1}^n Z_i from the true sum Σ_{i=1}^n
S No-shows. Frequently, the number of passengers turning up for their flight is smaller than the number of bookings made. This is the reason why airlines overbook their flights (i.e., they sell more tickets than seats are available), at the risk of owing compensation to an eventual surplus of
Effect of the discreteness corrections. Determine a lower bound for the error term in (5.23) when the discreteness corrections ±1/2 are omitted, by considering the case k = l = np ∈ N. Compare the result with Figure 5.6.
Asymptotics of Φ. Establish the sandwich estimate (1/x − 1/x³) φ(x) ≤ 1 − Φ(x) ≤ φ(x)/x for all x > 0, and hence the asymptotics 1 − Φ(x) ∼ φ(x)/x for x → ∞. Hint: Compare the derivatives of the functions on the left- and right-hand sides with φ.
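The displayed sandwich was lost in extraction; the version above is the classical Mills-ratio estimate, so treat its exact form as an assumption. It is easy to check numerically, writing Φ via the complementary error function:

```python
import math

def phi(x):
    # standard normal density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def upper_tail(x):
    # 1 - Phi(x) via the complementary error function
    return 0.5 * math.erfc(x / math.sqrt(2))

for x in [0.5, 1.0, 2.0, 4.0]:
    lower = (1 / x - 1 / x**3) * phi(x)
    upper = phi(x) / x
    assert lower <= upper_tail(x) <= upper
```

For small x the lower bound is negative and trivially true; the two bounds squeeze together as x grows, which is exactly the stated asymptotics.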
S Local normal approximation of Poisson distributions. Let x_λ(k) = (k − λ)/√λ for λ > 0 and k ∈ Z₊. Show that, for any c > 0, lim_{λ→∞} max_{k∈Z₊: |x_λ(k)|≤c} | √λ P_λ({k}) / φ(x_λ(k)) − 1 | = 0.
Decisive power of determined minorities. In an election between two candidates A and B one million voters cast their votes. Among these, 2 000 know candidate A from his election campaign and vote unanimously for him. The remaining 998 000 voters are more or less undecided and make their decision
Give a sequence of random variables in L2 for which neither the (strong or weak) law of large numbers nor the central limit theorem holds.
S Asymptotics of the Pólya model. Consider Pólya's urn model with parameters a = r/c and b = w/c, as introduced in Example (3.14). Let R_n be the number of red balls drawn after n iterations. (a) Use Problem 3.4 and the law of large numbers to show that R_n/n converges in distribution to the beta
Expectation versus probability. Bob suggests the following game to Alice: 'Here is a biased coin, which shows heads with probability p ∈ ]1/3, 1/2[. Your initial stake is €100; each time the coin shows heads, I double your capital, otherwise you pay me half of your capital. Let X_n denote your
Let (X_n)_{n≥1} be a sequence of independent random variables that are exponentially distributed with parameter α > 0. Show that lim sup_{n→∞} X_n/log n = 1/α and lim inf_{n→∞} X_n/log n = 0 almost surely.
S Inspection, or waiting time, paradox. As in the previous problem, let (L_i)_{i≥1} be i.i.d. non-negative random variables representing the life times of machines, or light bulbs, which are immediately replaced when defective. In a waiting time interpretation, one can think of the L_i as the time spans
S Renewals of, say, light bulbs. Let (L_i)_{i≥1} be i.i.d. non-negative random variables with finite or infinite expectation. One can interpret L_i as the life time of the ith light bulb (which is immediately replaced when it burns out); see also Figure 3.7. For t > 0, let N_t be the number of bulbs used
Law of large numbers for random variables without expectation. Let (X_i)_{i≥1} be i.i.d. real-valued random variables having no expectation, i.e., X_i ∉ L¹. Let a ∈ N be arbitrary. Show the following: (a) P(|X_n| > an infinitely often) = 1. Hint: Use Problem 4.5. (b) For the sums S_n = Σ_{i=1}^n X_i we have
S Convexity of the Bernstein polynomials. Let f : [0, 1] → R be continuous and convex. Show that, for every n ≥ 1, the corresponding Bernstein polynomial f_n is also convex. Hint: Let p_1 < p_2 < p_3, consider the frequencies Z_k = Σ_{i=1}^n 1_{[0, p_k]} ∘ U_i, k = 1, 2, 3, and represent Z_2 as a convex
Large deviations of empirical averages from the mean. Let (X_i)_{i≥1} be a Bernoulli sequence with parameter 0 < p < 1, and let p < a < 1. Show that P((1/n) Σ_{i=1}^n X_i ≥ a) ≤ e^{−n h(a;p)}, where h(a; p) = a log(a/p) + (1−a) log((1−a)/(1−p)). Hint: Show first that for all s ≥ 0, P(S_n ≥ na) ≤ e^{−nas} E(e^{s X_1})ⁿ.
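The bound above is reconstructed as the standard Chernoff/relative-entropy estimate for Bernoulli averages, so its exact display should be read with that caveat. It can be compared against the exact binomial tail; a sketch with arbitrarily chosen parameters:

```python
import math

def binom_tail(n, p, a):
    # exact P(S_n >= n*a) for S_n ~ Binomial(n, p)
    k0 = math.ceil(n * a)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k0, n + 1))

def chernoff(n, p, a):
    # e^{-n h(a; p)} with h the binary relative entropy
    h = a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))
    return math.exp(-n * h)

n, p, a = 100, 0.3, 0.5
assert binom_tail(n, p, a) <= chernoff(n, p, a)
```

The exact tail is always below the exponential bound, and both decay at the same exponential rate n·h(a; p).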
S (a) A particle moves randomly on a plane according to the following rules. It moves one unit along a randomly chosen direction Ψ_1, then it chooses a new direction Ψ_2 and moves one unit along that new direction, and so on. We suppose that the angles Ψ_i are independent and uniformly
Collectibles. Consider the problem of collecting a complete series of stickers, as described in Problem 4.20. How many items do you have to buy so that, with probability at least 0.95, you have collected all N = 20 stickers? Use Chebyshev's inequality to give a least possible bound.
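Writing the collection time T as a sum of independent geometric waiting times gives E(T) = N·H_N and V(T) = Σ_r (1 − p_r)/p_r² with p_r = (N − r + 1)/N; Chebyshev then yields an n with P(T > n) ≤ 0.05. A sketch of one natural reading of the problem (the resulting number is our own computation):

```python
import math

N = 20
p = [(N - r + 1) / N for r in range(1, N + 1)]  # success prob. for the r-th new sticker
mean = sum(1 / q for q in p)                    # E(T) = N * H_N, about 71.95
var = sum((1 - q) / q**2 for q in p)            # variance of a sum of independent geometrics

# Chebyshev: P(|T - E T| > t) <= var / t^2; force the bound below 0.05
t = math.sqrt(var / 0.05)
n = math.ceil(mean + t)
print(n)   # a guaranteed (but crude) purchase count, here 179
```

Chebyshev is crude here; the true 95% quantile of T is considerably smaller.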
The Ky Fan metric for convergence in probability. For two real-valued random variables X, Y on an arbitrary probability space, let d(X, Y) = min{ε ≥ 0 : P(|X − Y| > ε) ≤ ε}. Show the following: (a) The minimum is really attained, and d is a metric on the space of all real-valued random variables,
S Simple symmetric random walk. In the situation of Problem 2.7, let τ = inf{2n ≥ 2 : S_{2n} = 0} be the first time during the count at which there is a tie between the two candidates. Find the generating function and the expectation of τ.
In the situation of Problem 3.17, use Problem 4.23 to determine the generating function of the number of larvae and hence deduce their distribution.
Consider the situation of Problem 4.23, and let 0 < p < 1. Determine a discrete density ρ_p on Z₊ such that for every r > 0 the following holds. If N is Poisson distributed with parameter −r log p and the X_i have distribution density ρ_p, then S_N has the negative binomial distribution B̄_{r,p}. This ρ_p is
Let N, X_1, X_2, ... ∈ L¹ be independent random variables taking values in Z₊, and let X_1, X_2, ... be identically distributed. Suppose S_N is defined as in Problem 4.10. Show that S_N has generating function φ_{S_N} = φ_N ∘ φ_{X_1}, and deduce Wald's identity again. Moreover, find the variance V(S_N) and
Factorial moments. Let X be a random variable with values in Z₊ and generating function φ_X, and for each k ∈ N let X^{(k)} = X(X − 1) ··· (X − k + 1). Show that φ_X^{(k)}(1) = E(X^{(k)}) when X ∈ L^k.
Collectibles (alternative approach). In the situation of Problem 4.20, prove the recursive formula P(T_r = n+1) = (1 − (r−1)/N) Σ_{k=1}^n [P(T_{r−1} = k) − P(T_r = k)]. Use this to find the generating functions of the T_r, and deduce that the distribution of T_N is the convolution δ_N ∗ ⊛_{r=1}^{N−1} G_{r/N}. Find the
S Collectibles (coupon collector's problem). Suppose a company attaches to its product a sticker showing one of the players of the national football team. How many items do you have to buy on average to collect all N = 20 stickers? To formalise the problem, let (X_i)_{i≥1} be a sequence of
Positive correlation of monotone random variables. Let (Ω, F, P) be a probability space, f, g ∈ L²(P), and X, Y two independent Ω-valued random variables with distribution P. Show that Cov_P(f, g) = (1/2) E[(f(X) − f(Y))(g(X) − g(Y))]. In the case (Ω, F) = (R, B), deduce that any two increasing
S Fixed points of a random permutation. Let Ω = S_n be the set of all permutations of {1, ..., n} and P = U_Ω the uniform distribution on Ω. For every permutation ω ∈ Ω, let X(ω) be the number of fixed points of ω. Find E(X) and V(X) (without using Problem 2.11).
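The answer (E(X) = V(X) = 1 for every n ≥ 2) can be confirmed by brute-force enumeration over a small symmetric group; a sketch:

```python
from itertools import permutations

n = 6
fixed_counts = [sum(1 for i, v in enumerate(perm) if i == v)
                for perm in permutations(range(n))]

mean = sum(fixed_counts) / len(fixed_counts)
var = sum((c - mean) ** 2 for c in fixed_counts) / len(fixed_counts)
print(mean, var)   # both equal 1 for n >= 2
```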
Normal moments. Let X be an N_{0,1}-distributed random variable. Show that E(X^{2k}) = 2^k Γ(k + 1/2)/Γ(1/2) for every k ∈ N, and calculate the explicit value for k = 1, 2, 3.
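The Gamma-function expression reduces to the double factorial (2k − 1)!!, giving 1, 3 and 15 for k = 1, 2, 3; a quick numerical check:

```python
import math

def normal_even_moment(k):
    # E(X^{2k}) = 2^k * Gamma(k + 1/2) / Gamma(1/2) for X ~ N(0, 1)
    return 2**k * math.gamma(k + 0.5) / math.gamma(0.5)

for k, expected in [(1, 1), (2, 3), (3, 15)]:
    assert abs(normal_even_moment(k) - expected) < 1e-9
```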
Best linear prediction. Suppose X, Y ∈ L² and (without loss of generality) V(X) = 1. What is the best approximation of Y by an affine function a + bX of X? Show that the quadratic deviation E((Y − a − bX)²) is minimised by b = Cov(X, Y) and a = E(Y − bX). What does this mean when X, Y are uncorrelated?
S Show the following. (a) The expectation minimises the mean squared deviation: if X ∈ L² with expectation m and a ∈ R, then E((X − a)²) ≥ V(X), with equality if and only if a = m. (b) Every median minimises the mean absolute deviation: let X ∈ L¹, μ be a median of X, and a ∈ R; then E(|X − a|) ≥ E(|X −
Let X_1, ..., X_n ∈ L² be i.i.d. random variables and M = (1/n) Σ_{i=1}^n X_i their average. Find E(Σ_{i=1}^n (X_i − M)²).
Let X be a real random variable, and consider the cases: (a) X is U_{[0,1]}-distributed; (b) X is Cauchy distributed with density ρ(x) = 1/(π(1 + x²)); (c) X = e^Y for an N_{0,1}-distributed random variable Y. Check whether the expectation E(X) and the variance V(X) exist. If so, calculate their values.
S Maturity-dependence of the Black–Scholes price. Let Π(N) = E((X_N − K)⁺) be the Black–Scholes price for maturity N in the Cox–Ross–Rubinstein model with parameters as in Example (4.17). Prove that Π(N) ≤ Π(N + 1). Hint: Use Problem 4.4.
Consider the Cox–Ross–Rubinstein model with parameters X_0 = K = 1 and both remaining model parameters equal to log 2. For the maturity times N = 1, 2, 3, find the Black–Scholes price Π and the optimal self-financing hedging strategy.
S Wald's identity. Let (X_i)_{i≥1} be i.i.d. real random variables in L¹ and let τ be a Z₊-valued random variable with E(τ) < ∞. Suppose that, for every n ∈ N, the event {τ ≥ n} is independent of X_n. Show that the random variable S_τ = Σ_{i=1}^τ X_i has an expectation, and E(S_τ) = E(τ) E(X_1).
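The identity can be verified exactly on a toy example by enumerating a finite probability space, here with τ fully independent of the X_i (a stronger assumption than the one in the problem):

```python
from itertools import product
from fractions import Fraction

half = Fraction(1, 2)
taus = [(1, half), (2, half)]                      # tau takes values 1, 2 with prob. 1/2 each
xs = [(Fraction(-1), half), (Fraction(2), half)]   # X_i takes values -1, 2 with prob. 1/2

e_s = Fraction(0)
for (t, pt), (x1, p1), (x2, p2) in product(taus, xs, xs):
    s = x1 if t == 1 else x1 + x2                  # S_tau for this outcome
    e_s += pt * p1 * p2 * s

e_tau = sum(t * pt for t, pt in taus)              # = 3/2
e_x = sum(x * px for x, px in xs)                  # = 1/2
assert e_s == e_tau * e_x                          # Wald: E(S_tau) = E(tau) E(X_1)
print(e_s)   # 3/4
```

Using exact fractions makes the check an identity rather than a floating-point approximation.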
The magazine of a consumer safety organisation supplements its test articles with the 'average price' of a product; this is often given as a sample median, i.e., a median of the empirical distribution (1/n) Σ_{i=1}^n δ_{x_i} of the prices x_1, ..., x_n found in n shops. Why can the median be more useful
S Fubini's theorem. Let X_1, X_2 be independent random variables taking values in some arbitrary event spaces (E_1, 𝓔_1) and (E_2, 𝓔_2) respectively, and suppose that f : E_1 × E_2 → R is a bounded random variable. For x_1 ∈ E_1 let f_1(x_1) = E(f(x_1, X_2)), which is well-defined by Problem 1.6. Show: f_1 is a
Suppose X, Y, X_1, X_2, ... ∈ L¹. Prove the following statements. (a) Fatou's lemma: if X_n ≥ 0 for every n and X = lim inf_{n→∞} X_n, then E(X) ≤ lim inf_{n→∞} E(X_n). Hint: By Problem 1.14, Y_n := inf_{k≥n} X_k is a random variable, and Y_n ↑ X. (b) Dominated convergence theorem: if |X_n| ≤ Y for every n and X =
Let (Ω, F, P) be a probability space and A_n ∈ F, n ≥ 1. Define and interpret a random variable X with E(X) = Σ_{n≥1} P(A_n). In particular, discuss the special case that the A_n are pairwise disjoint.
S (a) Let X be a random variable taking values in Z₊. Show that E(X) = Σ_{k≥1} P(X ≥ k). (Both sides can be equal to +∞.) (b) Let X be an arbitrary random variable taking values in [0, ∞[. Show that E(X) = ∫_0^∞ P(X ≥ c) dc. (Again, both sides can be equal to +∞.) Hint: Use discrete approximations.
Jensen's inequality (J. Jensen 1906). Let φ : R → R be a convex function, X ∈ L¹ and φ ∘ X ∈ L¹. Show that φ(E(X)) ≤ E(φ(X)). Hint: Consider a tangent to φ at the point E(X).
Inclusion–exclusion principle. Give an alternative proof of the inclusion–exclusion principle in Problem 1.7b by calculating the expectation of the product ∏_{i=1}^n (1 − 1_{A_i}).
Which of the following statements hold for arbitrary X, Y ∈ L¹? Prove or disprove. (a) E(X) = E(Y) ⇒ P(X = Y) = 1. (b) E(|X − Y|) = 0 ⇒ P(X = Y) = 1.
Let X be a random variable taking values in [0, ∞]. Show the following (considering the discrete case first). (a) If E(X) < ∞, then P(X < ∞) = 1. (b) If E(X) = 0, then P(X = 0) = 1.
S Oscillations of the simple symmetric random walk, cf. Problem 2.7. Let (X_i)_{i≥1} be a sequence of independent random variables which are uniformly distributed on {−1, 1}, and set S_n = Σ_{i=1}^n X_i for n ≥ 1. Show that, for all k ∈ N, P(|S_{n+k} − S_n| ≥ k for infinitely many n) = 1. Deduce that P(|S_n| ≤ m for
Let (X_k)_{k≥1} be a Bernoulli sequence for p ∈ ]0, 1[. For n, l ∈ N, let A_n^l be the event {X_n = X_{n+1} = ··· = X_{n+l−1} = 1} that a run of luck of length at least l starts at time n, and let A^l = lim sup_{n→∞} A_n^l. Show that P(∩_{l∈N} A^l) = 1. Hence, with probability 1 there exist infinitely many runs of
S Let Y_k, k ≥ 1, be [0, ∞[-valued random variables on a probability space (Ω, F, P), and consider the events A_1 = {Σ_{k≥1} Y_k < ∞}, A_2 = {Σ_{k≥1} Y_k < 1}, A_3 = {inf_{k≥1} Y_k < 1}, A_4 = {lim inf_{k→∞} Y_k < 1}. Which of these belong to the tail σ-algebra T(Y_k : k ≥ 1)? Decide and give proofs.
Find all the probability measures P on [0, ∞[ satisfying the following property: If n ∈ N is arbitrary and X_1, ..., X_n are independent random variables with identical distribution P, then the random variable n min(X_1, ..., X_n) also has distribution P. Hint: Start by finding an equation for
S Failure times. Determine the random life span of a wire rope (or any kind of technical appliance) as follows. For t > 0 let F(t) := P([0, t]) be the probability that the rope fails in the time interval [0, t], and suppose P has a Lebesgue density ρ. Suppose further that the conditional probability for
Box–Muller method for sampling from normal distributions (1958). Let U, V be independent random variables with uniform distribution on ]0, 1[, and define R = √(−2 log U), X = R cos(2πV), and Y = R sin(2πV). Show that X, Y are independent and N_{0,1}-distributed. Hint: First calculate the distribution
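The transformation is easy to implement; by construction the pair (X, Y) satisfies X² + Y² = −2 log U, which gives a deterministic sanity check. A sketch (the 2π factors above are reconstructed from the standard method, since the extraction lost those symbols):

```python
import math
import random

def box_muller(u, v):
    # map two uniforms on (0, 1) to a pair of independent standard normals
    r = math.sqrt(-2.0 * math.log(u))
    return r * math.cos(2 * math.pi * v), r * math.sin(2 * math.pi * v)

rng = random.Random(0)
u, v = rng.random(), rng.random()
x, y = box_muller(u, v)
# the radius identity X^2 + Y^2 = -2 log U holds exactly, up to floating-point error
assert abs(x * x + y * y - (-2.0 * math.log(u))) < 1e-9
```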
S Construction of the Poisson point process in R^d. Let Λ ⊂ R^d be a Borel set with 0 < λ^d(Λ) < ∞. Also, let (X_i)_{i≥1} be a sequence of i.i.d. random variables with uniform distribution on Λ, and N_Λ a Poisson random variable with parameter α λ^d(Λ) which is independent of (X_i)_{i≥1}. Consider the random points
Let (S_t)_{t≥0} be the compound Poisson process with jump distribution Q and intensity α > 0. Show that, for fixed t > 0, S_t has the distribution Σ_{n≥0} e^{−αt} ((αt)ⁿ/n!) Q^{∗n}. Here, Q^{∗0} = δ_0 is the Dirac distribution at 0.
S Comparison of independent Poisson processes. Let (N_t)_{t≥0} and (Ñ_t)_{t≥0} be two independent Poisson processes with intensities α resp. α̃ and jump times (T_k) resp. (T̃_k). Show the following. (a) N_{T̃_1} has a geometric distribution (with which parameter?). (b) The random variables N_{T̃_k} − N_{T̃_{k−1}}, k ≥
In a service centre with s different counters, customers arrive at the times of independent Poisson processes (N_t^{(i)})_{t≥0} with intensities α^{(i)} > 0, 1 ≤ i ≤ s. At time t, you observe that a total of n customers is waiting. What is the conditional distribution of the s-tuple of the customers
Let (N_t)_{t≥0} be a Poisson process and 0 < s < t. Find the conditional probability P(N_s = k | N_t = n) for 0 ≤ k ≤ n.
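The answer is the binomial law with parameters n and s/t, which follows from independent increments: P(N_s = k, N_t = n) = P(N_s = k) P(N_{t−s} = n − k). This can be checked numerically (the intensity value below is an arbitrary choice):

```python
import math

def pois(lam, k):
    # Poisson(lam) probability mass at k
    return math.exp(-lam) * lam**k / math.factorial(k)

alpha, s, t, n = 1.7, 2.0, 5.0, 8
for k in range(n + 1):
    joint = pois(alpha * s, k) * pois(alpha * (t - s), n - k)   # independent increments
    cond = joint / pois(alpha * t, n)
    binom = math.comb(n, k) * (s / t)**k * (1 - s / t)**(n - k)
    assert abs(cond - binom) < 1e-12
```

Note that the intensity α cancels, as the binomial answer suggests it must.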
S Bernoulli sequence as a discrete analogue of the Poisson process. (a) Let (X_n)_{n≥1} be a Bernoulli sequence for p ∈ ]0, 1[ and let T_0 = 0, T_k = inf{n > T_{k−1} : X_n = 1}, and L_k = T_k − T_{k−1} − 1 for k ≥ 1. (T_k is the time of the kth success and L_k the waiting time between the (k−1)st and the kth success.) Show
Telegraph process. Let (N_t)_{t≥0} be a Poisson process with intensity α > 0 and Z_t = (−1)^{N_t}. Show that P(Z_s = Z_t) = (1 + e^{−2α(t−s)})/2 for 0 ≤ s < t.
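Since Z_s = Z_t exactly when the increment N_t − N_s ~ Poisson(α(t − s)) is even, the claimed probability is the even-mass of a Poisson law, which can be checked by direct summation:

```python
import math

def prob_equal(alpha, s, t, terms=60):
    # P(Z_s = Z_t) = P(N_t - N_s is even), with N_t - N_s ~ Poisson(alpha * (t - s))
    lam = alpha * (t - s)
    pmf = math.exp(-lam)      # P(K = 0)
    total = 0.0
    for k in range(terms):
        if k % 2 == 0:
            total += pmf
        pmf *= lam / (k + 1)  # advance to P(K = k + 1)
    return total

alpha, s, t = 0.8, 1.0, 3.5
lhs = prob_equal(alpha, s, t)
rhs = (1 + math.exp(-2 * alpha * (t - s))) / 2
assert abs(lhs - rhs) < 1e-12
```

The closed form drops out of the identity Σ_{k even} λ^k/k! = (e^λ + e^{−λ})/2.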
Thinning of a Poisson process. Let α > 0, let (L_i)_{i≥1} be a sequence of i.i.d. random variables that are exponentially distributed with parameter α, and let T_k = Σ_{i=1}^k L_i, k ≥ 1. Furthermore, let (X_k)_{k≥1} be a Bernoulli sequence with parameter p ∈ ]0, 1[ which is independent of the L_i. Show that