# Question

(a) Prove that any sequence that converges in the mean square sense must also converge in probability. Use Markov’s inequality.
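A sketch of one standard argument (hedged: this is one way to write it, not necessarily a posted solution). Suppose $E[(X_n - X)^2] \to 0$. Since $(X_n - X)^2$ is a nonnegative random variable, Markov's inequality applies to it directly:

```latex
% Fix any \varepsilon > 0. Then
\Pr\bigl(|X_n - X| \ge \varepsilon\bigr)
  = \Pr\bigl((X_n - X)^2 \ge \varepsilon^2\bigr)
  \le \frac{E\bigl[(X_n - X)^2\bigr]}{\varepsilon^2}
  \;\xrightarrow[n \to \infty]{}\; 0,
% so X_n \to X in probability.
```

The key step is rewriting the event $\{|X_n - X| \ge \varepsilon\}$ as $\{(X_n - X)^2 \ge \varepsilon^2\}$ so that Markov's inequality (for nonnegative random variables) can be applied; this is the same device used to prove Chebyshev's inequality.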

(b) Prove by counterexample that convergence in probability does not necessarily imply convergence in the mean square sense.
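One commonly used counterexample, sketched here under the assumption that any valid counterexample is acceptable: take a sequence that is usually zero but occasionally takes a very large value.

```latex
% Define, for n = 1, 2, 3, \ldots,
X_n = \begin{cases}
  n, & \text{with probability } 1/n, \\
  0, & \text{with probability } 1 - 1/n.
\end{cases}
% Convergence in probability to 0: for any \varepsilon > 0 and all n > \varepsilon,
\Pr\bigl(|X_n - 0| \ge \varepsilon\bigr) = \frac{1}{n} \;\to\; 0.
% But the mean square error diverges:
E\bigl[(X_n - 0)^2\bigr] = n^2 \cdot \frac{1}{n} = n \;\to\; \infty.
```

The rare event carries vanishing probability, so convergence in probability holds, but squaring the large value $n$ makes its contribution to the second moment grow without bound, so mean square convergence fails.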
