
Question:

Kolmogorov's inequality. Let \(\left(\xi_{n}\right)_{n \in \mathbb{N}}\) be a sequence of independent, identically distributed random variables on a probability space \((\Omega, \mathscr{A}, \mathbb{P})\). Then we have the following generalization of Chebyshev's inequality, see Problem 11.3 (vi),

\[\mathbb{P}\left(\max _{1 \leqslant k \leqslant n}\left|\sum_{j=1}^{k}\left(\xi_{j}-\mathbb{E} \xi_{j}\right)\right| \geqslant t\right) \leqslant \frac{1}{t^{2}} \sum_{k=1}^{n} \mathbb{V} \xi_{k},\]

where \(\mathbb{E} \xi=\int \xi \,d \mathbb{P}\) is the expectation or mean value and \(\mathbb{V} \xi=\int(\xi-\mathbb{E} \xi)^{2} \,d \mathbb{P}\) is the variance of the random variable (i.e. measurable function) \(\xi: \Omega \rightarrow \mathbb{R}\).
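As a sanity check of the statement, the inequality can be tested by simulation. The following is a minimal Monte Carlo sketch (not part of the problem); the choice of centered \(\pm 1\) coin flips (so \(\mathbb{E}\xi_k = 0\), \(\mathbb{V}\xi_k = 1\)) and of the parameters \(n\), \(t\) is ours, purely for illustration.

```python
import random

def kolmogorov_check(n=50, trials=20000, t=20.0, seed=0):
    """Empirically compare P(max_{1<=k<=n} |S_k| >= t) with
    (sum of variances)/t^2 for S_k = xi_1 + ... + xi_k,
    where the xi_j are i.i.d. symmetric +/-1 coin flips
    (already centered: E xi_j = 0, V xi_j = 1)."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        s, max_abs = 0.0, 0.0
        for _ in range(n):
            s += rng.choice((-1.0, 1.0))  # xi_j - E xi_j
            max_abs = max(max_abs, abs(s))  # running max of |S_k|
        if max_abs >= t:
            exceed += 1
    lhs = exceed / trials          # empirical left-hand side
    rhs = n / t ** 2               # sum of variances = n, each V xi_j = 1
    return lhs, rhs
```

With these parameters the bound is \(n/t^2 = 50/400 = 0.125\), and the empirical frequency stays well below it, as the inequality predicts.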

Data from problem 11.3 (vi)

(vi) \(\mathbb{P}(|\xi-\mathbb{E} \xi| \geqslant \alpha \sqrt{\mathbb{V} \xi}) \leqslant \frac{1}{\alpha^{2}}\), where \((\Omega, \mathscr{A}, \mathbb{P})\) is a probability space, \(\xi\) is a random variable (i.e. a measurable function \(\xi: \Omega \rightarrow \mathbb{R}\)), \(\mathbb{E} \xi=\int \xi \,d \mathbb{P}\) is the expectation or mean value and \(\mathbb{V} \xi=\int(\xi-\mathbb{E} \xi)^{2} \,d \mathbb{P}\) is the variance.

Remark. This is Chebyshev's inequality.
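Chebyshev's inequality in the form (vi) can likewise be checked numerically. The sketch below is illustrative only; the choice of an exponential distribution with rate 1 (so \(\mathbb{E}\xi = 1\), \(\mathbb{V}\xi = 1\)) and of \(\alpha\) is an assumption made here, not taken from the text.

```python
import random

def chebyshev_check(alpha=2.0, samples=100_000, seed=1):
    """Empirically compare P(|xi - E xi| >= alpha * sqrt(V xi))
    with 1/alpha^2 for xi ~ Exponential(1), which has
    E xi = 1 and V xi = 1."""
    rng = random.Random(seed)
    mean, std = 1.0, 1.0  # exact moments of Exponential(1)
    hits = sum(
        1
        for _ in range(samples)
        if abs(rng.expovariate(1.0) - mean) >= alpha * std
    )
    return hits / samples, 1.0 / alpha ** 2
```

For \(\alpha = 2\) the exact probability is \(\mathbb{P}(\xi \geqslant 3) = e^{-3} \approx 0.05\), comfortably below the Chebyshev bound \(1/\alpha^2 = 0.25\).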
