Question: A probability distribution of a random variable can be defined by its pdf or cdf. Another option is the characteristic function. The characteristic function of a random variable X is the expected value of e^(itX):

    φ_X(t) = E(e^(itX))

If you're not familiar with complex numbers, no worries: i is just the square root of -1, and that's all we'll need to know about it here. Note that the definition looks similar to the moment generating function,

    M_X(t) = E(e^(tX)),

but the characteristic function and the moment generating function are different. In many cases, the characteristic function allows for easier operations with random variables or for an easier way to prove theorems. For example, the classical proof of the central limit theorem uses the characteristic function.

A convenient property of the characteristic function is that the characteristic function of a sum of independent random variables X and Y is the product of the characteristic functions of X and Y:

    φ_(X+Y)(t) = φ_X(t) φ_Y(t)

Consider an example: let X and Y be independent uniformly distributed random variables on (0, 1) and (1, 2), respectively. The pdf of W = X + Y can be found from the pdfs of X and Y using the convolution formula:

    f_W(w) = ∫ f_X(x) f_Y(w - x) dx

Applying this formula, we get (similar to the example in module 4) the triangular density

    f_W(w) = w - 1  for 1 ≤ w ≤ 2,
    f_W(w) = 3 - w  for 2 ≤ w ≤ 3,
    f_W(w) = 0      otherwise.
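The product property φ_(X+Y)(t) = φ_X(t) φ_Y(t) can be checked numerically. The sketch below (a Monte Carlo illustration, not part of the original problem) estimates E(e^(itW)) from samples of W = X + Y and compares it against the product of the closed-form characteristic functions of the two uniforms, φ(t) = (e^(itb) - e^(ita)) / (it(b - a)). The evaluation point t = 1.5 and the sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.uniform(0.0, 1.0, n)  # X ~ Uniform(0, 1)
y = rng.uniform(1.0, 2.0, n)  # Y ~ Uniform(1, 2)
w = x + y                     # W = X + Y

def phi_uniform(t, a, b):
    """Characteristic function of Uniform(a, b) for t != 0."""
    return (np.exp(1j * t * b) - np.exp(1j * t * a)) / (1j * t * (b - a))

t = 1.5
# Empirical estimate of E(e^{itW}) from the samples of W.
phi_w_empirical = np.mean(np.exp(1j * t * w))
# Product of the two exact characteristic functions.
phi_w_product = phi_uniform(t, 0.0, 1.0) * phi_uniform(t, 1.0, 2.0)

print(abs(phi_w_empirical - phi_w_product))  # small Monte Carlo error
```

With 200,000 samples the Monte Carlo error is on the order of 1/sqrt(n), so the two complex numbers agree to a few decimal places, illustrating the independence property without doing the convolution by hand.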