Questions and Answers of Introduction To Statistical Investigations
Consider an arbitrary probability measure space \((\Omega, \mathcal{F}, P)\) and let \(X_{r}\) be the collection of all possible random variables \(X\) that map \(\Omega\) to \(\mathbb{R}\) subject
Within the context of Exercise 1, let \(\|X\|_{r}\) be defined for \(X \in X_{r}\) as\[\|X\|_{r}=\left[\int_{\Omega}|X(\omega)|^{r} d P(\omega)\right]^{1 / r}\]Prove that \(\|X\|_{r}\) is a norm. That
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a distribution \(F\) that has parameter \(\theta\). Let \(\hat{\theta}_{n}\) be an unbiased
Consider a sequence of random variables \(\left\{X_{n}\right\}_{n=1}^{\infty}\) where \(X_{n}\) has probability distribution function\[f_{n}(x)= \begin{cases}{[\log (n+1)]^{-1}} & x=n \\ 1-[\log
Suppose that \(\left\{X_{n}\right\}_{n=1}^{\infty}\) is a sequence of independent random variables from a common distribution that has mean \(\mu\) and variance \(\sigma^{2}\), such that
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of random variables that converge in \(r^{\text {th }}\) mean to a random variable \(X\) as \(n \rightarrow \infty\) for some \(r>0\). Prove that
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent random variables where \(X_{n}\) has probability distribution function\[f_{n}(x)= \begin{cases}1-n^{-\alpha} & x=0 \\
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of random variables such that \(P\left(\left|X_{n}\right| \leq Y\right)=1\) for all \(n \in \mathbb{N}\) where \(Y\) is a positive integrable
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of random variables such that\[E\left(\sup _{n \in \mathbb{N}}\left|X_{n}\right|\right)
Prove that if \(a, x\), and \(y\) are positive real numbers then\[2 \max \{x, y\} \delta\{2 \max \{x, y\} ;(a, \infty)\} \leq 2 x \delta\left\{x ;\left(\frac{1}{2}a, \infty\right)\right\}+2 y
Suppose that \(\left\{X_{n}\right\}_{n=1}^{\infty}\) is a sequence of random variables such that \(X_{n} \xrightarrow{r} X\) as \(n \rightarrow \infty\) for some random variable \(X\). Prove that for
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent random variables where \(X_{n}\) has an \(\operatorname{Exponential}\left(\theta_{n}\right)\) distribution for all \(n \in
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent random variables where \(X_{n}\) has a \(\operatorname{Triangular}\left(\alpha_{n}, \beta_{n}, \gamma_{n}\right)\) distribution for all \(n \in
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent random variables where \(X_{n}\) has a \(\operatorname{Beta}\left(\alpha_{n}, \beta_{n}\right)\) distribution for all \(n \in
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of random variables where \(X_{n}\) has distribution \(F_{n}\) which has mean \(\theta_{n}\) for all \(n \in \mathbb{N}\). Suppose that\[\lim
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of random variables such that \(X_{n}\) has distribution \(F_{n}\) for all \(n \in \mathbb{N}\). Suppose that \(X_{n} \xrightarrow{\text { a.c.
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of random variables such that \(X_{n}\) has a \(\mathrm{N}\left(0, \sigma_{n}^{2}\right)\) distribution, conditional on \(\sigma_{n}\). For each
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of random variables such that \(X_{n}\) has a \(\mathrm{N}\left(\mu_{n}, \sigma_{n}^{2}\right)\) distribution, conditional on \(\mu_{n}\) and
Write a program in \(\mathrm{R}\) that simulates a sequence of independent and identically distributed random variables \(X_{1}, \ldots, X_{100}\) where \(X_{n}\) follows a distribution \(F\) that is
Write a program in \(\mathrm{R}\) that simulates a sequence of independent and identically distributed random variables \(B_{1}, \ldots, B_{500}\) where \(B_{n}\) is a
Write a program in \(\mathrm{R}\) that simulates a sequence of independent random variables \(X_{1}, \ldots, X_{100}\) where \(X_{n}\) has probability distribution function\[f_{n}(x)=
Write a program in \(\mathrm{R}\) that simulates a sequence of independent random variables \(X_{1}, \ldots, X_{100}\) where \(X_{n}\) is a \(\mathrm{N}\left(0, \sigma_{n}^{2}\right)\) random variable
Write a program in \(\mathrm{R}\) that simulates a sequence of independent random variables \(X_{1}, \ldots, X_{100}\) where \(X_{n}\) is a \(\mathrm{N}\left(\mu_{n}, \sigma_{n}^{2}\right)\) random
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent random variables where \(X_{n}\) has a \(\operatorname{Gamma}\left(\theta_{n}, 2\right)\) distribution where
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent random variables where \(X_{n}\) has a \(\operatorname{Bernoulli}\left(\theta_{n}\right)\) distribution where
In the context of Theorem 6.1 (Lindeberg, Lévy, and Feller), prove that Equation (6.3) implies Equation (6.2). Theorem 6.1 (Lindeberg, Lévy, and Feller). Let \(\left\{X_{n}\right\}\) be a sequence of independent random
Prove Corollary 6.2. That is, let \(\left\{\left\{X_{n k}\right\}_{k=1}^{n}\right\}_{n=1}^{\infty}\) be a triangular array where \(X_{11}, \ldots, X_{n 1}\) are mutually independent random variables for
Let \(\left\{\left\{X_{n, k}\right\}_{k=1}^{n}\right\}_{n=1}^{\infty}\) be a triangular array of random variables where \(X_{n, k}\) has a \(\operatorname{Bernoulli}\left(\theta_{n, k}\right)\)
Prove Theorem 6.4. That is, suppose that \(\left\{X_{n}\right\}_{n=1}^{\infty}\) is a sequence of random variables such that \(\sigma_{n}^{-1}\left(X_{n}-\mu\right) \xrightarrow{d} Z\) as \(n \rightarrow
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of random variables such that \(\sigma_{n}^{-1}\left(X_{n}-\mu\right) \xrightarrow{d} Z\) as \(n \rightarrow \infty\) where \(Z\) is a
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of random variables such that \(\sigma_{n}^{-1}\left(X_{n}-\mu\right) \xrightarrow{d} Z\) as \(n \rightarrow \infty\) where \(Z\) is a
Let \(\left\{X_{n}\right\}_{n=1}\) be a set of independent and identically distributed random variables from a distribution with mean \(\mu\) and finite variance \(\sigma^{2}\). Show that\[S^{2}=n^{-1}
In Example 6.6, find \(\boldsymbol{\Lambda}\) and \(\mathbf{d}^{\prime}(\theta) \boldsymbol{\Lambda} \mathbf{d}(\theta)\). Example 6.6. Let {X} be a sequence of independent and identically
Let \(\left\{\mathbf{X}_{n}\right\}\) be a sequence of \(d\)-dimensional random vectors where \(\mathbf{X}_{n} \xrightarrow{d} \mathbf{Z}\) as \(n \rightarrow \infty\) where \(\mathbf{Z}\) has a
Let \(\left\{\mathbf{X}_{n}\right\}\) be a sequence of \(d\)-dimensional random vectors where \(\mathbf{X}_{n} \xrightarrow{d} \mathbf{Z}\) as \(n \rightarrow \infty\) where \(\mathbf{Z}\) has a
Let \(\left\{\mathbf{X}_{n}\right\}\) be a sequence of \(d\)-dimensional random vectors where \(\mathbf{X}_{n} \xrightarrow{d} \mathbf{Z}\) as \(n \rightarrow \infty\) where \(\mathbf{Z}\) has a
Let \(\left\{\mathbf{X}_{n}\right\}_{n=1}^{\infty}\) be a sequence of two-dimensional random vectors where \(\mathbf{X}_{n} \xrightarrow{d} \mathbf{Z}\) as \(n \rightarrow \infty\) where \(\mathbf{Z}\)
Let \(\left\{\mathbf{X}_{n}\right\}_{n=1}^{\infty}\) be a sequence of three-dimensional random vectors where \(\mathbf{X}_{n} \xrightarrow{d} \mathbf{Z}\) as \(n \rightarrow \infty\) where \(\mathbf{Z}\)
Write a program in \(\mathrm{R}\) that simulates 1000 samples of size \(n\) from an Exponential(1) distribution. On each sample compute \(n^{1 / 2}\left(\bar{X}_{n}-1\right)\) and \(n^{1 /
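The simulation the exercise above asks for can be sketched in R; the following is a minimal, hedged version (the sample size \(n\), the seed, and the plotting choices are assumptions, since the exercise statement is truncated here):

```r
# Simulate 1000 samples of size n from an Exponential(1) distribution and
# compute the standardized sample mean n^(1/2) * (xbar - 1) for each sample.
# Exponential(1) has mean 1 and variance 1, so by the central limit theorem
# these values should be approximately N(0, 1) for moderate n.
set.seed(1)   # assumption: the exercise does not specify a seed
n <- 25       # assumption: a representative sample size
b <- 1000
z <- replicate(b, {
  x <- rexp(n, rate = 1)
  sqrt(n) * (mean(x) - 1)
})
# Compare the simulated values to the standard normal density:
hist(z, freq = FALSE, main = "Standardized means, Exponential(1)", xlab = "z")
curve(dnorm(x), add = TRUE)
```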
Write a program in \(\mathrm{R}\) that simulates 1000 observations from a \(\operatorname{Multinomial}(n, 3, \mathbf{p})\) distribution where \(\mathbf{p}^{\prime}=\left(\frac{1}{4}, \frac{1}{4},
Write a program in \(\mathrm{R}\) that simulates 1000 sequences of independent random variables of length \(n\) where the \(k^{\text {th }}\) variable in the sequence has an
Write a program in \(\mathrm{R}\) that simulates 1000 samples of size \(n\) from a \(\operatorname{Uniform}\left(\theta_{1}, \theta_{2}\right)\) distribution where \(n, \theta_{1}\), and \(\theta_{2}\)
Let \(f\) be a real function and define the Fourier norm, as Feller (1971) does, by\[(2 \pi)^{-1} \int_{-\infty}^{\infty}|f(x)| d x\]For a fixed value of \(x\), is this function a norm?
Prove that the Fourier transformation of \(H_{k}(x) \phi(x)\) is \((i t)^{k} \exp \left(-\frac{1}{2} t^{2}\right)\). Hint: Use induction and integration by parts as in the partial proof of Theorem 7.1.
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables where \(X_{n}\) has a \(\operatorname{Gamma}(\alpha, \beta)\) distribution for all
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables where \(X_{n}\) has a \(\operatorname{Beta}(\alpha, \beta)\) distribution for all
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables where \(X_{n}\) has a density that is a mixture of two Normal densities of the form
Prove Theorem 7.8. That is, let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables from a distribution \(F\). Let \(F_{n}(t)=P\left[n^{1 /
Let \(\left\{R_{n}\right\}_{n=1}^{\infty}\) be a sequence of real numbers such that \(R_{n}=o\left(n^{-1}\right)\) as \(n \rightarrow \infty\). Prove that \(R_{n}^{2}=o\left(n^{-1}\right)\) as \(n \rightarrow
Suppose that \(v_{1}(\alpha)\) and \(v_{2}(\alpha)\) are constant with respect to \(n\). Prove that if \(R_{n}=\left[n^{-1 / 2} v_{1}(\alpha)+n^{-1} v_{2}(\alpha)+o\left(n^{-1}\right)\right]^{2}\) then a
Suppose that \(g_{\alpha, n}=v_{0}(\alpha)+n^{-1 / 2} v_{1}(\alpha)+n^{-1} v_{2}(\alpha)+o\left(n^{-1}\right)\) as \(n \rightarrow \infty\) where \(v_{0}(\alpha), v_{1}(\alpha)\), and \(v_{2}(\alpha)\)
Prove that\[\int_{-\infty}^{\infty} \exp (t x) \phi(x) H_{k}(x) d x=t^{k} \exp \left(\frac{1}{2} t^{2}\right)\]
Suppose that \(X_{1}, \ldots, X_{n}\) are a set of independent and identically distributed random variables from a distribution \(F\) that has mean equal to zero, unit variance, and cumulant
Suppose that \(X_{1}, \ldots, X_{n}\) are a set of independent and identically distributed random variables from a distribution \(F\) that has mean equal to \(\theta\), variance equal to
Using Equation (7.28), prove that \(-r_{1}(x)=\frac{1}{6} \rho_{3} H_{2}(x)\) and \[-r_{2}(x)=\frac{1}{24} \rho_{4} H_{3}(x)+\frac{1}{72} \rho_{3}^{2} H_{5}(x) .\] \(\frac{d}{d x} \phi(x) p_{k}(x)=\phi(x) r_{k}(x)\) (7.28)
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables from a distribution \(F\). Let \(F_{n}(t)=P\left[n^{1 / 2}
Let \(\left\{W_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables from a distribution \(F\) with mean \(\eta\) and variance \(\theta\). Prove that
In the context of Example 7.8, let \(\left\{W_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed bivariate random vectors from a distribution \(F\) having mean vector
Prove that the polynomials given in Equations (7.44) and (7.46) reduce to those given in Equations (7.29) and (7.30), when \(\theta\) is taken to be the univariate mean. \(-r_{1}(x)=\frac{1}{6} \rho_{3} H_{2}(x)\) (7.29)
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables from a distribution \(F\) that has density \(f\), characteristic function
Let \(X\) be a random variable with moment generating function \(m(u)\) and cumulant generating function \(c(u)\). Assuming that both functions exist, prove that\[\left.\frac{d^{2}}{d u^{2}}
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables following an Exponential(1) density. a. Prove that if
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables following a \(\operatorname{Gamma}(\alpha, \beta)\) density. a. Find the value of
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables following a \(\operatorname{Wald}(\alpha, \beta)\) density. a. Find the value of
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables following a \(\operatorname{ChiSquared}(\theta)\) distribution. In Example 7.12 we
Write a program in \(\mathrm{R}\) that generates \(b\) samples of size \(n\) from a specified distribution \(F\) (specified below). For each sample compute the statistic \(Z_{n}=n^{1 / 2}
Write a program in \(\mathrm{R}\) that generates \(b\) samples of size \(n\) from a specified distribution \(F\) (specified below). For each sample compute the statistic \(Z_{n}=n^{1 / 2}
Write a program in \(\mathrm{R}\) that generates \(b\) samples of size \(n\) from the linear density \(f(x)=2[\theta+x(1-2 \theta)] \delta\{x ;(0,1)\}\) studied in Example 7.2. Recall that the first
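One way to generate samples from the linear density in the exercise above is inversion; the sketch below derives the quantile function from the CDF \(F(x)=2 \theta x+(1-2 \theta) x^{2}\) on \((0,1)\). The choice \(\theta=0.25\) and the sample size are assumptions, not part of the exercise:

```r
# Inverse-CDF sampling from f(x) = 2[theta + x(1 - 2*theta)] on (0, 1),
# valid for 0 <= theta <= 1. Integrating f gives
# F(x) = 2*theta*x + (1 - 2*theta)*x^2, and solving F(x) = u for x
# (taking the root lying in (0, 1)) yields the quantile function below.
rlinear <- function(n, theta) {
  u <- runif(n)
  if (abs(theta - 0.5) < 1e-12) {
    u  # when theta = 1/2 the density reduces to the Uniform(0, 1) density
  } else {
    (-theta + sqrt(theta^2 + (1 - 2 * theta) * u)) / (1 - 2 * theta)
  }
}
x <- rlinear(1000, theta = 0.25)  # theta and the sample size are assumptions
```

As a check on the algebra, \(\theta=0\) gives the quantile function \(\sqrt{u}\) for the density \(2x\), and \(\theta=1\) gives \(1-\sqrt{1-u}\) for the density \(2(1-x)\).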
Write a program in \(\mathrm{R}\) that generates \(b\) samples of size \(n\) from a density that is a mixture of two Normal densities given by \(f(x)=\frac{1}{2} \phi(x)+\frac{1}{2} \phi(x-\theta)\)
Write a program in \(\mathrm{R}\) that generates \(b\) samples of size \(n\) from a specified distribution \(F\) (specified below). For each sample compute the approximate \(100 \alpha \%\)
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables following a \(\mathrm{N}\left(\mu, \sigma^{2}\right)\) distribution. A saddlepoint
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables following a \(\operatorname{ChiSquared}(\theta)\) distribution. A saddlepoint
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent random variables where \(X_{n}\) has a \(\mathrm{N}\left(0, n^{-1}\right)\) distribution for all \(n \in \mathbb{N}\). Prove that
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) and \(\left\{Y_{n}\right\}_{n=1}^{\infty}\) be sequences of random variables. Suppose that \(X_{n}=o_{p}\left(Y_{n}\right)\) as \(n \rightarrow \infty\). Prove
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables where \(X_{n}=n^{-1} U_{n}\) where \(U_{n}\) has a \(\operatorname{Uniform}(0,1)\)
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) and \(\left\{Y_{n}\right\}_{n=1}^{\infty}\) be sequences of independent random variables. Suppose that \(Y_{n}\) is a \(\operatorname{Beta}\left(\alpha_{n},
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) and \(\left\{Y_{n}\right\}_{n=1}^{\infty}\) be sequences of independent random variables. Suppose that \(Y_{n}\) is a \(\operatorname{Poisson}(\theta)\) random
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent random variables where \(X_{n}\) has a \(\operatorname{Gamma}\left(\alpha_{n}, \beta_{n}\right)\) distribution for all \(n \in
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent random variables where \(X_{n}\) has a \(\operatorname{Geometric}\left(\theta_{n}\right)\) distribution where
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) and \(\left\{Y_{n}\right\}_{n=1}^{\infty}\) be sequences of independent random variables, where \(X_{n}\) has a \(\operatorname{Uniform}(0, n)\) distribution
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) and \(\left\{Y_{n}\right\}_{n=1}^{\infty}\) be sequences of random variables and let \(\left\{y_{n}\right\}_{n=1}^{\infty}\) be a sequence of real numbers. a.
Suppose that \(\left\{W_{n}\right\}_{n=1}^{\infty}\) is a sequence of independent random variables such that \(W_{n}\) has a \(\mathrm{N}\left(\theta, \sigma^{2}\right)\) distribution for all \(n \in
Let \(\left\{B_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent random variables where \(B_{n}\) has a \(\operatorname{Bernoulli}(\theta)\) distribution for all \(n \in \mathbb{N}\). Define a
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables from a distribution \(F\) with \(k^{\text {th }}\) central moment given by
Write a program in \(\mathrm{R}\) that first simulates 1000 observations from a \(\operatorname{Poisson}(10)\) distribution. For each observation, simulate a \(\operatorname{Binomial}\left(n,
Write a program in \(\mathrm{R}\) that simulates two sequences of random variables. The first sequence is given by \(X_{1}, \ldots, X_{100}\) where \(X_{n}\) has a \(\operatorname{Uniform}(0, n)\)
Write a program in \(\mathrm{R}\) that simulates 1000 samples of size 100 from a distribution \(F\), where \(F\) is specified below. For each sample compute the second and third sample central
Write a program in \(\mathrm{R}\) that simulates samples of size \(n=1, \ldots, 100\) from a distribution \(F\), where \(F\) is specified below. For each sample compute the fourth sample central
Write a program in \(\mathrm{R}\) that simulates the sequence \(\left\{X_{n}\right\}_{n=1}^{100}\) where \(X_{n}\) has a \(\operatorname{Geometric}\left(\theta_{n}\right)\) distribution where the
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables from a distribution \(F\). Let \(\theta\) be defined as the \(p^{\text {th }}\)
Consider a functional of the form\[T(F)=\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h\left(x_{1}, x_{2}\right) d F\left(x_{1}\right) d F\left(x_{2}\right)\]where \(F \in \mathcal{F}\), a collection of
Consider the skewness functional given by\[T(F)=\int_{-\infty}^{\infty}\left[t-\int_{-\infty}^{\infty} t d F(t)\right]^{3} d F(t)\]where \(F \in \mathcal{F}\), a collection of distribution functions
Consider the \(k^{\text {th }}\) moment functional given by\[T(F)=\int_{-\infty}^{\infty} t^{k} d F(t)\]where \(F \in \mathcal{F}\), a collection of distribution functions where \(\mu_{k}^{\prime}\)a.
Consider the \(k^{\text {th }}\) central moment functional given by\[T(F)=\int_{-\infty}^{\infty}\left[t-\int_{-\infty}^{\infty} t d F(t)\right]^{k} d F(t)\]where \(F \in \mathcal{F}\), a collection of
Let \(F\) be a symmetric and continuous distribution function and consider the functional parameter \(\theta\) that corresponds to the expectation\[(1-2 \alpha)^{-1} E\left(X \delta\left\{X
Let \(X\) be a random variable following a distribution \(F(x \mid \theta)\) where \(\theta \in \Omega\). Assume that \(F(x \mid \theta)\) has continuous density \(f(x \mid \theta)\). The maximum
Let \(G \in \mathcal{F}\) and let \(F\) be a fixed distribution from \(\mathcal{F}\). Consider a functional of the form\[I(G)=\int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} t\left(x_{1},
Consider a functional of the form\[T(F)=\int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} h\left(x_{1}, \ldots, x_{r}\right) \prod_{i=1}^{r} d F\left(x_{i}\right)\]where \(F \in \mathcal{F}\), a
Let \(t\left(x_{1}, \ldots, x_{r}\right)\) be a real valued function. Prove that\[I(G)=\int_{-\infty}^{\infty} \ldots \int_{-\infty}^{\infty} t\left(x_{1}, \ldots, x_{r}\right) \prod_{i=1}^{r}
Write a program in \(\mathrm{R}\) that simulates 100 samples of size \(n\) from distributions that are specified below. Let \(T(F)\) be the variance functional described in Example 9.8. For each
Showing 1 - 100 of 1041