Questions and Answers of Introduction To Statistical Investigations
Write a program in \(\mathrm{R}\) that simulates 100 samples of size \(n\) from distributions that are specified below. Let \(T(F)\) correspond to the quantile functional so that
Prove that \(\operatorname{MSE}(\hat{\theta}, \theta)\) can be decomposed into two parts given by\[\operatorname{MSE}(\hat{\theta}, \theta)=\operatorname{Bias}^{2}(\hat{\theta},
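The identity requested here is the standard bias-variance decomposition; a sketch of the argument, writing \(\mu=E[\hat{\theta}]\):
\[
\operatorname{MSE}(\hat{\theta}, \theta)=E\left[(\hat{\theta}-\theta)^{2}\right]=E\left[(\hat{\theta}-\mu)^{2}\right]+2(\mu-\theta) E[\hat{\theta}-\mu]+(\mu-\theta)^{2}=\operatorname{Var}(\hat{\theta})+\operatorname{Bias}^{2}(\hat{\theta}, \theta),
\]
since \(E[\hat{\theta}-\mu]=0\) eliminates the cross term.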
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables from a distribution \(F\) with mean \(\theta\) and variance \(\sigma^{2}\). Suppose
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables from a distribution \(F\) with mean \(\theta\) and finite variance \(\sigma^{2}\).
Let \(B_{1}, \ldots, B_{n}\) be a set of independent and identically distributed random variables from a \(\operatorname{Bernoulli}(\theta)\) distribution. Suppose we are interested in estimating the
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from an \(\operatorname{Exponential}(\theta)\) distribution. Suppose that we are interested in
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a \(\operatorname{Poisson}(\theta)\) distribution. Suppose that we are interested in estimating
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables following a \(\mathrm{N}\left(\theta, \sigma^{2}\right)\) distribution and consider
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a \(\mathrm{N}\left(\theta, \sigma^{2}\right)\) distribution where \(\sigma^{2}\) is finite. Let
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables from a \(\operatorname{Laplace}(\theta, 1)\) distribution. Let \(\hat{\theta}_{n}\)
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables following a mixture of two NORMAL distributions, with a density given by \(f(x)=\frac{1}{2}
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a \(\operatorname{Poisson}(\theta)\) distribution. Consider two estimators of
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables following a \(\mathrm{N}(\theta, 1)\) distribution. a. Prove that if \(\theta \neq 0\) then
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a \(\operatorname{Gamma}(2, \theta)\) distribution. a. Find the maximum likelihood estimator for
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a \(\operatorname{Poisson}(\theta)\) distribution. a. Find the maximum likelihood estimator for
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a \(\mathrm{N}(\theta, 1)\) distribution. a. Find the maximum likelihood estimator for
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a \(\mathrm{N}(0, \theta)\) distribution. a. Find the maximum likelihood estimator for
Consider a sequence of random variables \(\left\{\left\{X_{i j}\right\}_{j=1}^{k}\right\}_{i=1}^{n}\) that are assumed to be mutually independent, each having a \(\mathrm{N}\left(\mu_{i}, \theta\right)\)
Consider a sequence of independent and identically distributed \(d\)-dimensional random vectors \(\left\{\mathbf{X}_{n}\right\}_{n=1}^{\infty}\) from a \(d\)-dimensional distribution \(F\). Assume the
Suppose that \(X_{1}, \ldots, X_{n}\) is a set of independent and identically distributed random variables from a distribution \(F\) with parameter \(\theta\). Suppose that \(F\) and \(\theta\) fall
Let \(X_{1}, \ldots, X_{n}\) be a sequence of independent and identically distributed random variables from a distribution \(F\) with parameter \(\theta\) and assume the framework of the smooth
Prove that the coverage probability of a \(100 \alpha \%\) upper confidence limit that has an asymptotic expansion of the form\[\hat{\theta}_{n}(\alpha)=\hat{\theta}_{n}+n^{-1 / 2} \hat{\sigma}_{n}
Let \(X_{1}, \ldots, X_{n}\) be a sequence of independent and identically distributed random variables from a distribution \(F\) with parameter \(\theta\) and assume the framework of the smooth
Let \(X_{1}, \ldots, X_{n}\) be a sequence of independent and identically distributed random variables from a distribution \(F\) with parameter \(\theta\) and assume the framework of the smooth
Efron (1981) attempted to improve the properties of the backwards method by adjusting the confidence coefficient to remove some of the bias from the method. The resulting method, called the
Consider the problem of testing the null hypothesis \(H_{0}: \theta \leq \theta_{0}\) against the alternative hypothesis \(H_{1}: \theta>\theta_{0}\) using the test statistic \(Z_{n}=n^{1 / 2}
Let \(\left\{F_{n}\right\}_{n=1}^{\infty}\) be a sequence of distribution functions such that \(F_{n} \leadsto F\) as \(n \rightarrow \infty\) for some distribution function \(F\). Let
Let \(B_{1}, \ldots, B_{n}\) be a sequence of independent and identically distributed random variables from a \(\operatorname{Bernoulli}(\theta)\) distribution where the parameter space of \(\theta\)
Let \(U_{1}, \ldots, U_{n}\) be a sequence of independent and identically distributed random variables from a \(\operatorname{Uniform}(0, \theta)\) distribution where the parameter space for
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a distribution \(F\) with parameter \(\theta\). Consider testing the null hypothesis \(H_{0}:
Consider the framework of the smooth function model where \(\sigma\), which denotes the asymptotic variance of \(n^{1 / 2} \hat{\theta}_{n}\), is known. Consider using the test statistic \(Z_{n}=n^{1
Consider the framework of the smooth function model where \(\sigma\), which denotes the asymptotic variance of \(n^{1 / 2} \hat{\theta}_{n}\), is unknown, and the test statistic \(T_{n}=n^{1 / 2}
Let \(X_{1}, \ldots, X_{n}\) be a sequence of independent and identically distributed random variables that have an \(\operatorname{Exponential}(\theta)\) distribution for all \(n \in \mathbb{N}\).
In the context of the proof of Theorem 10.12, prove that \(\Lambda_{n}=n\left(\hat{\theta}_{n}-\theta_{0}\right)^{2} I\left(\theta_{0}\right)+o_{p}\left(n^{-1}\right)\), as \(n \rightarrow \infty\).
Under the assumptions outlined in Theorem 10.11, show that Wald's statistic, which is given by \(Q=n\left(\hat{\theta}_{n}-\theta_{0}\right)^{2} I\left(\hat{\theta}_{n}\right)\) where
Under the assumptions outlined in Theorem 10.11, show that Rao's efficient score statistic, which is given by \(Q=n^{-1} U_{n}^{2}\left(\theta_{0}\right) I^{-1}\left(\theta_{0}\right)\) has an asymptotic
Under the assumptions outlined in Theorem 10.11, show that Wald's statistic, \(Q=n\left(\hat{\theta}_{n}-\theta_{0}\right)^{2} I\left(\hat{\theta}_{n}\right)\), has an asymptotic ChiSquared \(\left[1,
Under the assumptions outlined in Theorem 10.11, show that Rao's efficient score statistic, which is given by \(Q=n^{-1} U_{n}^{2}\left(\theta_{0}\right) I^{-1}\left(\theta_{0}\right)\) has an asymptotic
Suppose that \(X_{1}, \ldots, X_{n}\) is a set of independent and identically distributed random variables from a continuous distribution \(F\). Let \(\xi \in(0,1)\) and define
Suppose \(X_{1}, \ldots, X_{n}\) is a set of independent and identically distributed random variables from a \(\operatorname{Poisson}(\theta)\) distribution where \(\theta \in \Omega=(0, \infty)\).
Suppose \(X_{1}, \ldots, X_{n}\) is a random sample from an Exponential location family of densities of the form \(f(x)=\exp [-(x-\theta)] \delta\{x ;[\theta, \infty)\}\), where \(\theta \in
Suppose \(X_{1}, \ldots, X_{n}\) is a random sample from a \(\operatorname{Uniform}(0, \theta)\) density where \(\theta \in \Omega=(0, \infty)\). a. Find a \(100 \alpha \%\) confidence interval for
Let \(\mathbf{X}_{1}, \ldots, \mathbf{X}_{n}\) be a set of independent and identically distributed \(d\)-dimensional random vectors from a distribution \(F\) with real-valued parameter \(\theta\)
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a \(\mathrm{N}\left(\theta, \sigma^{2}\right)\) distribution conditional on \(\theta\), where
Let \(X\) be a single observation from a discrete distribution with probability distribution function\[f(x \mid \theta)= \begin{cases}\frac{1}{4} \theta & x \in\{-2,-1,1,2\} \\ 1-\theta & x=0 \\ 0 &
Let \(X\) be a single observation from a discrete distribution with probability distribution function\[f(x \mid \theta)= \begin{cases}n^{-1} \theta & x \in\{1,2, \ldots, n\} \\ 1-\theta & x=0 \\ 0 &
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a \(\operatorname{Poisson}(\theta)\) distribution and let \(\theta\) have a
Write a program in \(\mathrm{R}\) that will simulate \(b=1000\) samples of size \(n\) from a \(\mathrm{T}(u)\) distribution. For each sample compute the sample mean, the sample mean with \(5 \%\)
Write a program in \(\mathrm{R}\) that simulates 1000 samples of size \(n\) from a \(\operatorname{Poisson}(\theta)\) distribution, where \(n\) and \(\theta\) are specified below. For each sample
Write a program in \(\mathrm{R}\) that simulates 1000 samples of size \(n\) from a distribution \(F\) with mean \(\theta\), where both \(n\) and \(F\) are specified below. For each sample compute two
Write a program in \(\mathrm{R}\) that simulates 1000 samples of size \(n\) from a \(\mathrm{N}(\theta, 1)\) distribution. For each sample compute the sample mean given by \(\bar{X}_{n}\) and the
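The exercise itself asks for R; as a minimal sketch of the simulation loop it describes, here is an equivalent NumPy version. The values of \(n\) and \(\theta\) are placeholder choices, since the exercise specifies them separately.

```python
import numpy as np

# Sketch of the Monte Carlo experiment described above (the text asks for R;
# this is an analogous NumPy version). n and theta are placeholders only.
rng = np.random.default_rng(0)
b, n, theta = 1000, 25, 2.0
samples = rng.normal(loc=theta, scale=1.0, size=(b, n))  # b samples of size n from N(theta, 1)
xbar = samples.mean(axis=1)                              # sample mean of each simulated sample
print(xbar.shape)
```

The same pattern (simulate a `(b, n)` array, reduce along `axis=1`) covers the other simulation exercises in this list.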
Write a program in \(\mathrm{R}\) that simulates 1000 samples of size \(n\) from a distribution \(F\) with mean \(\theta\) where \(n, \theta\) and \(F\) are specified below. For each sample compute
Write a program in \(\mathrm{R}\) that simulates 1000 samples of size \(n\) from a distribution \(F\) with mean \(\theta\), where \(n, F\), and \(\theta\) are specified below. For each sample test
The interpretation of frequentist results of Bayes estimators is somewhat difficult because of the sometimes conflicting views of the resulting theoretical properties. This experiment will look at
Let \(\left\{X_{n}\right\}_{n=1}^{\infty}\) be a sequence of independent and identically distributed random variables from a distribution \(F \in \mathcal{F}\) where \(\mathcal{F}\) is the collection
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a continuous distribution \(F\) that is symmetric about a point \(\theta\). Let \(\mathbf{R}\)
Let \(S\) be a linear rank statistic of the form\[S=\sum_{i=1}^{n} c(i) a\left(r_{i}\right)\]If \(\mathbf{R}\) is a vector whose elements correspond to a random permutation of the integers in the set
Consider the rank sum test statistic from Example 11.2, which is a linear rank statistic with \(a(i)=i\) and \(c(i)=\delta\{i ;\{m+1, \ldots, n+m\}\}\) for all \(i=1, \ldots, n+m\). Under the null
Consider the median test statistic described in Example 11.14, which is a linear rank statistic with \(a(i)=\delta\left\{i ;\left\{\frac{1}{2}(m+n+1), \ldots, m+n\right\}\right\}\) and \(c(i)=\delta\{i
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a distribution \(F\) with continuous and bounded density \(f\), that is assumed to be symmetric
Prove that\[\int_{-\infty}^{\infty} f^{2}(x) d x\]equals \(\frac{1}{2} \pi^{-1 / 2}, 1, \frac{1}{4}, \frac{1}{6}\), and \(\frac{2}{3}\) for the \(\mathrm{N}(0,1),
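As a quick numerical sanity check of the first listed value (the other cases can be checked the same way): for the \(\mathrm{N}(0,1)\) density \(\phi\), the integral of \(\phi^{2}\) is \(\frac{1}{2} \pi^{-1 / 2} \approx 0.2821\).

```python
import numpy as np

# Numerical check that the squared N(0,1) density integrates to 1/(2*sqrt(pi)).
x = np.linspace(-10.0, 10.0, 200001)
phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
dx = x[1] - x[0]
integral = float(np.sum(phi**2) * dx)  # simple Riemann sum; tails beyond +/-10 are negligible
print(integral)
```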
Prove that the square efficacy of the \(t\)-test equals \(1, 12, \frac{1}{2}, 3 \pi^{-2}\), and 6 for the \(\mathrm{N}(0,1), \operatorname{Uniform}\left(-\frac{1}{2}, \frac{1}{2}\right),
Prove that the square efficacy of the sign test equals \(2 \pi^{-1}, 4, 1, \frac{1}{4}\), and 4 for the \(\mathrm{N}(0,1), \operatorname{Uniform}\left(-\frac{1}{2}, \frac{1}{2}\right)\), Laplace
Consider the density \(f(x)=\frac{3}{20} 5^{-1 / 2}\left(5-x^{2}\right) \delta\left\{x ;\left(-5^{1 / 2}, 5^{1 / 2}\right)\right\}\). Prove that \(E_{V}^{2} E_{T}^{-2} \simeq 0.864\), which is a lower
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a discrete distribution with distribution function \(F\) and probability distribution function
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a discrete distribution with distribution function \(F\) and probability distribution function
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a distribution \(F\) with continuous density \(f\). Prove that the histogram estimate with fixed
Prove that the mean integrated squared error can be written as the sum of the integrated square bias and the integrated variance. That is, prove that \(\operatorname{MISE}\left(\bar{f}_{n},
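The argument mirrors the pointwise MSE decomposition; a sketch, assuming the order of expectation and integration may be exchanged (Tonelli's theorem, the integrand being nonnegative):
\[
\operatorname{MISE}\left(\bar{f}_{n}, f\right)=E\left[\int_{-\infty}^{\infty}\left\{\bar{f}_{n}(x)-f(x)\right\}^{2} d x\right]=\int_{-\infty}^{\infty} \operatorname{Bias}^{2}\left[\bar{f}_{n}(x)\right] d x+\int_{-\infty}^{\infty} \operatorname{Var}\left[\bar{f}_{n}(x)\right] d x .
\]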
Using the fact that the pointwise bias of the histogram is given by,\[\operatorname{Bias}\left[\bar{f}_{n}(x)\right]=\frac{1}{2} f^{\prime}(x)\left[h-2\left(x-g_{i}\right)\right]+O\left(h^{2}\right),\]as \(h \rightarrow 0\)
Let \(f\) be a density with at least two continuous and bounded derivatives and let \(g_{i}
Given that the asymptotic mean integrated squared error for the histogram with bin width \(h\) is given by\[\operatorname{AMISE}\left(\bar{f}_{n}, f\right)=(n h)^{-1}+\frac{1}{12} h^{2}
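Assuming the usual completion of this expansion, \(\operatorname{AMISE}\left(\bar{f}_{n}, f\right)=(n h)^{-1}+\frac{1}{12} h^{2} R\left(f^{\prime}\right)\) with \(R\left(f^{\prime}\right)=\int\left\{f^{\prime}(x)\right\}^{2} d x\), the optimal bin width follows by setting the derivative in \(h\) to zero:
\[
-\frac{1}{n h^{2}}+\frac{1}{6} h R\left(f^{\prime}\right)=0 \quad \Longrightarrow \quad h_{\mathrm{opt}}=\left\{\frac{6}{n R\left(f^{\prime}\right)}\right\}^{1 / 3},
\]
which yields \(\operatorname{AMISE}=O\left(n^{-2 / 3}\right)\).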
Let \(K\) be any non-decreasing right-continuous function such that\[\begin{gathered}\lim _{t \rightarrow \infty} K(t)=1 \\\lim _{t \rightarrow-\infty} K(t)=0\end{gathered}\]and\[\int_{-\infty}^{\infty}
Use the fact that the pointwise bias of the kernel density estimator with bandwidth \(h\) is given by\[\operatorname{Bias}\left[\tilde{f}_{n, h}(x)\right]=\frac{1}{2} h^{2} f^{\prime \prime}(x)
Using the fact that the asymptotic mean integrated squared error of the kernel estimator with bandwidth \(h\) is given by,\[\operatorname{AMISE}\left(\tilde{f}_{n, h}, f\right)=(n h)^{-1}
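Assuming the usual completion \(\operatorname{AMISE}\left(\tilde{f}_{n, h}, f\right)=(n h)^{-1} R(k)+\frac{1}{4} h^{4} \sigma_{k}^{4} R\left(f^{\prime \prime}\right)\), the same calculus argument as in the histogram case gives
\[
h_{\mathrm{opt}}=\left\{\frac{R(k)}{n \sigma_{k}^{4} R\left(f^{\prime \prime}\right)}\right\}^{1 / 5},
\]
so the best achievable rate for the kernel estimator is \(\operatorname{AMISE}=O\left(n^{-4 / 5}\right)\).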
Consider the Epanechnikov kernel given by \(k(t)=\frac{3}{4}\left(1-t^{2}\right) \delta\{t ;[-1,1]\}\). Prove that \(\sigma_{k} R(k)=3 /(5 \sqrt{5})\).
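Both factors can be computed directly from \(k(t)=\frac{3}{4}\left(1-t^{2}\right)\) on \([-1,1]\):
\[
R(k)=\int_{-1}^{1} \frac{9}{16}\left(1-t^{2}\right)^{2} d t=\frac{9}{16} \cdot \frac{16}{15}=\frac{3}{5}, \qquad \sigma_{k}^{2}=\int_{-1}^{1} t^{2} \cdot \frac{3}{4}\left(1-t^{2}\right) d t=\frac{3}{4}\left(\frac{2}{3}-\frac{2}{5}\right)=\frac{1}{5},
\]
so \(\sigma_{k} R(k)=5^{-1 / 2} \cdot \frac{3}{5}=3 /(5 \sqrt{5})\).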
Compute the efficiency of each of the kernel functions given below relative to the Epanechnikov kernel. a. The Biweight kernel function, given by \(\frac{15}{16}\left(1-t^{2}\right)^{2} \delta\{t
Let \(\hat{f}_{n}(t)\) denote a kernel density estimator with kernel function \(k\) computed on a sample \(X_{1}, \ldots, X_{n}\). Prove that,\[E\left[\int_{-\infty}^{\infty} \hat{f}_{n}(t) f(t) d
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a distribution \(F\) with mean \(\theta\). Let \(R_{n}(\hat{\theta}, \theta)=n^{1 /
Let \(X_{1}, \ldots, X_{n}\) be a set of independent and identically distributed random variables from a distribution \(F\) with mean \(\mu\). Define \(\theta=g(\mu)=\mu^{2}\), and let
Let \(\mathbf{X}_{1}, \ldots, \mathbf{X}_{n}\) be a set of two-dimensional independent and identically distributed random vectors from a distribution \(F\) with mean vector \(\boldsymbol{\mu}\). Let
In the context of the development of the bias corrected and accelerated bootstrap confidence interval, prove
Write a program in \(\mathrm{R}\) that simulates 1000 samples of size \(n\) from a distribution \(F\) with location parameter \(\theta\), where \(n, F\) and \(\theta\) are specified below. For each
Write a program in \(\mathrm{R}\) that simulates five samples of size \(n\) from a distribution \(F\), where \(n\) and \(F\) are specified below. For each sample compute a histogram estimate of the
Write a program in \(\mathrm{R}\) that simulates five samples of size \(n\) from a distribution \(F\), where \(n\) and \(F\) are specified below. For each sample compute a kernel density estimate of
Let \(\left\{x_{n}\right\}_{n=1}^{\infty}\) be a sequence of real numbers defined by\[x_{n}=\left\{\begin{array}{rl}-1 & n=1+3(k-1), k \in \mathbb{N} \\0 & n=2+3(k-1), k \in \mathbb{N} \\1 &
Let \(\left\{x_{n}\right\}_{n=1}^{\infty}\) be a sequence of real numbers defined by\[x_{n}=\frac{n}{n+1}-\frac{n+1}{n},\]for all \(n \in \mathbb{N}\). Compute\[\liminf _{n \rightarrow \infty}
Let \(\left\{x_{n}\right\}_{n=1}^{\infty}\) be a sequence of real numbers defined by \(x_{n}=n^{(-1)^{n}-n}\) for all \(n \in \mathbb{N}\). Compute\[\liminf _{n \rightarrow \infty} x_{n},\]and\[\limsup
Let \(\left\{x_{n}\right\}_{n=1}^{\infty}\) be a sequence of real numbers defined by \(x_{n}=n 2^{-n}\), for all \(n \in \mathbb{N}\). Compute\[\liminf _{n \rightarrow \infty} x_{n}\]and\[\limsup _{n
Each of the sequences given below converges to zero. Specify the smallest value of \(n_{\varepsilon}\) so that \(\left|x_{n}\right|<\varepsilon\) for all \(n>n_{\varepsilon}\) as a function of \(\varepsilon\). a.
Let \(\left\{x_{n}\right\}_{n=1}^{\infty}\) and \(\left\{y_{n}\right\}_{n=1}^{\infty}\) be sequences of real numbers such that\[\lim _{n \rightarrow \infty} x_{n}=x\]and\[\lim _{n \rightarrow \infty}
Let \(\left\{x_{n}ight\}_{n=1}^{\infty}\) and \(\left\{y_{n}ight\}_{n=1}^{\infty}\) be sequences of real numbers such that \(x_{n} \leq y_{n}\) for all \(n \in \mathbb{N}\). Prove that if the limit
Let \(\left\{x_{n}\right\}_{n=1}^{\infty}\) and \(\left\{y_{n}\right\}_{n=1}^{\infty}\) be sequences of real numbers such that\[\lim _{n \rightarrow \infty}\left(x_{n}+y_{n}\right)=s\]and\[\lim _{n \rightarrow
Find the supremum and infimum limits for each sequence given below. a. \(x_{n}=(-1)^{n}\left(1+n^{-1}\right)\) b. \(x_{n}=(-1)^{n}\) c. \(x_{n}=(-1)^{n} n\) d. \(x_{n}=n^{2} \sin ^{2}\left(\frac{1}{2} n
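For intuition on part a (a numerical look, not a proof): the even and odd subsequences of \(x_{n}=(-1)^{n}\left(1+n^{-1}\right)\) approach \(1\) and \(-1\), which are the supremum and infimum limits.

```python
import numpy as np

# Tail behavior of x_n = (-1)^n (1 + 1/n): the supremum limit should be 1
# and the infimum limit -1, approached along even and odd n respectively.
n = np.arange(1, 10001)
x = (-1.0) ** n * (1 + 1 / n)
tail = x[5000:]                  # look past the early terms
print(tail.max(), tail.min())
```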
Let \(\left\{x_{n}\right\}_{n=1}^{\infty}\) be a sequence of real numbers. a. Prove that\[\inf _{n \in \mathbb{N}} x_{n} \leq \liminf _{n \rightarrow \infty} x_{n} \leq \limsup _{n \rightarrow \infty} x_{n}
Let \(\left\{x_{n}\right\}_{n=1}^{\infty}\) and \(\left\{y_{n}\right\}_{n=1}^{\infty}\) be sequences of real numbers such that \(x_{n} \leq y_{n}\) for all \(n \in \mathbb{N}\). Prove that\[\liminf _{n
Let \(\left\{x_{n}\right\}_{n=1}^{\infty}\) and \(\left\{y_{n}\right\}_{n=1}^{\infty}\) be sequences of real numbers such that\[\left|\limsup _{n \rightarrow \infty} x_{n}\right|
Let \(\left\{x_{n}\right\}_{n=1}^{\infty}\) and \(\left\{y_{n}\right\}_{n=1}^{\infty}\) be sequences of real numbers such that \(x_{n}>0\) and \(y_{n}>0\) for all \(n \in \mathbb{N}\),\[0
Let \(\left\{f_{n}(x)\right\}_{n=1}^{\infty}\) and \(\left\{g_{n}(x)\right\}_{n=1}^{\infty}\) be sequences of real valued functions that converge pointwise to the real functions \(f\) and \(g\),
Let \(\left\{f_{n}(x)\right\}_{n=1}^{\infty}\) and \(\left\{g_{n}(x)\right\}_{n=1}^{\infty}\) be sequences of real valued functions that converge uniformly on \(\mathbb{R}\) to the real functions \(f\)
Let \(\left\{f_{n}(x)\right\}_{n=1}^{\infty}\) be a sequence of real functions defined by \(f_{n}(x)=\frac{1}{2} n \delta\left\{x ;\left(n-n^{-1}, n+n^{-1}\right)\right\}\) for all \(n \in
Showing 100-200 of 1041