Testing Statistical Hypotheses, 3rd Edition (Erich L. Lehmann, Joseph P. Romano): Solutions
Let X and Y be independently distributed with Poisson distributions P(λ) and P(µ). Find the power of the UMP unbiased test of H : µ ≤ λ against the alternatives λ = .1, µ = .2; λ = 1, µ = 2; λ = 10, µ = 20; λ = .1, µ = .4; at level of significance α = .1. [Since T = X + Y has the Poisson distribution P(λ + µ) and the conditional distribution of Y given T = t is the binomial b(µ/(λ + µ), t), the test reduces to a one-sided conditional binomial test of p = 1/2.]
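For the numerical part, the following sketch (assuming SciPy's binom and poisson, and the conditional-binomial reduction noted above) averages the randomized conditional test over the Poisson distribution of T; the truncation point t_max is an assumption, chosen large relative to the means involved.

```python
import numpy as np
from scipy.stats import binom, poisson

ALPHA = 0.1

def cond_test(t, alpha=ALPHA, p0=0.5):
    """Critical value C and randomization gamma for the one-sided
    conditional binomial test of p <= p0 given T = t:
    reject if Y > C, reject with probability gamma if Y = C."""
    c = np.arange(t + 1)
    tail = binom.sf(c, t, p0)               # P(Y > c) under p0
    C = int(c[tail <= alpha][0])            # smallest c with tail prob <= alpha
    gamma = (alpha - binom.sf(C, t, p0)) / binom.pmf(C, t, p0)
    return C, gamma

def power(lam, mu, alpha=ALPHA, t_max=400):
    """Unconditional power of the UMP unbiased test of H: mu <= lam."""
    p_alt = mu / (lam + mu)                 # conditional success probability
    total = 0.0
    for t in range(t_max + 1):
        C, gamma = cond_test(t, alpha)
        cond_pow = binom.sf(C, t, p_alt) + gamma * binom.pmf(C, t, p_alt)
        total += poisson.pmf(t, lam + mu) * cond_pow
    return total

for lam, mu in [(0.1, 0.2), (1, 2), (10, 20), (0.1, 0.4)]:
    print(f"lambda={lam}, mu={mu}: power ~ {power(lam, mu):.4f}")
```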
Negative binomial. Let X, Y be independently distributed according to negative binomial distributions Nb(p1, m) and Nb(p2, n) respectively, and let qi = 1 − pi. (i) There exists a UMP unbiased test for testing H : θ = q2/q1 ≤ θ0, and hence in particular H : p1 ≤ p2. (ii) Determine the …
The singly truncated normal (STN) distribution, indexed by parameters ν and λ, has support on the positive real line with density p(x; ν, λ) = C(ν, λ) exp(−νx − λx²), where C(ν, λ) is a normalizing constant. Based on an i.i.d. sample, show there exists a UMPU test of the null hypothesis …
The UMP unbiased tests of the hypotheses H1,...,H4 of Theorem 4.4.1 are unique if attention is restricted to tests depending on U and the T’s.
Continuation. The function φ4 defined by (4.16), (4.18), and (4.19) is jointly measurable in u and t. [The proof, which otherwise is essentially like that outlined in the preceding problem, requires the measurability in z and t of the integral g(z, t) = ∫_{−∞}^{z} u dFt(u). This integral is …]
Measurability of tests of Theorem 4.4.1. The function φ3 defined by (4.16) and (4.17) is jointly measurable in u and t. [With C1 = v and C2 = w, the determining equations for v, w, γ1, γ2 are Ft(v−) + [1 − Ft(w)] + γ1[Ft(v) − Ft(v−)] + γ2[Ft(w) − Ft(w−)] = α (4.26) and Gt(v−) + [1 …]
Suppose P{I = 1} = p = 1 − P{I = 2}. Given I = i, X ∼ N(θ, σi²), where σ1² < σ2² are known. If p = 1/2, show that, based on the data (X, I), there does not exist a UMP test of θ = 0 vs θ > 0. However, if p is also unknown, show a UMPU test exists. [See Examples 10.20-21 in Romano and …]
Random sample size. Let N be a random variable with a power-series distribution P(N = n) = a(n)λⁿ / C(λ), n = 0, 1, ... (λ > 0, unknown). When N = n, a sample X1, ..., Xn from the exponential family (3.19) is observed. On the basis of (N, X1, ..., XN) there exists a UMP unbiased test of H : Q(θ) ≤ …
Let X, Y, Z be independent Poisson variables with means λ, µ, ν. Then there exists a UMP unbiased test of H : λµ ≤ ν².
Let Xi (i = 1, 2) be independently distributed according to distributions from the exponential families (3.19) with C, Q, T, and h replaced by Ci, Qi, Ti, and hi. Then there exists a UMP unbiased test of (i) H : Q2(θ2) − Q1(θ1) ≤ c, and hence in particular of Q2(θ2) ≤ Q1(θ1); (ii) H : Q2(θ2) …
Let X1, ..., Xn be a sample from the uniform distribution over the integers 1, ..., θ and let a be a positive integer. (i) The sufficient statistic X(n) is complete when the parameter space is Ω = {θ : θ ≤ a}. (ii) Show that X(n) is not complete when Ω = {θ : θ ≥ a}, a ≥ 2, and find a …
Let X, Y be independent binomial b(p, m) and b(p², n) respectively. Determine whether (X, Y) is complete when (i) m = n = 1, (ii) m = 2, n = 1.
Determine whether T is complete for each of the following situations: (i) X1, ..., Xn are independently distributed according to the uniform distribution over the integers 1, 2, ..., θ and T = max(X1, ..., Xn). (ii) X takes on the values 1, 2, 3, 4 with probabilities pq, p²q, pq², 1 − 2pq respectively, and …
The completeness of the order statistics in Example 4.3.4 remains true if the family F is replaced by the family F1 of all continuous distributions. [Due to Fraser (1956). To show that for any integrable symmetric function φ, ∫···∫ φ(x1, ..., xn) dF(x1) ··· dF(xn) = 0 for all continuous F implies φ = 0 …]
Counterexample. Let X be a random variable taking on the values −1, 0, 1, 2, ... with probabilities Pθ{X = −1} = θ; Pθ{X = x} = (1 − θ)²θ^x, x = 0, 1, .... Then P = {Pθ, 0 < θ < 1} …
Let X1, ..., Xm and Y1, ..., Yn be samples from N(ξ, σ²) and N(ξ, τ²). Then T = (ΣXi, ΣYj, ΣXi², ΣYj²), which in Example 4.3.3 was seen not to be complete, is also not boundedly complete. [Let f(t) be 1 or −1 as ȳ − x̄ is positive or not.]
Let X1, ..., Xn be a sample from (i) the normal distribution N(aσ, σ²), with a fixed and 0 < σ …
For testing the hypothesis H : θ = θ0 (θ0 an interior point of Ω) in the one-parameter exponential family of Section 4.2, let C be the totality of tests satisfying (4.3) and (4.5) for some −∞ ≤ C1 ≤ C2 ≤ ∞ and 0 ≤ γ1, γ2 ≤ 1. (i) C is complete in the sense that given any …
Let (X, Y) be distributed according to the exponential family dPθ1,θ2(x, y) = C(θ1, θ2) e^{θ1x + θ2y} dµ(x, y). The only unbiased test for testing H : θ1 ≤ a, θ2 ≤ b against K : θ1 > a or θ2 > b or both is φ(x, y) ≡ α. [Take a = b = 0, and let β(θ1, θ2) be the power function of any …]
Let X and Y be independently distributed according to one-parameter exponential families, so that their joint distribution is given by dPθ1,θ2(x, y) = C(θ1) e^{θ1T(x)} dµ(x) K(θ2) e^{θ2U(y)} dν(y). Suppose that with probability 1 the statistics T and U each take on at least three values and that …
Suppose X has density (with respect to some measure µ) pθ(x) = C(θ) exp[θT(x)] h(x), for some real-valued θ. Assume the distribution of T(X) is continuous under θ (for any θ). Consider the problem of testing θ = θ0 versus θ ≠ θ0. If the null hypothesis is rejected, then a decision is to be …
Let Tn/θ have a χ²-distribution with n degrees of freedom. For testing H : θ = 1 at level of significance α = .05, find n so large that the power of the UMP unbiased test is ≥ .9 against both θ ≥ 2 and θ ≤ 1/2. How large does n have to be if the test is not required to be unbiased?
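One way to carry out the search numerically: a sketch, assuming the standard characterization of the UMP unbiased acceptance interval [C1, C2] by the two conditions F_n(C2) − F_n(C1) = 1 − α and F_{n+2}(C2) − F_{n+2}(C1) = 1 − α, where F_k denotes the χ²_k c.d.f.

```python
from scipy.stats import chi2
from scipy.optimize import brentq

ALPHA = 0.05

def umpu_interval(n, alpha=ALPHA):
    """Acceptance interval [c1, c2] of the UMP unbiased test of theta = 1,
    assuming the two-c.d.f. characterization stated above."""
    def gap(c1):
        # pick c2 so the chi2_n condition holds, then measure how far
        # the chi2_{n+2} condition is from holding
        c2 = chi2.ppf(chi2.cdf(c1, n) + 1 - alpha, n)
        return chi2.cdf(c2, n + 2) - chi2.cdf(c1, n + 2) - (1 - alpha)
    hi = chi2.ppf(alpha, n)               # c1 must stay below this point
    c1 = brentq(gap, 1e-10, hi - 1e-10)   # sign change is guaranteed here
    c2 = chi2.ppf(chi2.cdf(c1, n) + 1 - alpha, n)
    return c1, c2

def power(theta, n, c1, c2):
    # T/theta ~ chi2_n, so P(reject) = P(T < c1) + P(T > c2):
    return chi2.cdf(c1 / theta, n) + chi2.sf(c2 / theta, n)

n = 2
while True:
    c1, c2 = umpu_interval(n)
    if min(power(2.0, n, c1, c2), power(0.5, n, c1, c2)) >= 0.9:
        print("smallest n with power >= .9 at theta = 2 and 1/2:", n)
        break
    n += 1
```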
Let X have the Poisson distribution P(τ), and consider the hypothesis H : τ = τ0. Then condition (4.6) reduces to Σ_{x=C1+1}^{C2−1} [τ0^{x−1}/(x − 1)!] e^{−τ0} + Σ_{i=1}^{2} (1 − γi) [τ0^{Ci−1}/(Ci − 1)!] e^{−τ0} = 1 − α, provided C1 > 1.
Let X have the binomial distribution b(p, n), and consider the hypothesis H : p = p0 at level of significance α. Determine the boundary values of the UMP unbiased test for n = 10 with α = .1, p0 = .2 and with α = .05, p0 = .4, and in each case graph the power functions of both the unbiased and …
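A brute-force sketch of the boundary computation, assuming the two side conditions Ep0[φ(X)] = α and Ep0[Xφ(X)] = αnp0 that characterize the UMP unbiased two-sided test in this exponential family; for each candidate pair (C1, C2) the conditions become a 2×2 linear system in (γ1, γ2).

```python
import numpy as np
from scipy.stats import binom

def umpu_binomial(n, p0, alpha):
    """Boundary values (C1, gamma1, C2, gamma2) of the UMP unbiased test
    of p = p0: reject if X < C1 or X > C2, randomize at C1 and C2."""
    x = np.arange(n + 1)
    f = binom.pmf(x, n, p0)
    for C1 in range(n):
        for C2 in range(C1 + 1, n + 1):
            s0 = f[:C1].sum() + f[C2 + 1:].sum()            # mass fully rejected
            s1 = (x * f)[:C1].sum() + (x * f)[C2 + 1:].sum()
            A = np.array([[f[C1], f[C2]],
                          [C1 * f[C1], C2 * f[C2]]])
            b = np.array([alpha - s0, alpha * n * p0 - s1])
            g1, g2 = np.linalg.solve(A, b)                  # solve for gammas
            if -1e-9 <= g1 <= 1 + 1e-9 and -1e-9 <= g2 <= 1 + 1e-9:
                return C1, g1, C2, g2

for n, p0, alpha in [(10, 0.2, 0.1), (10, 0.4, 0.05)]:
    print((n, p0, alpha), umpu_binomial(n, p0, alpha))
```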
p-values. Consider a family of tests of H : θ = θ0 (or θ ≤ θ0), with level-α rejection regions Sα, such that (a) Pθ0{X ∈ Sα} ≤ α for all 0 < α < 1, and …
Admissibility. Any UMP unbiased test φ0 is admissible in the sense that there cannot exist another test φ1 which is at least as powerful as φ0 against all alternatives and more powerful against some. [If φ0 is unbiased and φ1 is uniformly at least as powerful as φ0, then φ1 is also unbiased.]
In Example 3.9.3, provide the details for Cases 3 and 4.
In Example 3.9.2, Case 2, verify the claim for the least favorable distribution.
Suppose (X1, ..., Xk) has the multivariate normal distribution with unknown mean vector ξ = (ξ1, ..., ξk) and known covariance matrix Σ. Suppose X1 is independent of (X2, ..., Xk). Show that X1 is partially sufficient for ξ1 in the sense of Problem 3.60. Provide an alternative argument for Case 2 of …
Suppose X is a k × 1 random vector with E(|X|²) < ∞ and covariance matrix Σ. Let A be an m × k (nonrandom) matrix and let Y = AX. Show Y has mean vector AE(X) and covariance matrix AΣAᵀ.
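The verification is a direct expansion of the definitions; a worked version for reference:

```latex
\begin{aligned}
E[Y] &= E[AX] = A\,E[X] \quad\text{(linearity of expectation, coordinatewise)},\\
\operatorname{Cov}(Y) &= E\big[(Y - E[Y])(Y - E[Y])^{T}\big]
  = E\big[A(X - E[X])(X - E[X])^{T}A^{T}\big]\\
 &= A\,E\big[(X - E[X])(X - E[X])^{T}\big]\,A^{T} = A\Sigma A^{T}.
\end{aligned}
```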
Let X1, ..., Xm; Y1, ..., Yn be independently, normally distributed with means ξ and η, and variances σ² and τ² respectively, and consider the hypothesis H : τ ≤ σ against K : σ < τ …
Let X1, ..., Xm and Y1, ..., Yn be independent samples from N(ξ, 1) and N(η, 1), and consider the hypothesis H : η ≤ ξ against K : η > ξ. There exists a UMP test, and it rejects the hypothesis when Ȳ − X̄ is too large. [If ξ1 < η1 is a particular alternative, the distribution assigning …]
Sufficient statistics with nuisance parameters. (i) A statistic T is said to be partially sufficient for θ in the presence of a nuisance parameter η if the parameter space is the direct product of the set of possible θ- and η-values, and if the following two conditions hold: (a) the conditional …
Let X and Y be the number of successes in two sets of n binomial trials with probabilities p1 and p2 of success. (i) The most powerful test of the hypothesis H : p2 ≤ p1 against an alternative (p1, p2) with p1 < p2 and p1 + p2 = 1 at level α < 1/2 rejects when Y − X > C, and with probability γ when Y − X = C …
Confidence bounds for a median. Let X1, ..., Xn be a sample from a continuous cumulative distribution function F. Let ξ be the unique median of F if it exists, or more generally let ξ = inf{ξ : F(ξ) = 1/2}. (i) If the ordered X's are X(1) < ··· < X(n), a uniformly most accurate lower …
Let the variables Xi (i = 1, ..., s) be independently distributed with Poisson distribution P(λi). For testing the hypothesis H : Σλj ≤ a (for example, that the combined radioactivity of a number of pieces of radioactive material does not exceed a), there exists a UMP test, which rejects when ΣXj …
Let f, g be two probability densities with respect to µ. For testing the hypothesis H : θ ≤ θ0 or θ ≥ θ1 (0 < θ0 < θ1 < 1) against the alternatives θ0 < θ < θ1 …
For testing the hypothesis H : θ1 ≤ θ ≤ θ2 (θ1 ≤ θ2) against the alternatives θ < θ1 or θ > θ2, or the hypothesis θ = θ0 against the alternatives θ ≠ θ0, in an exponential family or more generally in a family of distributions satisfying the assumptions of Problem 3.53, a UMP test does not exist.
Extension of Theorem 3.7.1. The conclusions of Theorem 3.7.1 remain valid if the density of a sufficient statistic T (which without loss of generality will be taken to be X), say pθ(x), is STP3 and is continuous in x for each θ. [The two properties of exponential families that are used in the proof …]
STP3. Let θ and x be real-valued, and suppose that the probability densities pθ(x) are such that pθ′(x)/pθ(x) is strictly increasing in x for θ < θ′. Then the family is STP3. [For θ1 < θ2 < θ3 and k1, k2, k3 > 0, let g(x) = k1pθ1(x) − k2pθ2(x) + k3pθ3(x). If g(x1) = g(x3) = 0, then the function g is positive outside the interval (x1, x3) and negative …]
Exponential families. The exponential family (3.19) with T(x) = x and Q(θ) = θ is STP∞, with Ω the natural parameter space and X = (−∞, ∞). [That the determinant |e^{θixj}|, i, j = 1, ..., n, is positive can be proved by induction. Divide the ith column by e^{θ1xi}, i = 1, ..., n; subtract in …]
Totally positive families. A family of distributions with probability densities pθ(x), θ and x real-valued and varying over Ω and X respectively, is said to be totally positive of order r (TPr) if for all x1 < ··· < xn and θ1 < ··· < θn the determinant Δn = |pθi(xj)|, the n × n determinant with (i, j) entry pθi(xj), satisfies Δn ≥ 0 for every n = 1, ..., r …
For a random variable X with binomial distribution b(p, n), determine the constants Ci, γi (i = 1, 2) in the UMP test (3.31) for testing H : p ≤ .2 or p ≥ .7 when α = .1 and n = 15. Find the power of the test against the alternative p = .4.
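A brute-force sketch, assuming the form of test (3.31): reject for C1 < X < C2 with randomization γi at X = Ci, the constants determined by Ep[φ(X)] = α at both boundary values p = .2 and p = .7; each candidate pair (C1, C2) again yields a 2×2 linear system.

```python
import numpy as np
from scipy.stats import binom

def ump_interval(n, p1, p2, alpha):
    """Constants of the UMP test of H: p <= p1 or p >= p2, which rejects
    when C1 < X < C2 and randomizes (gamma1, gamma2) at C1, C2."""
    x = np.arange(n + 1)
    f1, f2 = binom.pmf(x, n, p1), binom.pmf(x, n, p2)
    for C1 in range(n):
        for C2 in range(C1 + 1, n + 1):
            mid = slice(C1 + 1, C2)                   # fully rejected points
            A = np.array([[f1[C1], f1[C2]],
                          [f2[C1], f2[C2]]])
            b = np.array([alpha - f1[mid].sum(), alpha - f2[mid].sum()])
            g1, g2 = np.linalg.solve(A, b)
            if -1e-9 <= g1 <= 1 + 1e-9 and -1e-9 <= g2 <= 1 + 1e-9:
                return C1, g1, C2, g2

n, alpha = 15, 0.1
C1, g1, C2, g2 = ump_interval(n, 0.2, 0.7, alpha)
f = binom.pmf(np.arange(n + 1), n, 0.4)
power = f[C1 + 1:C2].sum() + g1 * f[C1] + g2 * f[C2]
print(C1, g1, C2, g2, "power at p=.4:", power)
```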
Let F1, ..., Fm+1 be real-valued functions defined over a space U. A sufficient condition for u0 to maximize Fm+1 subject to Fi(u) ≤ ci (i = 1, ..., m) is that it satisfies these side conditions, that it maximizes Fm+1(u) − Σ ki Fi(u) for some constants ki ≥ 0, and that Fi(u0) = ci for those values of i for which ki > 0.
The following example shows that Corollary 3.6.1 does not extend to a countably infinite family of distributions. Let pn be the uniform probability density on [0, 1 + 1/n], and p0 the uniform density on (0, 1). (i) Then p0 is linearly independent of (p1, p2, ...), that is, there do not exist constants …
Optimum selection procedures. On each member of a population n measurements (X1, ..., Xn) = X are taken, for example the scores of n aptitude tests which are administered to judge the qualifications of candidates for a certain training program. A future measurement Y such as the score in a final test …
If β(θ) denotes the power function of the UMP test of Corollary 3.4.1, and if the function Q of (3.19) is differentiable, then β′(θ) > 0 for all θ for which Q′(θ) > 0. [To show that β′(θ0) > 0, consider the problem of maximizing, subject to Eθ0φ(X) = α, the derivative β′(θ0) or …]
Section 3.6
Confidence bounds with minimum risk. Let L(θ, θ) be nonnegative and nonincreasing in its second argument for θ < θ, and equal to 0 for θ ≥ θ. If θ and θ∗ are two lower confidence bounds for θ such that Pθ{θ ≤ θ′} ≤ Pθ{θ∗ ≤ θ′} for all θ′ ≤ θ, then EθL(θ, θ) ≤ EθL(θ, θ∗). [Show that Eθ[L(θ, θ)] = ∫ L(θ, u) dF(u) ≤ ∫ L(θ, u) dF∗(u) = Eθ[L(θ, θ∗)], where F(u) = Pθ{θ ≤ u} and F∗(u) = Pθ{θ∗ ≤ u}.]
(i) For n = 5, 10 and 1 − α = .95, graph the upper confidence limits p̄ and p̄∗ of Example 3.5.2 as functions of t = x + u. (ii) For the same values of n and α1 = α2 = .05, graph the lower and upper confidence limits p and p̄.
Let f(x)/[1 − F(x)] be the “mortality” of a subject at time x given that it has survived to this time. A c.d.f. F is said to be smaller than G in the hazard ordering if g(x)/[1 − G(x)] ≤ f(x)/[1 − F(x)] for all x. (3.46) (i) Show that (3.46) is equivalent to: [1 − F(x)]/[1 − G(x)] is nonincreasing in x …
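The equivalence in (i) comes from differentiating the log of the ratio of survival functions; a sketch (assuming densities exist, as in the problem):

```latex
\frac{d}{dx}\log\frac{1-F(x)}{1-G(x)}
 = -\frac{f(x)}{1-F(x)} + \frac{g(x)}{1-G(x)} \;\le\; 0
\quad\Longleftrightarrow\quad
\frac{g(x)}{1-G(x)} \le \frac{f(x)}{1-F(x)},
```

so (3.46) holds exactly when (1 − F)/(1 − G) is nonincreasing.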
Let F and G be two continuous, strictly increasing c.d.f.s, and let k(u) = G[F⁻¹(u)], 0 < u < 1 …
If F0, F1 are two cumulative distribution functions on the real line, then F1(x) ≤ F0(x) for all x if and only if E0ψ(X) ≤ E1ψ(X) for any nondecreasing function ψ.
Extension of Lemma 3.4.2. Let P0 and P1 be two distributions with densities p0, p1 such that p1(x)/p0(x) is a nondecreasing function of a real-valued statistic T(x). (i) If T has probability density pi when the original distribution is Pi, then p1(t)/p0(t) is nondecreasing in t. (ii) E0ψ(T) ≤ E1ψ(T) for any nondecreasing function ψ …
Let X1, ..., Xn be a sample from a location family with common density f(x − θ), where the location parameter θ ∈ R and f(·) is known. Consider testing the null hypothesis that θ = θ0 versus an alternative θ = θ1 for some θ1 > θ0. Suppose there exists a most powerful level α test of …
Let X1, ..., Xn be a sample from the inverse Gaussian distribution I(µ, τ) with density √(τ/(2πx³)) exp(−τ(x − µ)²/(2xµ²)), x > 0, τ, µ > 0. Show that there exists a UMP test for testing (i) H : µ ≤ µ0 against µ > µ0 when τ is known; (ii) H : τ ≤ τ0 against τ > τ0 when µ is known. In each case give the form of the rejection region.
Consider a single observation X from W(1, c). (i) The family of distributions does not have monotone likelihood ratio in x. (ii) The most powerful test of H : c = 1 against c = 2 rejects when k1 < X < k2. Show how to determine k1 and k2. (iii) Generalize (ii) to arbitrary alternatives c1 > 1, and show that a …
A random variable X has the Weibull distribution W(b, c) if its density is (c/b)(x/b)^{c−1} e^{−(x/b)^c}, x > 0, b, c > 0. (i) Show that this defines a probability density. (ii) If X1, ..., Xn is a sample from W(b, c), with the shape parameter c known, show that there exists a UMP test of H : b ≤ b0 against …
Let X1, ..., Xn be a sample from the gamma distribution Γ(g, b) with density (1/(Γ(g)b^g)) x^{g−1} e^{−x/b}, 0 < x, 0 < b, g. Show that there exist UMP tests for testing (i) H : b ≤ b0 against b > b0 when g is known; (ii) H : g ≤ g0 against g > g0 when b is known. In each case give the form of the rejection region.
Let Xi be independently distributed as N(i∆, 1), i = 1, ..., n. Show that there exists a UMP test of H : ∆ ≤ 0 against K : ∆ > 0, and determine it as explicitly as possible. Note. The following problems (and some of the Additional Problems in later chapters) refer to the gamma, Pareto, …
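A sketch of the reduction (standard exponential-family reasoning, not the book's exact wording): the joint density factors so that Σ iXi is the natural statistic, and under ∆ = 0 it is a known normal variable, which pins down the critical value.

```latex
\prod_{i=1}^{n}\varphi(x_{i}-i\Delta)
 \propto \exp\!\Big(\Delta\sum_{i=1}^{n} i\,x_{i}\Big)\cdot h(x),
\qquad
T=\sum_{i=1}^{n} i\,X_{i}\sim
N\!\Big(\Delta\sum_{i} i^{2},\ \sum_{i} i^{2}\Big),
```

so the family has monotone likelihood ratio in T, and the UMP level-α test rejects when T > z_{1−α}·(n(n+1)(2n+1)/6)^{1/2}, using Σi² = n(n+1)(2n+1)/6.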
Let X be a single observation from the Cauchy density given at the end of Section 3.4. (i) Show that no UMP test exists for testing θ = 0 against θ > 0. (ii) Determine the totality of different shapes the MP level-α rejection region for testing θ = θ0 against θ = θ1 can take on for varying α …
Let X = (X1, ..., Xn) be a sample from the uniform distribution U(θ, θ + 1). (i) For testing H : θ ≤ θ0 against K : θ > θ0 at level α there exists a UMP test which rejects when min(X1, ..., Xn) > θ0 + C(α) or max(X1, ..., Xn) > θ0 + 1 for suitable C(α). (ii) The family U(θ, θ + 1) does not have …
When a Poisson process with rate λ is observed for a time interval of length τ, the number X of events occurring has the Poisson distribution P(λτ). Under an alternative scheme, the process is observed until r events have occurred, and the time T of observation is then a random variable such …
Let X1, ..., Xn be independently distributed with density (2θ)⁻¹e^{−x/(2θ)}, x ≥ 0, and let Y1 ≤ ··· ≤ Yn be the ordered X's. Assume that Y1 becomes available first, then Y2, and so on, and that observation is continued until Yr has been observed. On the basis of Y1, ..., Yr it is desired to …
Let the probability density pθ of X have monotone likelihood ratio in T(x), and consider the problem of testing H : θ ≤ θ0 against θ > θ0. If the distribution of T is continuous, the p-value p̂ of the UMP test is given by p̂ = Pθ0{T ≥ t}, where t is the observed value of T. This holds …
(i) A necessary and sufficient condition for densities pθ(x) to have monotone likelihood ratio in x, if the mixed second derivative ∂² log pθ(x)/∂θ∂x exists, is that this derivative is ≥ 0 for all θ and x. (ii) An equivalent condition is that pθ(x) ∂²pθ(x)/∂θ∂x ≥ (∂pθ(x)/∂θ)(∂pθ(x)/∂x) …
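The equivalence of (i) and (ii) is just the quotient rule applied to the mixed partial of log pθ(x); for reference:

```latex
\frac{\partial^{2}}{\partial\theta\,\partial x}\log p_{\theta}(x)
 = \frac{p_{\theta}(x)\,\dfrac{\partial^{2}p_{\theta}(x)}{\partial\theta\,\partial x}
        - \dfrac{\partial p_{\theta}(x)}{\partial\theta}\,
          \dfrac{\partial p_{\theta}(x)}{\partial x}}
        {p_{\theta}(x)^{2}}\;\ge\;0
\quad\Longleftrightarrow\quad
p_{\theta}\,\frac{\partial^{2}p_{\theta}}{\partial\theta\,\partial x}
 \;\ge\; \frac{\partial p_{\theta}}{\partial\theta}\cdot\frac{\partial p_{\theta}}{\partial x}.
```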
Let X be the number of successes in n independent trials with probability p of success, and let φ(x) be the UMP test (3.16) for testing p ≤ p0 against p > p0 at level of significance α. (i) For n = 6, p0 = .25 and the levels α = .05, .1, .2 determine C and γ, and the power of the test against …
(i) If p̂ is uniform on (0, 1), show that −2 log(p̂) has the Chi-squared distribution with 2 degrees of freedom. (ii) Suppose p̂1, ..., p̂s are i.i.d. uniform on (0, 1). Let F = −2 log(p̂1 ··· p̂s). Argue that F has the Chi-squared distribution with 2s degrees of freedom. What can you say …
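Part (i) is a one-line c.d.f. computation, and (ii) follows because the terms are independent χ²₂ variables, whose sum is χ²_{2s} (this is Fisher's method of combining p-values):

```latex
P\{-2\log \hat p \le t\}
 = P\{\hat p \ge e^{-t/2}\}
 = 1 - e^{-t/2}, \qquad t > 0,
```

which is the c.d.f. of the exponential distribution with mean 2, i.e. of χ²₂; then F = Σi(−2 log p̂i) is a sum of s independent χ²₂ variables and hence is χ²_{2s}.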
Under the setup of Lemma 3.3.1, show that there exists a real-valued statistic T(X) so that the rejection region is necessarily of the form (3.45). [Hint: Let T(X) = −p̂.]
Under the setup of Lemma 3.3.1, suppose the rejection regions are defined by Sα = {X : T(X) ≥ k(α)} (3.45) for some real-valued statistic T(X) and k(α) satisfying sup_{θ∈ΩH} Pθ{T(X) ≥ k(α)} ≤ α. Then, show p̂ = sup_{θ∈ΩH} Pθ{T(X) ≥ t}, where t is the observed value of T(X).
Suppose X has a continuous distribution function F. Show that F(X) is uniformly distributed on (0, 1). [The transformation from X to F(X) is known as the probability integral transformation.]
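A sketch of the standard argument when F is continuous and, for simplicity, strictly increasing (the general continuous case needs only a minor adjustment on F's flat pieces):

```latex
P\{F(X) \le u\} = P\{X \le F^{-1}(u)\} = F\big(F^{-1}(u)\big) = u,
\qquad 0 < u < 1,
```

so F(X) has the uniform c.d.f. on (0, 1).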
In Example 3.2.1, show that the p-value is indeed given by p̂ = p̂(X) = (11 − X)/10. Also, graph the c.d.f. of p̂ under H and show that the last inequality in (3.15) is an equality if and only if u is of the form k/10, k = 0, ..., 10.
Section 3.3
Let fθ, θ ∈ Ω, denote a family of densities with respect to a measure µ. (We assume Ω is endowed with a σ-field so that the densities fθ(x) are jointly measurable in θ and x.) Consider the problem of testing a simple null hypothesis θ = θ0 against the composite alternatives ΩK = {θ … is admissible.
Suppose X1, ..., Xn are i.i.d. N(ξ, σ²) with σ known. For testing ξ = 0 versus ξ ≠ 0, the average power of a test φ = φ(X1, ..., Xn) is given by ∫_{−∞}^{∞} Eξ(φ) dΛ(ξ), where Λ is a probability distribution on the real line. Suppose that Λ is symmetric about 0; that is, Λ{E} = Λ{−E} for all E …
Under the setup of Theorem 3.2.1, show there always exist MP tests that are nested in the sense of Problem 3.17(iii).
A counterexample. Typically, as α varies the most powerful level-α tests for testing a hypothesis H against a simple alternative are nested in the sense that the associated rejection regions, say Rα, satisfy Rα ⊂ Rα′ for α < α′ …
Based on X with distribution indexed by θ ∈ Ω, the problem is to test θ ∈ ω versus θ ∉ ω. Suppose there exists a test φ such that Eθ[φ(X)] ≤ β for all θ in ω, where β …
Fully informative statistics. A statistic T is fully informative if for every decision problem the decision procedures based only on T form an essentially complete class. If P is dominated and T is fully informative, then T is sufficient. [Consider any pair of distributions P0, P1 ∈ P with … it is sufficient for P.]
If the sample space X is Euclidean and P0, P1 have densities with respect to Lebesgue measure, there exists a nonrandomized most powerful test for testing P0 against P1 at every significance level α. [This is a consequence of Theorem 3.2.1 and the following lemma. Let f ≥ 0 and ∫_A f(x) dx = a. …]
The following example shows that the power of a test can sometimes be increased by selecting a random rather than a fixed sample size, even when the randomization does not depend on the observations. Let X1, ..., Xn be independently distributed as N(θ, 1), and consider the problem of testing H : θ = …
Let X1, ..., Xn be independently distributed, each uniformly over the integers 1, 2, ..., θ. Determine whether there exists a UMP test for testing H : θ = θ0 at level 1/θ0^n against the alternatives (i) θ > θ0; (ii) θ …
(i) For testing H0 : θ = 0 against H1 : θ = θ1 when X is N(θ, 1), given any 0 < α < 1 …
In the notation of Section 3.2, consider the problem of testing H0 : P = P0 against H1 : P = P1, and suppose that known probabilities π0 = π and π1 = 1 − π can be assigned to H0 and H1 prior to the experiment. (i) The overall probability of an error resulting from the use of a test ϕ …
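For orientation, the overall error probability of a test ϕ, and the Bayes test minimizing it, work out as follows (a standard computation sketched here; the problem's own notation may differ):

```latex
\pi\,E_{0}[\varphi(X)] + (1-\pi)\,\big(1 - E_{1}[\varphi(X)]\big)
 = (1-\pi) + \int \varphi(x)\,\big[\pi p_{0}(x) - (1-\pi)\,p_{1}(x)\big]\,d\mu(x),
```

which is minimized by taking ϕ(x) = 1 exactly when (1 − π)p1(x) > πp0(x), i.e. by rejecting when the likelihood ratio p1(x)/p0(x) exceeds π/(1 − π).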
Let X be distributed according to Pθ, θ ∈ Ω, and let T be sufficient for θ. If ϕ(X) is any test of a hypothesis concerning θ, then ψ(T) given by ψ(t) = E[ϕ(X) | T = t] is a test depending on T only, and its power function is identical with that of ϕ(X).
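Two observations make this work: sufficiency makes ψ free of θ (and 0 ≤ ψ ≤ 1, so it is a genuine test), and iterated expectations give the equality of power functions:

```latex
E_{\theta}[\psi(T)] = E_{\theta}\big[\,E[\varphi(X)\mid T]\,\big]
 = E_{\theta}[\varphi(X)] \qquad \text{for every } \theta \in \Omega.
```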
A random variable X has the Pareto distribution P(c, τ) if its density is cτ^c/x^{c+1}, 0 < τ < x, 0 < c. (i) Show that this defines a probability density. (ii) If X has distribution P(c, τ), then Y = log X has exponential distribution E(ξ, b) with ξ = log τ, b = 1/c. (iii) If X1, ..., Xn is a sample …, to obtain UMP tests of (a) H : τ = τ0 against τ ≠ τ0 when b is known; (b) H : c = c0, τ = τ0 against c > c0, τ < τ0.
Let the distribution of X be given by

x            0    1     2         3
Pθ(X = x)    θ    2θ    .9 − 2θ   .1 − θ

where 0 < θ < .1 …
Let P0, P1, P2 be the probability distributions assigning to the integers 1, ..., 6 the following probabilities:

x     1     2     3     4     5     6
P0    .03   .02   .02   .01   0     .92
P1    .06   .05   .08   .02   .01   .78
P2    .09   .05   .12   0     .02   .72

Determine whether there exists a level-α test of H : P = P0 which is UMP against the alternatives P1 and P2 …
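A quick numerical look (not a proof): by the Neyman-Pearson lemma the MP test against each Pi rejects for large pi/p0, so comparing the two likelihood-ratio orderings of the sample points shows for which levels α the two rejection regions can be made to coincide.

```python
import numpy as np

p0 = np.array([.03, .02, .02, .01, .00, .92])
p1 = np.array([.06, .05, .08, .02, .01, .78])
p2 = np.array([.09, .05, .12, .00, .02, .72])

with np.errstate(divide='ignore'):
    r1, r2 = p1 / p0, p2 / p0   # likelihood ratios (inf where p0 = 0)

# Points ordered by decreasing likelihood ratio; an MP test fills its
# rejection region from the top of this ordering.
print("order against P1:", np.argsort(-r1) + 1, "ratios:", np.round(r1, 3))
print("order against P2:", np.argsort(-r2) + 1, "ratios:", np.round(r2, 3))
```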
In the proof of Theorem 3.2.1(i), consider the set of c satisfying α(c) ≤ α ≤ α(c − 0). If there is only one such c, c is unique; otherwise, there is an interval of such values [c1, c2]. Argue that, in this case, if α(c) is continuous at c2, then Pi(C) = 0 for i = 0, 1, where C = {x : p0(x) …
UMP test for exponential densities. Let X1, ..., Xn be a sample from the exponential distribution E(a, b) of Problem 1.18, and let X(1) = min(X1, ..., Xn). (i) Determine the UMP test for testing H : a = a0 against K : a ≠ a0 when b is assumed known. (ii) The power of any MP level-α test of H : a = a0 …
Suppose N i.i.d. random variables are generated from the same known strictly increasing absolutely continuous cdf F(·). We are told only X, the maximum of these random variables. Is there a UMP size α test of H0 : N ≤ 5 versus H1 : N > 5? If so, find it.
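One route, sketched under the observation that only the maximum is seen: X has c.d.f. F(x)^N, and this family has monotone likelihood ratio in X, so a one-sided test based on large X suggests itself.

```latex
P_{N}\{X \le x\} = F(x)^{N}, \qquad
\frac{p_{N_{2}}(x)}{p_{N_{1}}(x)}
 = \frac{N_{2}}{N_{1}}\,F(x)^{\,N_{2}-N_{1}} \quad (N_{2} > N_{1}),
```

which is nondecreasing in x, so the family has MLR in X; a UMP size-α test then rejects for large X, with the critical value set at the largest null parameter N = 5, i.e. F(c)⁵ = 1 − α.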
UMP test for U(0, θ). Let X = (X1, ..., Xn) be a sample from the uniform distribution on (0, θ). (i) For testing H : θ ≤ θ0 against K : θ > θ0 any test is UMP at level α for which Eθ0φ(X) = α, Eθφ(X) ≤ α for θ ≤ θ0, and φ(x) = 1 when max(x1, ..., xn) > θ0. (ii) For testing H : θ = θ0 …
Let X1, ..., Xn be a sample from the normal distribution N(ξ, σ²). (i) If σ = σ0 (known), there exists a UMP test for testing H : ξ ≤ ξ0 against ξ > ξ0, which rejects when Σ(Xi − ξ0) is too large. (ii) If ξ = ξ0 (known), there exists a UMP test for testing H : σ ≤ σ0 against K : σ > σ0 …
Let Ω be the natural parameter space of the exponential family (2.35), and for any fixed tr+1, ..., tk (r < k) …
2.9 Notes
The theory of measure and integration in abstract spaces and its application to probability theory, including in particular conditional probability and expectation, is treated in a …
For any θ which is an interior point of the natural parameter space, the expectations and covariances of the statistics Tj in the exponential family (2.35) are given by E[Tj(X)] = −∂ log C(θ)/∂θj (j = 1, ..., k) and E[Ti(X)Tj(X)] − E[Ti(X)]E[Tj(X)] = −∂² log C(θ)/∂θi∂θj (i, j = 1, ..., k).
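These identities follow by differentiating the normalization identity under the integral sign (valid in the interior of the natural parameter space); a sketch:

```latex
\begin{aligned}
A(\theta) &:= -\log C(\theta) = \log \int e^{\sum_{l}\theta_{l}T_{l}(x)}\,d\mu(x),\\
\frac{\partial A}{\partial \theta_{j}}
 &= \frac{\int T_{j}(x)\,e^{\sum_{l}\theta_{l}T_{l}(x)}\,d\mu(x)}
         {\int e^{\sum_{l}\theta_{l}T_{l}(x)}\,d\mu(x)}
  = E_{\theta}[T_{j}(X)],\\
\frac{\partial^{2} A}{\partial \theta_{i}\,\partial\theta_{j}}
 &= E_{\theta}[T_{i}T_{j}] - E_{\theta}[T_{i}]\,E_{\theta}[T_{j}]
  = \operatorname{Cov}_{\theta}\!\big(T_{i}(X),\,T_{j}(X)\big).
\end{aligned}
```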
Life testing. Let X1, ..., Xn be independently distributed with exponential density (2θ)⁻¹e^{−x/(2θ)} for x ≥ 0, and let the ordered X's be denoted by Y1 ≤ Y2 ≤ ··· ≤ Yn. It is assumed that Y1 becomes available first, then Y2, and so on, and that observation is continued until Yr has …
Let Xi (i = 1, ..., s) be independently distributed with Poisson distribution P(λi), and let T0 = ΣXj, Ti = Xi, λ = Σλj. Then T0 has the Poisson distribution P(λ), and the conditional distribution of T1, ..., Ts−1 given T0 = t0 is the multinomial distribution (2.34) with n = t0 and pi = λi/λ.
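The conditional claim is a direct computation, sketched here:

```latex
P\big(X_{1}=x_{1},\dots,X_{s}=x_{s}\,\big|\,\textstyle\sum_{j}X_{j}=t_{0}\big)
 = \frac{\prod_{i} e^{-\lambda_{i}}\lambda_{i}^{x_{i}}/x_{i}!}
        {e^{-\lambda}\lambda^{t_{0}}/t_{0}!}
 = \frac{t_{0}!}{x_{1}!\cdots x_{s}!}\prod_{i}\Big(\frac{\lambda_{i}}{\lambda}\Big)^{x_{i}},
\qquad \textstyle\sum_{i} x_{i}=t_{0},
```

which is the multinomial distribution with n = t0 and cell probabilities pi = λi/λ.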