Testing Statistical Hypotheses 2nd Edition E. L. Lehmann - Solutions
19. Positive dependence. Two random variables (X, Y) with c.d.f. F(x, y) are said to be positively quadrant dependent if F(x, y) ≥ F(x, ∞)F(∞, y) for all x, y. For the case that (X, Y) takes on the four pairs of values (0, 0), (0, 1), (1, 0), (1, 1) with probabilities p₀₀, p₀₁, p₁₀, p₁₁, (X, Y) are …
18. Sequential comparison of two binomials. Consider two sequences of binomial trials with probabilities of success p₁ and p₂ respectively, and let ρ = (p₂/q₂) ÷ (p₁/q₁). (i) If α < β, no test with fixed numbers of trials m and n for testing H: ρ = ρ₀ can have power β against all alternatives …
16. The UMP unbiased tests of the hypotheses H₁, …, H₄ of Theorem 3 are unique if attention is restricted to tests depending on U and the T's.
17. Let X and Y be independently distributed with Poisson distributions P(λ) and P(μ). Find the power of the UMP unbiased test of H: μ ≤ λ against …
15. Continuation. The function ψ₄ defined by (16), (18), and (19) is jointly measurable in u and t. [The proof, which otherwise is essentially like that outlined in the preceding problem, requires the measurability in z and t of the integral g(z, t) = ∫₋∞^z u dF_t(u). This integral is absolutely …
14. Measurability of tests of Theorem 3. The function ψ₃ defined by (16) and (17) is jointly measurable in u and t. [With C₁ = v and C₂ = w, the determining equations for v, w, γ₁, γ₂ are (25) F_t(v−) + [1 − F_t(w)] + γ₁[F_t(v) − F_t(v−)] + γ₂[F_t(w) − F_t(w−)] = α and (26) G_t(v−) + [1 …
13. Determine whether T is complete for each of the following situations: (i) X₁, …, Xₙ are independently distributed according to the uniform distribution over the integers 1, 2, …, θ and T = max(X₁, …, Xₙ). (ii) X takes on the values 1, 2, 3, 4 with probabilities pq, p²q, pq², 1 − …
12. The completeness of the order statistics in Example 6 remains true if the family ℱ is replaced by the family of all continuous distributions. [To show that for any integrable symmetric function φ, ∫φ(x₁, …, xₙ) dF(x₁) ⋯ dF(xₙ) = 0 for all continuous F implies φ = 0 a.e., replace F by …
11. Counterexample. Let X be a random variable taking on the values −1, 0, 1, 2, … with probabilities P_θ{X = −1} = θ and P_θ{X = x} = (1 − θ)²θˣ, x = 0, 1, …. Then 𝒫 = {P_θ, 0 < θ < 1} is boundedly complete but not complete.
10. Let X₁, …, Xₘ and Y₁, …, Yₙ be samples from N(ξ, σ²) and N(η, σ²). Then T = (ΣXᵢ, ΣYⱼ, ΣXᵢ², ΣYⱼ²), which in Example 5 was seen not to be complete, is also not boundedly complete. [Let f(t) be 1 or −1 as ȳ − x̄ is positive or not.]
9. Let X₁, …, Xₙ be a sample from (i) the normal distribution N(aσ, σ²), with a fixed and 0 < σ < ∞; (ii) the uniform distribution U(θ − ½, θ + ½), −∞ < θ < ∞; (iii) the uniform distribution U(θ₁, θ₂), −∞ < θ₁ < θ₂ < ∞. For these three families of distributions the following …
8. For testing the hypothesis H: θ = θ₀ (θ₀ an interior point of Ω) in the one-parameter exponential family of Section 2, let 𝒞 be the totality of tests satisfying (3) and (5) for some −∞ ≤ C₁ ≤ C₂ ≤ ∞ and 0 ≤ γ₁, γ₂ ≤ 1. (i) 𝒞 is complete in the sense that given any level-α test φ₀ of H …
7. Let (X, Y) be distributed according to the exponential family dP_{θ₁,θ₂}(x, y) = C(θ₁, θ₂) e^(θ₁x + θ₂y) dμ(x, y). The only unbiased test for testing H: θ₁ ≤ a, θ₂ ≤ b against K: θ₁ > a or θ₂ > b or both is φ(x, y) ≡ α. [For counterexamples when the conditions of the problem are not satisfied, …
6. Let X and Y be independently distributed according to one-parameter exponential families, so that their joint distribution is given by dP_{θ₁,θ₂}(x, y) = C(θ₁) e^(θ₁T(x)) dμ(x) · K(θ₂) e^(θ₂U(y)) dν(y). Suppose that with probability 1 the statistics T and U each take on at least three …
5. Let Tₙ/θ have a χ²-distribution with n degrees of freedom. For testing H: θ = 1 at level of significance α = .05, find n so large that the power of the UMP unbiased test is ≥ .9 against both θ ≥ 2 and θ ≤ ½. How large does n have to be if the test is not required to be unbiased?
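The sample-size question in Problem 5 invites a quick numerical search. The sketch below is not the book's solution: it substitutes the simpler equal-tails χ² test for the exact UMP unbiased test, so the n it reports is only an approximation, and all names and numbers in the code are illustrative.

```python
# Approximate search for Problem 5: smallest n such that a level-.05 two-sided
# chi-square test of H: theta = 1 has power >= .9 at theta = 2 and theta = 1/2.
# Uses the equal-tails test as a stand-in for the exact UMP unbiased test.
from scipy.stats import chi2

alpha = 0.05

def power_equal_tails(n, theta):
    # Under H, T_n ~ chi2(n); reject when T_n < c1 or T_n > c2.
    c1, c2 = chi2.ppf(alpha / 2, n), chi2.ppf(1 - alpha / 2, n)
    # Under theta, T_n / theta ~ chi2(n), i.e. T_n = theta * chi2(n).
    return chi2.cdf(c1 / theta, n) + chi2.sf(c2 / theta, n)

n = 1
while min(power_equal_tails(n, 2.0), power_equal_tails(n, 0.5)) < 0.9:
    n += 1
print(n)  # smallest n meeting the (approximate) power requirement
```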
4. Let X have the Poisson distribution P(τ), and consider the hypothesis H: τ = τ₀. Then condition (6) reduces to Σ_{x=C₁+1}^{C₂−1} τ₀^(x−1) e^(−τ₀)/(x − 1)! + Σ_{i=1,2} (1 − γᵢ) τ₀^(Cᵢ−1) e^(−τ₀)/(Cᵢ − 1)! = 1 − α, provided C₁ > 1.
3. Let X have the binomial distribution b(p, n), and consider the hypothesis H: p = p₀ at level of significance α. Determine the boundary values of the UMP unbiased test for n = 10 with α = .1, p₀ = .2 and with α = .05, p₀ = .4, and in each case graph the power functions of both the unbiased and the …
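The boundary values asked for in Problem 3 can be found numerically from the two side conditions of the UMP unbiased test, E_{p₀}[φ(X)] = α and E_{p₀}[Xφ(X)] = αnp₀. The brute-force search below is only an illustrative sketch of one way to solve them; the function name and search strategy are assumptions, not the book's method.

```python
# Sketch: boundary values C1, C2 and randomization constants gamma1, gamma2
# of the UMP unbiased test of H: p = p0 for X ~ b(p0, n), found by searching
# over (C1, C2) and solving the two linear side conditions for the gammas.
import numpy as np
from scipy.stats import binom

def ump_unbiased_binom(n, p0, alpha):
    x = np.arange(n + 1)
    f = binom.pmf(x, n, p0)
    for c1 in range(n + 1):
        for c2 in range(c1 + 1, n + 1):
            out = (x < c1) | (x > c2)            # rejected with probability 1
            a0 = f[out].sum()                    # contribution to E[phi]
            b0 = (x[out] * f[out]).sum()         # contribution to E[X phi]
            A = np.array([[f[c1], f[c2]],
                          [c1 * f[c1], c2 * f[c2]]])
            rhs = np.array([alpha - a0, alpha * n * p0 - b0])
            try:
                g1, g2 = np.linalg.solve(A, rhs)
            except np.linalg.LinAlgError:
                continue
            if 0 <= g1 <= 1 and 0 <= g2 <= 1:
                return c1, c2, g1, g2
    return None

print(ump_unbiased_binom(10, 0.2, 0.10))
print(ump_unbiased_binom(10, 0.4, 0.05))
```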
2. p-values. Consider a family of tests of H: θ = θ₀ (or θ ≤ θ₀), with level-α rejection regions S_α such that (a) P_{θ₀}{X ∈ S_α} = α for all 0 < α < 1, and (b) S_{α₀} = ∩_{α>α₀} S_α for all 0 < α₀ < 1, which in particular implies S_α ⊂ S_{α′} for α < α′. (i) Then the p-value α̂ is given by α̂ = α̂(x) = inf{ …
1. Admissibility. Any UMP unbiased test φ₀ is admissible in the sense that there cannot exist another test φ₁ which is at least as powerful as φ₀ against all alternatives and more powerful against some. [If φ is unbiased and φ′ is uniformly at least as powerful as φ, then φ′ is also …
53. Let f, g be two probability densities with respect to μ. For testing the hypothesis H: θ ≤ θ₀ or θ ≥ θ₁ (0 < θ₀ < θ₁ < 1) against the alternatives θ₀ < θ < θ₁ in the family 𝒫 = {θf(x) + (1 − θ)g(x), 0 ≤ θ ≤ 1}, the test …
52. Let X₁, …, Xₙ be independent N(θ, γ), 0 < γ < 1 known, and Y₁, …, Yₙ independent N(θ, 1). Then X is more informative than Y according to the definition at the end of Section 4. [If Vᵢ is N(0, 1 − γ), then Xᵢ + Vᵢ has the same distribution as Yᵢ.] Note. If γ is unknown, it is not true …
51. Let X₁, …, Xₙ be i.i.d. with density p₀ or p₁, so that the MP level-α test of H: p₀ rejects when Πⁿᵢ₌₁ r(Xᵢ) ≥ cₙ, where r(Xᵢ) = p₁(Xᵢ)/p₀(Xᵢ), or equivalently when (34) (1/√n) Σᵢ {log r(Xᵢ) − E₀[log r(Xᵢ)]} ≥ kₙ. (i) It follows from the central limit theorem (Chapter 5, Theorem 3) that under …
50. (i) For testing H₀: θ = 0 against H₁: θ = θ₁ when X is N(θ, 1), given any 0 < α < 1 and any 0 < π < 1 (in the notation of the preceding problem), there exist θ₁ and x such that (a) H₀ is rejected when X = x but (b) P(H₀|x) is arbitrarily close to 1. (ii) The paradox of part (i) is due …
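A small computation makes the paradox in Problem 50(i) concrete. The sketch below assumes the prior π = ½ on H₀ and an observation just inside the one-sided .05 rejection region; the particular numbers are illustrative choices, not values from the text.

```python
# Illustration: x is rejected by the one-sided level-.05 test of H0: theta = 0,
# yet the posterior probability of H0 approaches 1 as theta1 moves far away.
from scipy.stats import norm

pi, x = 0.5, 1.7                     # prior weight on H0; x just above 1.645
for theta1 in (2.0, 5.0, 10.0, 20.0):
    num = pi * norm.pdf(x, loc=0.0)
    den = num + (1 - pi) * norm.pdf(x, loc=theta1)
    print(theta1, num / den)         # P(H0 | x) -> 1 as theta1 grows
```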
49. In the notation of Section 2, consider the problem of testing H₀: P = P₀ against H₁: P = P₁, and suppose that known probabilities π₀ = π and π₁ = 1 − π can be assigned to H₀ and H₁ prior to the experiment. (i) The overall probability of an error resulting from the use of a test φ is …
48. Let X be distributed according to P_θ, θ ∈ Ω, and let T be sufficient for θ. If φ(X) is any test of a hypothesis concerning θ, then ψ(T) given by ψ(t) = E[φ(X)|t] is a test depending on T only, and its power function is identical with that of φ(X).
47. Let X₁, …, Xₙ be a sample from the inverse Gaussian distribution I(μ, τ) with density √(τ/(2πx³)) exp(−τ(x − μ)²/(2xμ²)), x > 0, τ, μ > 0. Show that there exists a UMP test for testing (i) H: μ ≤ μ₀ against μ > μ₀ when τ is known; (ii) H: τ ≤ τ₀ against τ > τ₀ when μ is known.
46. Consider a single observation X from W(1, c). (i) The family of distributions does not have monotone likelihood ratio in x. (ii) The most powerful test of H: c = 1 against c = 2 rejects when X < k₁ and when X > k₂. Show how to determine k₁ and k₂. (iii) Generalize (ii) to arbitrary …
45. A random variable X has the Weibull distribution W(b, c) if its density is (c/b)(x/b)^(c−1) e^(−(x/b)^c), x > 0, b, c > 0. (i) Show that this defines a probability density. (ii) If X₁, …, Xₙ is a sample from W(b, c), with the shape parameter c known, show that there exists a UMP test of H: b ≤ b₀ against …
44. A random variable X has the Pareto distribution P(c, τ) if its density is cτᶜ/x^(c+1), 0 < τ < x, 0 < c …
43. Let X₁, …, Xₙ be a sample from the gamma distribution Γ(g, b) with density (1/(Γ(g)bᵍ)) x^(g−1) e^(−x/b), x > 0. Show that there exist UMP tests of (i) H: b ≤ b₀ against b > b₀ when g is known; (ii) H: g ≤ g₀ against g > g₀ when b is known. In each case give the form of the rejection region.
42. Let Xᵢ be independently distributed as N(iΔ, 1), i = 1, …, n. Show that there exists a UMP test of H: Δ ≤ 0 against K: Δ > 0, and determine it as explicitly as possible. Note. The following problems (and some of the Additional Problems in later chapters) refer to the gamma, Pareto, …
41. Let X₁, …, Xₙ be independently distributed, each uniformly over the integers 1, 2, …, θ. Determine whether there exists a UMP test for testing H: θ = θ₀ at level 1/θ₀ⁿ against the alternatives (i) θ > θ₀; (ii) θ < θ₀; (iii) θ ≠ θ₀.
40. Let the distribution of X be given by

    x:           0    1       2         3
    P_θ(X = x):  θ   2θ   .9 − 2θ    .1 − θ

where 0 < θ < .1. For testing H: θ = .05 against θ > .05 at level α = .05, determine which of the following tests (if any) is UMP: (i) φ(0) = 1, φ(1) = φ(2) = φ(3) = 0; (ii) φ(1) = .5, φ(0) = φ(2) = φ(3) = 0; (iii) φ(3) = 1, φ(0) = …
39. Let P₀, P₁, P₂ be the probability distributions assigning to the integers 1, …, 6 the following probabilities:

    x:    1    2    3    4    5    6
    P₀:  .03  .02  .02  .01   0   .92
    P₁:  .06  .05  .08  .02  .01  .78
    P₂:  .09  .05  .12   0   .02  .72

Determine whether there exists a level-α test of H: P = P₀ which is UMP against the …
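One way to get a feel for Problem 39 is to tabulate the likelihood ratios P₁/P₀ and P₂/P₀, since the MP tests against the two alternatives order the sample points by these ratios, and the question turns on whether a single rejection region can be best against both. The snippet below is only an exploratory aid, not the book's solution.

```python
# Likelihood ratios P1/P0 and P2/P0 at each sample point x = 1, ..., 6.
p0 = [.03, .02, .02, .01, .00, .92]
p1 = [.06, .05, .08, .02, .01, .78]
p2 = [.09, .05, .12, .00, .02, .72]

for x in range(6):
    r1 = p1[x] / p0[x] if p0[x] > 0 else float('inf')
    r2 = p2[x] / p0[x] if p0[x] > 0 else float('inf')
    print(x + 1, round(r1, 2), round(r2, 2))
```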
38. Let X₁, …, Xₘ; Y₁, …, Yₙ be independently, normally distributed with means ξ and η, and variances σ² and τ² respectively, and consider the hypothesis H: τ ≤ σ against K: σ < τ. (i) If ξ and η are known, there exists a UMP test given by the rejection region Σ(Yⱼ − η)²/Σ(Xᵢ − ξ)² ≥ C.
37. Let X₁, …, Xₘ and Y₁, …, Yₙ be independent samples from N(ξ, 1) and N(η, 1), and consider the hypothesis H: η ≤ ξ against K: η > ξ. There exists a UMP test, and it rejects the hypothesis when Ȳ − X̄ is too large. [If ξ₁ < η₁ is a particular alternative, the distribution assigning …
36. Sufficient statistics with nuisance parameters. (i) A statistic T is said to be partially sufficient for θ in the presence of a nuisance parameter η if the parameter space is the direct product of the set of possible θ- and η-values, and if the following two conditions hold: (a) the …
35. Let X and Y be the number of successes in two sets of n binomial trials with probabilities p₁ and p₂ of success. (i) The most powerful test of the hypothesis H: p₂ ≤ p₁ against an alternative (p₁, p₂) with p₁ < p₂ and p₁ + p₂ = 1 at level α < ½ rejects when Y − X > C and with probability γ …
34. A counterexample. Typically, as α varies the most powerful level-α tests for testing a hypothesis H against a simple alternative are nested in the sense that the associated rejection regions, say R_α, satisfy R_α ⊂ R_{α′} for any α < α′. This relation always holds when H is simple, but the following …
33. Confidence bounds for a median. Let X₁, …, Xₙ be a sample from a continuous cumulative distribution function F. Let ξ be the unique median of F if it exists, or more generally let ξ = inf{ξ′ : F(ξ′) = ½}. (i) If the ordered X's are X₍₁₎ < ⋯ < X₍ₙ₎, a uniformly most accurate lower confidence …
32. Let the variables Xᵢ (i = 1, …, s) be independently distributed with Poisson distribution P(λᵢ). For testing the hypothesis H: Σλⱼ ≤ a (for example, that the combined radioactivity of a number of pieces of radioactive material does not exceed a), there exists a UMP test, which rejects …
31. For testing the hypothesis H′: θ₁ ≤ θ ≤ θ₂ (θ₁ ≤ θ₂) against the alternatives θ < θ₁ or θ > θ₂, or the hypothesis θ = θ₀ against the alternatives θ ≠ θ₀, in an exponential family or more generally in a family of distributions satisfying the assumptions of Problem 30, a UMP test does not …
30. Extension of Theorem 6. The conclusions of Theorem 6 remain valid if the density of a sufficient statistic T (which without loss of generality will be taken to be X), say p_θ(x), is STP₃ and is continuous in x for each θ. [The two properties of exponential families that are used in the proof of …
29. STP₃. Let θ and x be real-valued, and suppose that the probability densities p_θ(x) are such that p_{θ′}(x)/p_θ(x) is strictly increasing in x for θ < θ′. Then the following two conditions are equivalent: (a) For θ₁ < θ₂ < θ₃ and k₁, k₂, k₃ > 0, let g(x) = k₁p_{θ₁}(x) − k₂p_{θ₂}(x) + k₃p_{θ₃}(x) …
28. Exponential families. The exponential family (12) with T(x) = x and Q(θ) = θ is STP_∞, with Ω the natural parameter space and 𝒳 = (−∞, ∞). [That the determinant |e^(θᵢxⱼ)|, i, j = 1, …, n, is positive can be proved by induction. Divide the ith column by e^(θ₁xᵢ), i = 1, …, n; subtract …
27. Totally positive families. A family of distributions with probability densities p_θ(x), θ and x real-valued and varying over Ω and 𝒳 respectively, is said to be totally positive of order r (TP_r) if for all x₁ < ⋯ < xₙ and θ₁ < ⋯ < θₙ, n = 1, …, r, the determinant (33) Δₙ = det[p_{θᵢ}(xⱼ)]ᵢ,ⱼ₌₁,…,ₙ ≥ 0 …
26. For a random variable X with binomial distribution b(p, n), determine the constants Cᵢ, γᵢ (i = 1, 2) in the UMP test (24) for testing H: p ≤ .2 or ≥ .7 when α = .1 and n = 15. Find the power of the test against the alternative p = .4.
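The constants in Problem 26 satisfy the two level conditions E_.2[φ(X)] = E_.7[φ(X)] = .1 for a test that rejects when C₁ < X < C₂, with randomization γ₁, γ₂ at the two boundaries. The brute-force search below is an illustrative sketch of one way to solve these conditions numerically; the names and strategy are assumptions, not the book's method.

```python
# Sketch: constants C1, C2, gamma1, gamma2 of the UMP test (24) of
# H: p <= .2 or p >= .7 for X ~ b(p, 15) at level alpha = .1, plus its
# power at p = .4.
import numpy as np
from scipy.stats import binom

n, alpha = 15, 0.1
x = np.arange(n + 1)

def solve():
    for c1 in range(n + 1):
        for c2 in range(c1 + 1, n + 1):
            inside = (x > c1) & (x < c2)         # rejected with probability 1
            rows, rhs = [], []
            for p in (0.2, 0.7):                 # level alpha at both boundary values
                f = binom.pmf(x, n, p)
                rows.append([f[c1], f[c2]])
                rhs.append(alpha - f[inside].sum())
            try:
                g1, g2 = np.linalg.solve(np.array(rows), np.array(rhs))
            except np.linalg.LinAlgError:
                continue
            if 0 <= g1 <= 1 and 0 <= g2 <= 1:
                return c1, c2, g1, g2
    return None

c1, c2, g1, g2 = solve()
f4 = binom.pmf(x, n, 0.4)
power = f4[(x > c1) & (x < c2)].sum() + g1 * f4[c1] + g2 * f4[c2]
print(c1, c2, g1, g2, power)
```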
25. Let F₁, …, F_{m+1} be real-valued functions defined over a space U. A sufficient condition for u₀ to maximize F_{m+1} subject to Fᵢ(u) ≤ Cᵢ (i = 1, …, m) is that it satisfies these side conditions, that it maximizes F_{m+1}(u) − Σkᵢ Fᵢ(u) for some constants kᵢ ≥ 0, and that Fᵢ(u₀) = Cᵢ for …
24. The following example shows that Corollary 4 does not extend to a countably infinite family of distributions. Let pₙ be the uniform probability density on [0, 1 + 1/n], and p₀ the uniform density on (0, 1). (i) Then p₀ is linearly independent of (p₁, p₂, …), that is, there do not exist …
23. Optimum selection procedures. On each member of a population n measurements (X₁, …, Xₙ) = X are taken, for example the scores of n aptitude tests which are administered to judge the qualifications of candidates for a certain training program. A future measurement Y such as the score in a …
22. If β(θ) denotes the power function of the UMP test of Corollary 2, and if the function Q of (12) is differentiable, then β′(θ) > 0 for all θ for which Q′(θ) > 0. [To show that β′(θ₀) > 0, consider the problem of maximizing, subject to E_{θ₀}φ(X) = α, the derivative β′(θ₀) or …
21. Confidence bounds with minimum risk. Let L(θ, θ̲) be nonnegative and nonincreasing in its second argument for θ̲ < θ, and equal to 0 for θ̲ ≥ θ. If θ̲ and θ̲* are two lower confidence bounds for θ such that P_θ{θ̲ ≤ θ′} ≤ P_θ{θ̲* ≤ θ′} for all θ′ ≤ θ, then E_θ L(θ, θ̲) ≤ E_θ L(θ, θ̲*). [Define two …
20. (i) For n = 5, 10 and 1 − α = .95, graph the upper confidence limits p̄ and p̄* of Example 7 as functions of t = x + u. (ii) For the same values of n and α₁ = α₂ = .05, graph the lower and upper confidence limits p̲ and p̄.
19. In the experiment discussed in Example 5, n binomial trials with probability of success p = 1 − e^(−λv) are performed for the purpose of testing λ = λ₀ against λ = λ₁. Experiments corresponding to two different values of v are not comparable.
18. For the 2 × 2 table described in Example 4, and under the assumption p ≤ π ≤ ½ made there, a sample from Ã is more informative than one from A. On the other hand, samples from B and B̃ are not comparable. [A necessary and sufficient condition for comparability is given in the preceding …
17. Conditions for comparability. (i) Let X and X′ be two random variables taking on the values 1 and 0, and suppose that P{X = 1} = p₀ or p₁ and P{X′ = 1} = p₀′ or p₁′. Without loss of generality let …
16. If the experiment (f, g) is more informative than (f', g'), then (g,f) is more informative than (g', f').
15. If F₀, F₁ are two cumulative distribution functions on the real line, then F₁(x) ≤ F₀(x) for all x if and only if E₀ψ(X) ≤ E₁ψ(X) for any nondecreasing function ψ.
14. Extension of Lemma 2. Let P₀ and P₁ be two distributions with densities p₀, p₁ such that p₁(x)/p₀(x) is a nondecreasing function of a real-valued statistic T(x). (i) If T has probability density p′ᵢ when the original distribution is Pᵢ, then p′₁(t)/p′₀(t) is nondecreasing in t. (ii) E₀ψ(T) ≤ E₁ψ(T) for any …
13. Let X be a single observation from the Cauchy density given at the end of Section 3. (i) Show that no UMP test exists for testing θ = 0 against θ > 0. (ii) Determine the totality of different shapes the MP level-α rejection region for testing θ = 0 against θ = θ₁ can take on for varying α and −∞ …
12. Let X = (X₁, …, Xₙ) be a sample from the uniform distribution U(θ, θ + 1). (i) For testing H: θ ≤ θ₀ against K: θ > θ₀ at level α there exists a UMP test which rejects when min(X₁, …, Xₙ) > θ₀ + C(α) or max(X₁, …, Xₙ) > θ₀ + 1 for suitable C(α). (ii) The family U(θ, θ + 1) does not have monotone likelihood ratio. …
11. When a Poisson process with rate λ is observed for a time interval of length τ, the number X of events occurring has the Poisson distribution P(λτ). Under an alternative scheme, the process is observed until r events have occurred, and the time T of observation is then a random variable such …
10. Let X₁, …, Xₙ be independently distributed with density (2θ)⁻¹ e^(−x/(2θ)), x ≥ 0, and let Y₁ ≤ ⋯ ≤ Yₙ be the ordered X's. Assume that Y₁ becomes available first, then Y₂, and so on, and that observation is continued until Y_r has been observed. On the basis of Y₁, …, Y_r it …
9. Let the probability density p_θ of X have monotone likelihood ratio in T(x), and consider the problem of testing H: θ ≤ θ₀ against θ > θ₀. If the distribution of T is continuous, the p-value α̂ of the UMP test is given by α̂ = P_{θ₀}{T ≥ t}, where t is the observed value of T. This holds also …
8. (i) A necessary and sufficient condition for densities p_θ(x) to have monotone likelihood ratio in x, if the mixed second derivative ∂²log p_θ(x)/∂θ∂x exists, is that this derivative is ≥ 0 for all θ and x. (ii) An equivalent condition is that p_θ(x) ∂²p_θ(x)/∂θ∂x ≥ (∂p_θ(x)/∂θ)(∂p_θ(x)/∂x).
7. Let X be the number of successes in n independent trials with probability p of success, and let φ(x) be the UMP test (9) for testing H: p ≤ p₀ against p > p₀ at level of significance α. (i) For n = 6, p₀ = .25 and the levels α = .05, .1, .2 determine C and γ, and find the power of the test against …
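For Problem 7(i), C is the smallest integer with P_{p₀}{X > C} ≤ α, and γ is chosen so the size is exactly α. The sketch below computes C, γ, and the power for n = 6, p₀ = .25 at each of the three levels; the alternative p = .4 is an illustrative choice, since the list of alternatives is cut off above.

```python
# UMP test (9) of H: p <= p0 against p > p0: reject when X > C, and with
# probability gamma when X = C.
from scipy.stats import binom

def ump_one_sided(n, p0, alpha):
    C = 0
    while binom.sf(C, n, p0) > alpha:   # smallest C with P(X > C) <= alpha
        C += 1
    gamma = (alpha - binom.sf(C, n, p0)) / binom.pmf(C, n, p0)
    return C, gamma

def power(n, p, C, gamma):
    return binom.sf(C, n, p) + gamma * binom.pmf(C, n, p)

n, p0, p1 = 6, 0.25, 0.4
for alpha in (0.05, 0.1, 0.2):
    C, gamma = ump_one_sided(n, p0, alpha)
    print(alpha, C, gamma, power(n, p1, C, gamma))
```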
6. Fully informative statistics. A statistic T is fully informative if for every decision problem the decision procedures based only on T form an essentially complete class. If 𝒫 is dominated and T is fully informative, then T is sufficient. [Consider any pair of distributions P₀, P₁ ∈ 𝒫 with …
5. If the sample space 𝒳 is Euclidean and P₀, P₁ have densities with respect to Lebesgue measure, there exists a nonrandomized most powerful test for testing P₀ against P₁ at every significance level α. [This is a consequence of Theorem 1 and the following lemma. Let f ≥ 0 and ∫_A f(x) dx = a. Given …
4. The following example shows that the power of a test can sometimes be increased by selecting a random rather than a fixed sample size even when the randomization does not depend on the observations. Let X₁, …, Xₙ be independently distributed as N(θ, 1), and consider the problem of testing …
3. UMP test for exponential densities. Let X₁, …, Xₙ be a sample from the exponential distribution E(a, b) of Chapter 1, Problem 18, and let X₍₁₎ = min(X₁, …, Xₙ). (i) Determine the UMP test for testing H: a = a₀ against K: a ≠ a₀ when b is assumed known. (ii) The power of any MP …
2. UMP test for U(0, θ). Let X = (X₁, …, Xₙ) be a sample from the uniform distribution on (0, θ). (i) For testing H: θ ≤ θ₀ against K: θ > θ₀ any test is UMP at level α for which E_{θ₀}φ(X) = α, E_θφ(X) ≤ α for θ ≤ θ₀, and φ(x) = 1 when max(x₁, …, xₙ) > θ₀. (ii) For testing H: θ = θ₀ against K: …
1. Let X₁, …, Xₙ be a sample from the normal distribution N(ξ, σ²). (i) If σ = σ₀ (known), there exists a UMP test for testing H: ξ ≤ ξ₀ against ξ > ξ₀, which rejects when Σ(Xᵢ − ξ₀) is too large. (ii) If ξ = ξ₀ (known), there exists a UMP test for testing H: σ ≤ σ₀ against K: σ > σ₀, which …
16. Let Ω be the natural parameter space of the exponential family (35), and for any fixed t_{r+1}, …, t_k (r < k) let Ω_{θ₁,…,θ_r} be the natural parameter space of the family of conditional distributions given T_{r+1} = t_{r+1}, …, T_k = t_k. (i) Then Ω_{θ₁,…,θ_r} contains the projection of Ω onto (θ₁, …, θ_r) …
15. For any θ which is an interior point of the natural parameter space, the expectations and covariances of the statistics in the exponential family (35) are given by E[Tⱼ(X)] = −∂log C(θ)/∂θⱼ (j = 1, …, k), and E[Tᵢ(X)Tⱼ(X)] − [E Tᵢ(X)][E Tⱼ(X)] = −∂²log C(θ)/∂θᵢ∂θⱼ (i, j = 1, …, k).
14. Life testing. Let X₁, …, Xₙ be independently distributed with exponential density (2θ)⁻¹ e^(−x/(2θ)) for x ≥ 0, and let the ordered X's be denoted by Y₁ ≤ Y₂ ≤ ⋯ ≤ Yₙ. It is assumed that Y₁ becomes available first, then Y₂, and so on, and that observation is continued until Y_r has …
13. Let Xᵢ (i = 1, …, s) be independently distributed with Poisson distribution P(λᵢ), and let T₀ = ΣXⱼ, Tᵢ = Xᵢ, Λ = Σλⱼ. Then T₀ has the Poisson distribution P(Λ), and the conditional distribution of T₁, …, T_{s−1} given T₀ = t₀ is the multinomial distribution (34) with n = t₀ and pᵢ = …
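The identity in Problem 13 is easy to check by simulation: conditional on the total T₀ = t₀, the counts should behave like a multinomial sample with cell probabilities λᵢ/Λ. The Monte Carlo sketch below uses arbitrary illustrative values of λ and t₀.

```python
# Empirical check that independent Poisson counts, conditioned on their total,
# have multinomial conditional means t0 * lambda_i / sum(lambda).
import numpy as np

rng = np.random.default_rng(0)
lam = np.array([1.0, 2.0, 3.0])
t0 = 6
draws = rng.poisson(lam, size=(200_000, 3))
cond = draws[draws.sum(axis=1) == t0]   # keep draws whose total equals t0
print(cond.mean(axis=0))                # empirical conditional means
print(t0 * lam / lam.sum())             # multinomial means t0 * p_i
```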
12. For a decision problem with a finite number of decisions, the class of procedures depending on a sufficient statistic T only is essentially complete. [For Euclidean sample spaces this follows from Theorem 4 without any restriction on the decision space. For the present case, let a decision
11. If a statistic T is sufficient for 𝒫, then for every function f which is (𝒜, P_θ)-integrable for all θ ∈ Ω there exists a determination of the conditional expectation function E_θ[f(X)|t] that is independent of θ. [If 𝒳 is Euclidean, this follows from Theorems 5 and 7. In general, if f is …
10. Pairwise sufficiency. A statistic T is pairwise sufficient for 𝒫 if it is sufficient for every pair of distributions in 𝒫. (i) If 𝒫 is countable and T is pairwise sufficient for 𝒫, then T is sufficient for 𝒫. (ii) If 𝒫 is a dominated family and T is pairwise sufficient for 𝒫, then T is …
9. Sufficiency of likelihood ratios. Let P₀, P₁ be two distributions with densities p₀, p₁. Then T(x) = p₁(x)/p₀(x) is sufficient for 𝒫 = {P₀, P₁}. [This follows from the factorization criterion by writing p₁ = T · p₀, p₀ = 1 · p₀.]
8. Symmetric distributions. (i) Let 𝒫 be any family of distributions of X = (X₁, …, Xₙ) which are symmetric in the sense that P{(X_{i₁}, …, X_{iₙ}) ∈ A} = P{(X₁, …, Xₙ) ∈ A} for all Borel sets A and all permutations (i₁, …, iₙ) of (1, …, n). Then the statistic T of Example 7 is …
7. Let 𝒳 = 𝒴 × 𝒯, and suppose that P₀, P₁ are two probability distributions given by dP₀(y, t) = f(y)g(t) dμ(y) dν(t), dP₁(y, t) = h(y, t) dμ(y) dν(t), where h(y, t)/f(y)g(t) < ∞. Then under P₁ the probability density of Y with respect to μ is p₁′(y) = f(y) E₀[h(y, T)/(f(y)g(T)) | Y = y] …
6. Prove Theorem 4 for the case of an n-dimensional sample space. [The condition that the cumulative distribution function is nondecreasing is replaced by the requirement that the probabilities it assigns to rectangles {x₁ < X₁ ≤ x₁′, …, xₙ < Xₙ ≤ xₙ′} be nonnegative; that it is continuous on the right can be stated as lim_{m→∞} F(x₁ + 1/m, …, xₙ + 1/m) = F(x₁, …, xₙ).]
5. (i) Let 𝒫 be any family of distributions of X = (X₁, …, Xₙ) such that P{(Xᵢ, X_{i+1}, …, Xₙ, X₁, …, X_{i−1}) ∈ A} = P{(X₁, …, Xₙ) ∈ A} for all Borel sets A and all i = 1, …, n. For any sample point (x₁, …, xₙ) define (y₁, …, yₙ) = (x_j, x_{j+1}, …, xₙ, x₁, …, x_{j−1}), where x_j = x₍₁₎ = min(x₁, …, xₙ). Then the conditional expectation of f(X) given Y = y is …
4. Let (𝒳, 𝒜) be a measurable space, and 𝒜₀ a σ-field contained in 𝒜. Suppose that for any function T, the σ-field ℬ is taken as the totality of sets B such that T⁻¹(B) ∈ 𝒜. Then it is not necessarily true that there exists a function T such that T⁻¹(ℬ) = 𝒜₀. [An example is furnished …
3. If f(x) > 0 for all x ∈ S and μ is σ-finite, then ∫_S f dμ = 0 implies μ(S) = 0. [Let Sₙ be the subset of S on which f(x) ≥ 1/n. Then μ(S) ≤ Σμ(Sₙ) and μ(Sₙ) ≤ n∫_{Sₙ} f dμ ≤ n∫_S f dμ = 0.]
2. Radon-Nikodym derivatives. (i) If λ and μ are σ-finite measures over (𝒳, 𝒜) and μ is absolutely continuous with respect to λ, then ∫f dμ = ∫f (dμ/dλ) dλ for any μ-integrable function f. (ii) If λ, μ, and ν are σ-finite measures over (𝒳, 𝒜) such that ν is absolutely continuous with …
1. Monotone class. A class ℱ of subsets of a space is a field if it contains the whole space and is closed under complementation and under finite unions; a class ℳ is monotone if the union and intersection of every increasing and decreasing sequence of sets of ℳ is again in ℳ. The smallest …
19. A statistic T satisfying (17)-(19) is sufficient if and only if it satisfies (20).
18. (i) Let X₁, …, Xₙ be a sample from the uniform distribution U(0, θ), 0 < θ < ∞, and let T = max(X₁, …, Xₙ). Show that T is sufficient, once by using the definition of sufficiency and once by using the factorization criterion and assuming the existence of statistics Tᵢ satisfying …
17. In n independent trials with constant probability p of success, let Xᵢ = 1 or 0 as the ith trial is a success or not. Then Σⁿᵢ₌₁ Xᵢ is minimal sufficient. [Let T = ΣXᵢ and suppose that V = f(T) is sufficient and that f(k₁) = ⋯ = f(k_r) = u. Then P{T = t|V = u} depends on p.]
16. (i) Let X take on the values θ − 1 and θ + 1 with probability ½ each. The problem of estimating θ with loss function L(θ, d) = min(|θ − d|, 1) remains invariant under the transformation gX = X + c, gθ = θ + c, g*d = d + c. Among invariant estimates, those taking on the values X − 1 and X + 1 with …
15. Admissibility of invariant procedures. If a decision problem remains invariant under a finite group, and if there exists a procedure δ₀ that uniformly minimizes the risk among all invariant procedures, then δ₀ is admissible. [This follows from the identity R(θ, δ) = R(ḡθ, g*δg⁻¹) and the hint …
14. Admissibility of unbiased procedures. (i) Under the assumptions of Problem 10, if among the unbiased procedures there exists one with uniformly minimum risk, it is admissible. (ii) That in general an unbiased procedure with uniformly minimum risk need not be admissible is seen by the following
13. (i) Let X₁, …, Xₙ be a sample from N(ξ, σ²), and consider the problem of deciding between ω₀: ξ < 0 and ω₁: ξ ≥ 0. If x̄ = Σxᵢ/n and C = (a₁/a₀)^(2/n), the likelihood-ratio procedure takes decision d₀ or d₁ as √n x̄/√(Σ(xᵢ − x̄)²) < k or > k, where k = −√((C − 1)/C) if C > 1 and k = √((1 − C)/C) if C < 1.
12. (i) Let X have probability density p_θ(x) with θ one of the values θ₁, …, θₙ, and consider the problem of determining the correct value of θ, so that the choice lies between the n decisions d₁ = θ₁, …, dₙ = θₙ with gain a(θᵢ) if dᵢ = θᵢ and 0 otherwise. Then the Bayes solution (which …
11. Invariance and minimax. Let a problem remain invariant relative to the groups G, Ḡ, and G* over the spaces 𝒳, Ω, and D respectively. Then a randomized procedure Y_x is defined to be invariant if for all x and g the conditional distribution of Y_x given x is the same as that of g*⁻¹Y_{gx}. (i) …
10. Unbiasedness and minimax. Let Ω = Ω₀ ∪ Ω₁ where Ω₀, Ω₁ are mutually exclusive, and consider a two-decision problem with loss function L(θ, dᵢ) = aᵢ for θ ∈ Ωⱼ (j ≠ i) and L(θ, dᵢ) = 0 for θ ∈ Ωᵢ (i = 0, 1). (i) Any minimax procedure is unbiased. (ii) The converse of (i) holds provided P_θ(A) …
9. (i) As an example in which randomization reduces the maximum risk, suppose that a coin is known to be either standard (HT) or to have heads on both sides (HH). The nature of the coin is to be decided on the basis of a single toss, the loss being 1 for an incorrect decision and 0 for a correct
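The risk calculation behind Problem 9(i) is short enough to write out. Tails identifies the standard coin, so only the behavior after heads matters: a rule that then decides "two-headed" with probability q has risk q/2 under HT and 1 − q under HH, and the randomized choice q = 2/3 beats both nonrandomized rules. The snippet below simply tabulates these maximum risks; it is an illustration, not the book's wording.

```python
# Maximum risk of deciding HT vs HH from one toss, as a function of the
# probability q of deciding "HH" after observing heads.
def max_risk(q):
    risk_ht = 0.5 * q      # under HT: P(heads) * P(decide HH | heads)
    risk_hh = 1.0 - q      # under HH: always heads, decide HT w.p. 1 - q
    return max(risk_ht, risk_hh)

for q in (0.0, 1.0, 2 / 3):   # the two nonrandomized rules and the minimax mix
    print(q, max_risk(q))     # -> 1.0, 0.5, and 1/3 respectively
```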
8. Structure of Bayes solutions. (i) Let Θ be an unobservable random quantity with probability density ρ(θ), and let the probability density of X be p_θ(x) when Θ = θ. Then δ is a Bayes solution of a given decision problem if for each x the decision δ(x) is chosen so as to minimize ∫L(θ, δ(x)) π(θ|x) dθ, …
7. Unbiasedness in interval estimation. Confidence intervals I = (I̲, Ī) are unbiased for estimating θ with loss function L(θ, I) = (θ − I̲)² + (Ī − θ)² provided E[½(I̲ + Ī)] = θ for all θ, that is, provided the midpoint of I is an unbiased estimate of θ in the sense of (11). † Here and in …