ST107 Exercise 7

In this exercise you will practise point estimation by investigating the properties of different estimators. Question 1 involves the estimation of the probability of success, π, based on two independent binomial random variables (note these are not identically distributed, due to the different numbers of trials in X and Y, i.e. 6 and 8, respectively). You will need to use the result about the variance of a linear combination of independent random variables, as well as the definition of conditional probability. Question 2 requires you to choose between competing estimators of θ, for which we seek the estimator with the smallest mean squared error (MSE). Question 3 involves biased estimators, while finally Question 4 is a standard (but useful!) proof.

1. Let X be a random variable with a binomial distribution with parameters 6 and π, and let Y be a random variable with a binomial distribution with parameters 8 and π. X and Y are independent.

(a) Consider the three different estimators X/6, Y/8 and (X + Y)/14. Show that they are all unbiased estimators of π.

(b) Which of the three estimators in (a) would you prefer, and why?

(c) Suppose now that π = 0.5. Calculate the conditional probability P(X = 1 | X + Y = 2).

2. Let T1 and T2 be two unbiased estimators of the parameter θ. T1 and T2 have the same variance and are independent of each other. Consider the following estimators:

S = (T1 + T2)/2 and R = (2T1 + T2)/3.

(a) Are S and R both unbiased estimators of θ?

(b) Which one of the three estimators T1, S and R is the best estimator of θ, and which one is the worst? Explain your answer.

3. A random variable X can take the values −1, 0 and 1. We know that:

P(X = −1) = (6 − θ)/10, P(X = 0) = (θ + 1)/10 and P(X = 1) = 3/10.

One observation is taken and we want to estimate θ. Consider the estimators:

T1 = X and T2 = ½ X(X − 1).

(a) Determine the bias of T1 and T2 as estimators of θ.

(b) Derive the mean squared error of T1.

4.
The MSE of an estimator θ̂ is the average squared error, defined as:

MSE(θ̂) = E[(θ̂ − θ)²].

Show how this can be decomposed into variance and bias components such that:

MSE(θ̂) = Var(θ̂) + (Bias(θ̂))².
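As a numerical sanity check on Questions 1 and 2, the short script below computes the relevant quantities exactly using only the Python standard library. The helper `binom_pmf` and the choice `p = 0.5` (the value of π from part 1(c); parts (a) and (b) hold for any p) are illustrative assumptions, not part of the exercise.

```python
from math import comb

def binom_pmf(n: int, k: int, p: float) -> float:
    """P(B = k) for B ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

p = 0.5  # pi from part 1(c); any p in (0, 1) works for parts (a) and (b)

# 1(a): exact expectations -- E[X/6], E[Y/8] and E[(X+Y)/14] all equal p.
EX = sum(k * binom_pmf(6, k, p) for k in range(7))
EY = sum(k * binom_pmf(8, k, p) for k in range(9))
est_means = (EX / 6, EY / 8, (EX + EY) / 14)

# 1(b): Var(X/n) = p(1-p)/n, so the pooled estimator (n = 14)
# has the smallest variance of the three.
variances = {n: p * (1 - p) / n for n in (6, 8, 14)}

# 1(c): P(X = 1 | X + Y = 2) = P(X = 1) P(Y = 1) / P(X + Y = 2),
# using independence and the fact that X + Y ~ Binomial(14, p).
cond_prob = binom_pmf(6, 1, p) * binom_pmf(8, 1, p) / binom_pmf(14, 2, p)

# 2: with Var(T1) = Var(T2) = v and independence,
# Var(S) = v/4 + v/4 = v/2 and Var(R) = 4v/9 + v/9 = 5v/9,
# so S is the best of the three and T1 alone is the worst.
v = 1.0
var_S, var_R = v / 4 + v / 4, 4 * v / 9 + v / 9

print(est_means)  # each component equals p
print(variances)
print(cond_prob)  # 48/91, which does not depend on p
print(var_S, var_R)
```

Note that the conditional probability reduces to C(6,1)C(8,1)/C(14,2) = 48/91: the p-dependence cancels, which is the hypergeometric structure of a binomial conditioned on the total.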
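For Question 4, the standard argument adds and subtracts E(θ̂) inside the square; a sketch of the decomposition:

```latex
\begin{aligned}
\mathrm{MSE}(\hat\theta)
  &= E\big[(\hat\theta - \theta)^2\big]
   = E\Big[\big(\{\hat\theta - E(\hat\theta)\} + \{E(\hat\theta) - \theta\}\big)^2\Big] \\
  &= E\big[(\hat\theta - E(\hat\theta))^2\big]
   + 2\big(E(\hat\theta) - \theta\big)\,E\big[\hat\theta - E(\hat\theta)\big]
   + \big(E(\hat\theta) - \theta\big)^2 \\
  &= \mathrm{Var}(\hat\theta) + \big(\mathrm{Bias}(\hat\theta)\big)^2 ,
\end{aligned}
```

where the cross term vanishes because E[θ̂ − E(θ̂)] = 0, and E(θ̂) − θ is the bias, a non-random constant.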
