# Question

Suppose that we want to estimate the parameter θ of the geometric distribution on the basis of a single observation. If the loss function is given by the squared error

L(d(x), θ) = [d(x) − θ]²

and Θ is looked upon as a random variable having the uniform density h(θ) = 1 for 0 < θ < 1 and h(θ) = 0 elsewhere, duplicate the steps in Example 9.9 to show that

(a) The conditional density of Θ given X = x is

φ(θ | x) = x(x + 1)·θ(1 − θ)^(x − 1) for 0 < θ < 1, and φ(θ | x) = 0 elsewhere;

(b) The Bayes risk is minimized by the decision function d(x) = 2/(x + 2).
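As a quick numerical sanity check (a sketch, not part of the original exercise; the function names are my own), the posterior mean under squared-error loss can be computed with the beta-integral identity ∫₀¹ θ^a (1 − θ)^b dθ = a! b! / (a + b + 1)! and compared against d(x) = 2/(x + 2):

```python
from math import factorial

def beta_int(a, b):
    # ∫_0^1 θ^a (1 - θ)^b dθ = a! * b! / (a + b + 1)!  for nonnegative integers a, b
    return factorial(a) * factorial(b) / factorial(a + b + 1)

def posterior_mean(x):
    # The posterior of Θ given X = x is proportional to θ(1 - θ)^(x - 1) on (0, 1),
    # since the geometric likelihood θ(1 - θ)^(x - 1) is multiplied by h(θ) = 1.
    norm = beta_int(1, x - 1)           # normalizing constant = 1 / [x(x + 1)]
    return beta_int(2, x - 1) / norm    # E(Θ | X = x), the Bayes estimate

for x in range(1, 8):
    assert abs(posterior_mean(x) - 2 / (x + 2)) < 1e-12
```

Because the loss is squared error, the Bayes estimate is the posterior mean, so agreement with 2/(x + 2) for each x confirms part (b).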
