# Question

Suppose you want to calculate P(a < x̄ < b), where a and b are two numbers and x̄ is the mean of a random sample drawn from a population with mean µ and standard deviation σ. If a < µ < b (i.e., µ lies in the interval from a to b), what happens to the probability P(a < x̄ < b) as the sample size becomes larger?

## Answer

The sampling distribution of x̄ has mean µ and standard deviation (standard error) σ/√n, assuming n/N < .05 so the finite-population correction can be ignored. As the sample size n increases, σ/√n decreases, so the distribution of x̄ concentrates more and more tightly around µ. Because µ lies inside the interval (a, b), an ever-larger share of that distribution falls between a and b. Hence P(a < x̄ < b) increases as n grows, approaching 1 for large samples.

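As a numerical illustration (the values µ = 50, σ = 10, a = 47, b = 53 are assumed here for demonstration and are not part of the original problem), the sketch below uses the normal approximation for x̄ to show the probability rising toward 1 as n grows:

```python
from math import sqrt
from statistics import NormalDist

mu, sigma = 50.0, 10.0   # assumed population mean and standard deviation
a, b = 47.0, 53.0        # an interval that contains mu

for n in (4, 25, 100, 400):
    se = sigma / sqrt(n)          # standard error of the sample mean
    xbar = NormalDist(mu, se)     # x-bar is approximately N(mu, sigma/sqrt(n))
    p = xbar.cdf(b) - xbar.cdf(a)
    print(f"n = {n:4d}   P({a} < x-bar < {b}) = {p:.4f}")
```

Each fourfold increase in n halves the standard error, pushing more of the sampling distribution inside (a, b), so the printed probabilities increase monotonically toward 1.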
