# Question

A large discount chain compares the performance of its credit managers in Ohio and Illinois by comparing the mean dollar amounts owed by customers with delinquent charge accounts in these two states. Here a small mean dollar amount owed is desirable because it indicates that bad credit risks are not being extended large amounts of credit. Two independent, random samples of delinquent accounts are selected from the populations of delinquent accounts in Ohio and Illinois, respectively. The first sample, which consists of 10 randomly selected delinquent accounts in Ohio, gives a mean dollar amount of $524 with a standard deviation of $68. The second sample, which consists of 20 randomly selected delinquent accounts in Illinois, gives a mean dollar amount of $473 with a standard deviation of $22.

a. Set up the null and alternative hypotheses needed to test whether there is a difference between the population mean dollar amounts owed by customers with delinquent charge accounts in Ohio and Illinois.

b. Figure 10.6 gives the MINITAB output of using the unequal variances procedure to test the equality of mean dollar amounts owed by customers with delinquent charge accounts in Ohio and Illinois. Assuming that the normality assumption holds, test the hypotheses you set up in part a by setting α equal to .10, .05, .01, and .001. How much evidence is there that the mean dollar amounts owed in Ohio and Illinois differ?

c. Assuming that the normality assumption holds, calculate a 95 percent confidence interval for the difference between the mean dollar amounts owed in Ohio and Illinois. Based on this interval, do you think that these mean dollar amounts differ in a practically important way?
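The Welch (unequal variances) test statistic and confidence interval in parts b and c can be reproduced from the summary statistics alone. The sketch below is a minimal illustration, assuming the Welch–Satterthwaite degrees of freedom (which truncate to 9 here) and the t-table critical value t<sub>.025, 9</sub> = 2.262; the MINITAB output in Figure 10.6 should agree up to rounding.

```python
import math

# Summary statistics from the problem (Ohio = sample 1, Illinois = sample 2)
n1, xbar1, s1 = 10, 524.0, 68.0
n2, xbar2, s2 = 20, 473.0, 22.0

# Standard error of (xbar1 - xbar2) for the unequal-variances (Welch) procedure
se = math.sqrt(s1**2 / n1 + s2**2 / n2)

# Test statistic for H0: mu1 - mu2 = 0 versus Ha: mu1 - mu2 != 0
t_stat = (xbar1 - xbar2) / se

# Welch-Satterthwaite degrees of freedom (truncated to an integer in practice)
num = (s1**2 / n1 + s2**2 / n2) ** 2
den = (s1**2 / n1) ** 2 / (n1 - 1) + (s2**2 / n2) ** 2 / (n2 - 1)
df = num / den

# 95 percent confidence interval; t_{.025, 9} = 2.262 from a t table (df truncates to 9)
t_crit = 2.262
lo = (xbar1 - xbar2) - t_crit * se
hi = (xbar1 - xbar2) + t_crit * se

print(f"t = {t_stat:.3f}, df = {df:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
# -> t = 2.312, df = 9.95, 95% CI = (1.10, 100.90)
```

Since the interval (1.10, 100.90) excludes 0 but its lower endpoint is barely above 0, the difference is statistically detectable at the .05 level, yet the interval is too wide to say whether the difference is practically important.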

