Question:

Investors commonly use the standard deviation of a mutual fund's monthly percentage return as a measure of the fund's risk; a fund with a larger standard deviation is considered riskier than one with a smaller standard deviation. The standard deviations for the American Century Equity Growth fund and the Fidelity Growth Discovery fund were recently reported to be 15.0% and 18.9%, respectively (The Top Mutual Funds, AAII, 2009). Assume that each of these standard deviations is based on a sample of 60 months of returns. Do the sample results support the conclusion that the Fidelity fund has a larger population variance than the American Century fund? Which fund is riskier?

Step by Step Answer:
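One way to sketch the required one-tailed F test for equality of two population variances (this is a standard approach, not the book's worked solution; it assumes Python with SciPy available). The larger sample variance goes in the numerator, and each sample of 60 months gives 59 degrees of freedom:

```python
from scipy import stats

# Sample standard deviations of monthly % returns, and sample size
s_fidelity = 18.9   # Fidelity Growth Discovery
s_american = 15.0   # American Century Equity Growth
n = 60              # months of returns for each fund

# H0: sigma_F^2 <= sigma_A^2   vs   Ha: sigma_F^2 > sigma_A^2
# Upper-tail F test: larger sample variance in the numerator
F = s_fidelity**2 / s_american**2
df1 = df2 = n - 1   # 59 degrees of freedom each

# p-value = P(F_{59,59} >= observed F)
p_value = stats.f.sf(F, df1, df2)

print(f"F = {F:.4f}, p-value = {p_value:.4f}")
# With F = 18.9^2 / 15.0^2 ≈ 1.588 and a p-value below 0.05, we reject
# H0 at the 0.05 level: the data support the conclusion that the
# Fidelity fund has the larger population variance, so by this measure
# the Fidelity Growth Discovery fund is the riskier fund.
```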

Related Book:

Statistics For Business And Economics, 11th Edition, by David R. Anderson, Dennis J. Sweeney, Thomas A. Williams. ISBN: 9780538481649.
