# Question: The mean time taken to learn the basics of a software program

The mean time taken to learn the basics of a software program by all students is 200 minutes with a standard deviation of 20 minutes.

a. Using Chebyshev’s theorem, find at least what percentage of students will learn the basics of this software program in

i. 160 to 240 minutes

ii. 140 to 260 minutes

b. Using Chebyshev’s theorem, find the interval that contains the times taken by at least 75% of all students to learn this software program.

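Chebyshev's theorem states that, for any distribution, at least \(1 - 1/k^2\) of the values lie within \(k\) standard deviations of the mean (for \(k > 1\)). A minimal Python sketch of the arithmetic for all three parts (the function name `chebyshev_min_fraction` is just illustrative):

```python
# Chebyshev's theorem: for any distribution, at least 1 - 1/k**2 of the
# values lie within k standard deviations of the mean (valid for k > 1).

def chebyshev_min_fraction(k: float) -> float:
    """Minimum fraction of values within k standard deviations of the mean."""
    return 1 - 1 / k**2

mean, sd = 200, 20  # minutes

# Part a(i): 160 to 240 minutes -> k = (240 - 200) / 20 = 2
k1 = (240 - mean) / sd
print(f"160-240 min: at least {chebyshev_min_fraction(k1):.0%}")    # at least 75%

# Part a(ii): 140 to 260 minutes -> k = (260 - 200) / 20 = 3
k2 = (260 - mean) / sd
print(f"140-260 min: at least {chebyshev_min_fraction(k2):.2%}")    # about 88.89%

# Part b: solve 1 - 1/k**2 = 0.75 for k, giving k = 2,
# so the interval is mean +/- 2*sd = 160 to 240 minutes.
k = (1 / (1 - 0.75)) ** 0.5
print(f"75% interval: {mean - k*sd:.0f} to {mean + k*sd:.0f} minutes")
```

So at least 75% of students finish in 160 to 240 minutes, at least about 88.89% finish in 140 to 260 minutes, and the interval guaranteed to contain at least 75% of the times is 160 to 240 minutes.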
