An experiment was carried out by geologists to see how the time necessary to drill a distance of 5 feet in rock (y, in minutes) depended on the depth at which the drilling began (x, in feet, between 0 and 400). We show part of the Minitab output obtained from fitting the simple linear regression model (“Mining Information,” American Statistician [1991]: 4–9).
a. What proportion of observed variation in time can be explained by the simple linear regression model?
b. Does the simple linear regression model appear to be useful?
c. Minitab reported that s_{a+b(200)} = .347. Calculate a 95% confidence interval for the mean time when depth = 200 feet.
d. A single observation on time is to be made when drilling starts at a depth of 200 feet. Use a 95% prediction interval to predict the resulting value of time.
e. Minitab gave (8.147, 10.065) as a 95% confidence interval for mean time when depth = 300. Calculate a 99% confidence interval for this mean.
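A minimal sketch of the arithmetic behind part (e): a 95% confidence interval for the mean has the form ŷ ± t.025 · s_{a+b(300)}, so the midpoint of (8.147, 10.065) recovers ŷ, dividing the half-width by t.025 recovers the standard error, and multiplying by t.005 instead gives the 99% interval. The error degrees of freedom come from the Minitab output, which is not reproduced here, so the t critical values below assume a hypothetical df = 10 (standard t-table values for that df).

```python
# Rescale a 95% CI for the mean to a 99% CI.
# Assumption: error df = 10, so t.025 = 2.228 and t.005 = 3.169
# (the actual df would be read from the Minitab output).
lo95, hi95 = 8.147, 10.065
t_025, t_005 = 2.228, 3.169

center = (lo95 + hi95) / 2        # y-hat, the point estimate of mean time
se = (hi95 - lo95) / 2 / t_025    # back out s_{a+b(300)} from the half-width
lo99 = center - t_005 * se
hi99 = center + t_005 * se
print(center, round(se, 4))
print(round(lo99, 3), round(hi99, 3))
```

As expected, the 99% interval is wider than the 95% interval but shares the same center; only the t multiplier changes.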
