
Question:

An experiment was carried out by geologists to see how the time necessary to drill a distance of 5 feet in rock (y, in minutes) depended on the depth at which the drilling began (x, in feet, between 0 and 400). We show part of the Minitab output obtained from fitting the simple linear regression model ("Mining Information," American Statistician [1991]: 4–9).

a. What proportion of observed variation in time can be explained by the simple linear regression model?
b. Does the simple linear regression model appear to be useful?
c. Minitab reported that s_{a+b(200)} = 0.347. Calculate a 95% confidence interval for the mean time when depth = 200 feet.
d. A single observation on time is to be made when drilling starts at a depth of 200 feet. Use a 95% prediction interval to predict the resulting value of time.
e. Minitab gave (8.147, 10.065) as a 95% confidence interval for the mean time when depth = 300 feet. Calculate a 99% confidence interval for this mean.
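
Parts (c) through (e) only require a point estimate, a standard error, and a t critical value, so a short sketch may help make the arithmetic concrete. The Python sketch below applies the usual simple linear regression interval formulas: for part (c), ŷ ± t(.025, n-2) · s_{a+b(200)}; for part (d), ŷ ± t(.025, n-2) · sqrt(s² + s_{a+b(200)}²); and for part (e), recovering the standard error from the reported 95% interval and rescaling it with the 99% critical value. The fitted value at depth 200, the residual standard deviation s, and the error degrees of freedom come from the Minitab output, which is not reproduced here, so y_hat_200, s, and df below are placeholders rather than values from the problem; only se_mean_200 = 0.347 and the interval (8.147, 10.065) are taken from the statement above.

# Sketch of the interval calculations for parts (c)-(e).
# Placeholder inputs: replace with the values from the Minitab output.
from scipy import stats
import math

df = 15              # placeholder: error df = n - 2
y_hat_200 = 9.0      # placeholder: fitted mean time at depth 200, a + b(200)
s = 1.0              # placeholder: residual standard deviation (Root MSE)
se_mean_200 = 0.347  # standard error of a + b(200), as reported by Minitab

# (c) 95% CI for the mean time at depth 200
t95 = stats.t.ppf(0.975, df)                 # t(.025, df)
ci_c = (y_hat_200 - t95 * se_mean_200,
        y_hat_200 + t95 * se_mean_200)

# (d) 95% prediction interval for a single time at depth 200
se_pred = math.sqrt(s**2 + se_mean_200**2)   # prediction standard error
pi_d = (y_hat_200 - t95 * se_pred,
        y_hat_200 + t95 * se_pred)

# (e) 99% CI for the mean time at depth 300, from the reported 95% CI
center = (8.147 + 10.065) / 2                # midpoint = a + b(300)
se_300 = (10.065 - 8.147) / 2 / t95          # half-width / t(.025, df)
t99 = stats.t.ppf(0.995, df)                 # t(.005, df)
ci_e = (center - t99 * se_300, center + t99 * se_300)

print(ci_c, pi_d, ci_e)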
