Question:

An image-encoding algorithm, when used to encode images of a certain size, takes a mean time of 110 milliseconds with a standard deviation of 15 milliseconds. What is the probability that the mean time (in milliseconds) for encoding 50 randomly selected images of this size will be between 90 and 135 milliseconds? What assumptions do we need to make?

Step by Step Answer:
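
A solution sketch using the Central Limit Theorem (CLT), which is the standard approach to this problem; the wording below is not necessarily the book's own. With n = 50 images, population mean μ = 110 ms, and population standard deviation σ = 15 ms, the sample mean X̄ has mean μ and standard error σ/√n:

\[
\mu_{\bar X} = \mu = 110, \qquad
\sigma_{\bar X} = \frac{\sigma}{\sqrt{n}} = \frac{15}{\sqrt{50}} \approx 2.121 \text{ ms}.
\]

\[
P(90 < \bar X < 135)
= P\!\left(\frac{90 - 110}{2.121} < Z < \frac{135 - 110}{2.121}\right)
= P(-9.43 < Z < 11.79) \approx 1.
\]

Both z-bounds lie far in the tails (Φ(−9.43) ≈ 0 and Φ(11.79) ≈ 1), so the probability is essentially 1.0000. Assumptions: the 50 images form a random sample, so the individual encoding times are independent and identically distributed with mean 110 ms and standard deviation 15 ms; and n = 50 is large enough for the CLT to make the sampling distribution of X̄ approximately normal, even though the shape of the underlying encoding-time distribution is unknown.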

Related Book:

Mathematical Statistics With Applications In R, 2nd Edition
Authors: Chris P. Tsokos, K. M. Ramachandran
ISBN: 9780124171138
