Question:

Suppose a remote disk server system is configured so that the disk on a server has an average block transfer time of 8 milliseconds, while a local disk for a client has an average block transfer time of 88 milliseconds (this would be a very slow disk by today’s standards).

a. How fast must the throughput be on the network for the remote disk server to have an access time lower than that of a local disk?

b. What three network protocol parameters/factors would most affect this simple analysis? Why?


Step by Step Answer:
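For part (a), the problem does not give a block size, so the required throughput can only be stated in terms of it: the remote path costs the network transfer of one block plus the 8 ms server disk transfer, and this must come in under the 88 ms local disk transfer, leaving an 80 ms budget for the network. The sketch below works this out under stated assumptions; the block sizes tried (512 B, 4 KB, 8 KB) and the function name are illustrative, not part of the original problem, and the model deliberately ignores latency and protocol overhead.

```python
# Sketch of the part (a) calculation under an assumed block size.
# Model: remote access time = network transfer time + remote disk transfer time,
# and it must be lower than the local disk transfer time.

def required_throughput(block_size_bytes: float,
                        remote_disk_ms: float = 8.0,
                        local_disk_ms: float = 88.0) -> float:
    """Minimum network throughput (bits per second) for which remote
    access beats the local disk, ignoring latency and protocol overhead."""
    # The network transfer of one block must fit in the time budget
    # left over after the remote disk's own transfer time.
    budget_s = (local_disk_ms - remote_disk_ms) / 1000.0  # 80 ms budget
    return (block_size_bytes * 8) / budget_s              # bits per second

if __name__ == "__main__":
    for size in (512, 4096, 8192):  # assumed block sizes in bytes
        bps = required_throughput(size)
        print(f"{size:>5} B block: need > {bps:,.0f} bit/s (~{bps / 1000:.1f} kbit/s)")
```

For example, with an assumed 4 KB block the budget of 80 ms requires roughly 410 kbit/s of sustained throughput. For part (b), the factors this simple analysis omits are exactly the ones that would most affect it: per-request latency (propagation and round-trip delay for the request and reply), protocol overhead (headers, acknowledgments, and message processing), and retransmission after packet loss; each adds time on top of the raw block transfer and can consume the 80 ms budget.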
