Question: Suppose that a server contains a 4 GB ultra-high-definition video (recall that 1 GB = 2^30 bytes = 2^33 bits ≈ 8.59 × 10^9 bits). The server has an upload rate of 100 Mbps = 1 × 10^8 bits/second. The server is connected to 100 machines, each of which can support an upload rate of 10 Mbps = 1 × 10^7 bits/second. Assume that the download speed of every machine is at least 500 Mbps, so the bottleneck is always the uploads. Further assume that any one machine can be connected to exactly one other machine at the same time. Network congestion and propagation delay are assumed to be negligible.

a. Assume we wish to distribute the video from the server to all 100 machines using a client-server architecture. How long will it take for all machines to fully receive the video?

b. Describe a peer-to-peer approach for distributing this video to all 100 machines. How long will it take for all machines to fully receive the video?
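The two times asked for can be sketched numerically. The snippet below assumes the standard lower-bound formulas for client-server and peer-to-peer distribution time (server uploads N full copies in the client-server case; in the P2P case the time is bounded by the server's copy, the slowest download, and the aggregate upload capacity). The variable names are illustrative, and the P2P figure is a lower bound that ignores the one-connection-at-a-time restriction stated in the problem:

```python
# Sizes and rates taken from the problem statement.
F = 4 * 2**33          # video size in bits: 4 GB, with 1 GB = 2^33 bits
u_s = 1e8              # server upload rate: 100 Mbps
u_p = 1e7              # per-machine upload rate: 10 Mbps
N = 100                # number of machines
d_min = 5e8            # minimum download rate: 500 Mbps

# (a) Client-server: the server alone must upload N full copies.
t_cs = N * F / u_s

# (b) Peer-to-peer lower bound: the maximum of three constraints.
t_p2p = max(
    F / u_s,                  # server must push out at least one full copy
    F / d_min,                # every machine must download one full copy
    N * F / (u_s + N * u_p),  # N copies through total upload capacity
)

print(f"client-server: {t_cs:.1f} s (~{t_cs / 3600:.2f} h)")
print(f"P2P lower bound: {t_p2p:.1f} s (~{t_p2p / 60:.1f} min)")
```

Under these assumptions the client-server time is about 34,360 seconds (roughly 9.5 hours), while the P2P lower bound, dominated by the aggregate-upload constraint, is about 3,124 seconds (roughly 52 minutes).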
