Question: Consider the simple model for HTTP streaming shown below. Suppose the client application buffer is infinite, the server sends at the constant rate x, and the video consumption rate is r with r < x. Also suppose playback begins immediately. Suppose that the user terminates the video early at time t = E. At the time of termination, the server stops sending bits (if it hasn't already sent all the bits in the video).
Suppose the video is infinitely long. How many bits are wasted (that is, sent but not viewed)?
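One way to check the infinite-video case: with an infinite video the server never finishes sending, so by time E it has transmitted E·x bits while playback has consumed only E·r bits, leaving E(x − r) bits wasted. A minimal sketch (the function name and the example rates are illustrative, not from the original problem):

```python
def wasted_bits_infinite(x, r, E):
    """Bits sent but never viewed when the user quits at time E
    and the video is infinitely long (server never finishes sending)."""
    sent = x * E        # server transmits at rate x for the full E seconds
    viewed = r * E      # playback consumes at rate r for E seconds
    return sent - viewed  # = E * (x - r)

# Example: x = 1 Mbps, r = 0.5 Mbps, user quits after 60 s
print(wasted_bits_infinite(1e6, 0.5e6, 60))  # 3e7 bits wasted
```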
Suppose the video is T seconds long with T > E. How many bits are wasted (that is, sent but not viewed)?
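For the finite case the key observation is that the video holds T·r bits in total, so the server finishes sending at time T·r/x, which may come before or after E. If the server is still sending at E, the waste is E(x − r) as before; if it has already finished, the waste is the unviewed tail (T − E)·r. Both cases collapse into min(E·x, T·r) − E·r. A hedged sketch of that reasoning (function name and example values are illustrative):

```python
def wasted_bits_finite(x, r, E, T):
    """Bits sent but never viewed when the video is T seconds long (T > E).
    The video holds T*r bits, so the server finishes sending at time T*r/x."""
    total_bits = T * r
    sent = min(x * E, total_bits)  # server stops at termination or when done
    viewed = r * E                 # playback consumes r bits/s until t = E
    return sent - viewed           # = min(E*x, T*r) - E*r

# Case 1: server still sending at E (E < T*r/x) -> wasted = E*(x - r)
print(wasted_bits_finite(x=2, r=1, E=10, T=30))  # 10
# Case 2: server already done (E >= T*r/x) -> wasted = (T - E)*r
print(wasted_bits_finite(x=2, r=1, E=20, T=30))  # 10
```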
