
Question:

Suppose that when a TCP segment is sent more than once, we take SampleRTT to be the time between the original transmission and the ACK, as in Figure 5.10(a). Show that if a connection with a one-packet window loses every other packet (i.e., each packet is transmitted twice), then EstimatedRTT increases to infinity. Assume TimeOut = EstimatedRTT; both algorithms presented in the text always set TimeOut even larger. Hint: EstimatedRTT = EstimatedRTT + β × (SampleRTT − EstimatedRTT).


Step by Step Answer:
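A sketch of the argument, using only the hint above:

Let R be the true round-trip time, and let E_n be the value of EstimatedRTT in effect when the n-th packet is first sent. Because the window is one packet and every first transmission is lost, the sender times out after TimeOut = E_n and retransmits; the retransmission succeeds, and its ACK arrives R later. Measured from the original transmission, SampleRTT = E_n + R. Applying the update rule from the hint:

E_{n+1} = E_n + β × (SampleRTT − E_n) = E_n + β × ((E_n + R) − E_n) = E_n + β × R

Every packet therefore adds the fixed amount β × R to the estimate, so after n packets E_n = E_0 + n × β × R, which grows without bound. Hence EstimatedRTT increases to infinity.

As a quick numerical check, below is a minimal Python simulation of this scenario; the values of beta, true_rtt, and the initial estimate are illustrative assumptions, not values from the book.

```python
# Sketch: EstimatedRTT when SampleRTT is measured from the original
# transmission and every packet's first transmission is lost.
# beta, true_rtt, and initial_estimate are assumed for illustration.

def simulate(beta=0.125, true_rtt=1.0, initial_estimate=1.0, packets=20):
    estimated = initial_estimate
    for n in range(1, packets + 1):
        # The first copy is lost; retransmit after TimeOut = EstimatedRTT.
        # The retransmission's ACK arrives true_rtt later, so measured
        # from the original transmission: SampleRTT = TimeOut + true_rtt.
        sample = estimated + true_rtt
        # Update rule from the hint:
        # EstimatedRTT = EstimatedRTT + beta * (SampleRTT - EstimatedRTT)
        estimated = estimated + beta * (sample - estimated)
        print(f"packet {n:2d}: EstimatedRTT = {estimated:.3f}")

if __name__ == "__main__":
    simulate()
```

Since sample − estimated is always exactly true_rtt, each iteration adds β × true_rtt, and the printed values climb linearly without bound, matching the derivation above.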

Related Book For

Computer Networks: A Systems Approach, 6th Edition, by Larry L. Peterson and Bruce S. Davie. ISBN: 9780128182000.
