Question: Suppose that when a TCP segment is sent more than once, we take SampleRTT to be the time between the original transmission and the ACK, as in Figure 5.10(a). Show that if a connection with a one-packet window loses every other packet (i.e., each packet is transmitted twice), then EstimatedRTT increases to infinity. Assume TimeOut = EstimatedRTT; both algorithms presented in the text always set TimeOut even larger. Hint: EstimatedRTT = EstimatedRTT + β × (SampleRTT − EstimatedRTT).
Step by Step Solution
The question concerns TCP (Transmission Control Protocol) round-trip time (RTT) estimation, specifically how the EstimatedRTT running average behaves when SampleRTT is measured from the original transmission of a segment that had to be retransmitted.
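The core of the argument: with TimeOut = EstimatedRTT, a packet whose first transmission is lost is retransmitted when the timer fires, and its ACK arrives roughly one true round-trip time R after that. Measured from the original transmission, SampleRTT = EstimatedRTT + R, so the hinted update becomes EstimatedRTT ← EstimatedRTT + β × R. Since every packet needs one retransmission, EstimatedRTT grows by β × R on every packet and therefore increases without bound.

The following is a minimal Python sketch (not part of the exercise or its official answer) that applies the hinted EWMA update under the assumptions above: a constant underlying round-trip time and every packet's first transmission lost. The parameter values (β = 0.125, R = 1.0) and function name are illustrative only.

```python
def simulate(estimated_rtt=1.0, true_rtt=1.0, beta=0.125, packets=10):
    """Track EstimatedRTT when every packet needs exactly one retransmission
    and SampleRTT is timed from the original transmission (illustrative sketch)."""
    history = [estimated_rtt]
    for _ in range(packets):
        # Timer fires at TimeOut = EstimatedRTT; the retransmission's ACK
        # arrives true_rtt later, but we time from the original send.
        sample_rtt = estimated_rtt + true_rtt
        # EWMA update from the hint.
        estimated_rtt = estimated_rtt + beta * (sample_rtt - estimated_rtt)
        history.append(estimated_rtt)
    return history

if __name__ == "__main__":
    for i, e in enumerate(simulate()):
        print(f"packet {i}: EstimatedRTT = {e:.3f}")
```

With these illustrative values the printed EstimatedRTT rises by β × R = 0.125 on every packet (1.000, 1.125, 1.250, ...), i.e., it grows linearly and without bound, which is exactly the divergence the exercise asks you to demonstrate.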
