Hi all,

It seems to be a common understanding that if a TCP flow experiences 10%
or more packet loss, the flow effectively stops (i.e., it attains zero or
negligible throughput).

ns-2 simulations also seem to agree with this observation.

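For what it's worth, the standard steady-state models seem consistent
with this. The simple square-root formula (Mathis et al. 1997) ignores
timeouts, but the PFTK model (Padhye et al., SIGCOMM '98) adds a
retransmission-timeout term to the denominator that starts to dominate
somewhere around p = 0.05-0.1. Here is a quick back-of-the-envelope
sketch; the MSS, RTT, and RTO values below are placeholder assumptions,
not measurements:

import math

def mathis(mss, rtt, p):
    # Mathis et al. square-root model: ignores timeouts, so it
    # overestimates throughput badly at high loss rates.
    return (mss / rtt) * math.sqrt(1.5 / p)

def pftk(mss, rtt, p, rto=1.0, b=1):
    # Padhye et al. (PFTK) approximation: the second denominator term
    # models retransmission timeouts and dominates once p gets large.
    fr = rtt * math.sqrt(2 * b * p / 3)
    to = rto * min(1.0, 3 * math.sqrt(3 * b * p / 8)) * p * (1 + 32 * p * p)
    return mss / (fr + to)

# Placeholder assumptions: MSS = 1460 bytes, RTT = 100 ms, RTO = 1 s.
MSS, RTT = 1460.0, 0.100
for p in (0.01, 0.05, 0.10, 0.20, 0.30):
    print("p=%.2f  mathis=%8.1f kbit/s  pftk=%8.1f kbit/s"
          % (p, mathis(MSS, RTT, p) * 8 / 1000, pftk(MSS, RTT, p) * 8 / 1000))

With these numbers the PFTK estimate falls by roughly two orders of
magnitude between 1% and 30% loss, with the flow reduced to sending on
the order of one segment per RTO, which matches the "meaningless
throughput" intuition.
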
My question is: what is the theoretical or analytical explanation for
this observation?

Thanks in advance.
--roy