[e2e] Reacting to corruption based loss
Craig Partridge
craig at aland.bbn.com
Sat Jun 25 17:13:18 PDT 2005
In message <42BDDD74.BF9FDB92 at web.de>, Detlef Bosau writes:
>Basically, we're talking about the old loss differentiation debate. If
>packet loss is due to corruption, e.g. on a wireless link, there is not
>necessarily a need for the sender to decrease its rate. Perhaps one
>could do some FEC or use robust codecs, depending on the application in
>use. But I do not see a reason for a sender to decrease its rate
>in that case.
I believe that's the general wisdom, though I'm not sure anyone has
studied whether burst losses might cause synchronized retransmissions.
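To make the idea concrete, here's a minimal sketch (my own, nothing
standardized) of a loss-differentiating sender. The "cause" label is the
big assumption -- in practice the sender rarely knows why a segment was
lost, which is precisely why this debate persists:

    def react_to_loss(cwnd, ssthresh, cause, mss=1):
        """Return (cwnd, ssthresh) after a loss whose cause is known."""
        if cause == "congestion":
            # Standard multiplicative decrease, as in TCP fast recovery.
            ssthresh = max(cwnd // 2, 2 * mss)
            cwnd = ssthresh
        elif cause == "corruption":
            # Corruption says nothing about queue state, so the rate is
            # left alone; the lost data is retransmitted or masked by FEC.
            pass
        return cwnd, ssthresh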
>In my opinion, and I'm willing to receive contradiction on this point,
>it is a matter of the Internet system model. Why couldn't we continue
>to assume loss-free links? Is this really a violation of the End to End
>Principle when we introduce link layer recovery? Or is it simply a well
>done separation of concerns to fix link issues at the link layer and to
>leave transport issues to the transport layer?
Take a peek at Reiner Ludwig's (RWTH Aachen) dissertation, which says that,
in the extreme case, link layer recovery and end-to-end recovery don't mix --
and we know from the E2E principle that some E2E recovery must be
present. (I believe there's also work from Uppsala showing that you
need at least some link layer recovery or TCP performance is awful --
what this suggests is that we're searching for a balance point.)
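To illustrate why a balance point should exist, here's a back-of-the-envelope
sketch (mine, not from either thesis) using the Mathis et al. (1997)
steady-state approximation, throughput ~ MSS / (RTT * sqrt(2p/3)). Each
link layer retry shrinks the residual loss rate p that TCP sees, but adds
delay to the RTT; the numbers below are purely illustrative:

    import math

    def residual_loss(p_link, retries):
        """Loss rate that survives `retries` extra link-layer attempts."""
        return p_link ** (retries + 1)

    def expected_rtt(base_rtt, p_link, retries, per_retry_delay):
        """Crude delay model: each attempt beyond the first adds a fixed
        delay, weighted by how often it actually happens."""
        expected_tx = (1 - p_link ** (retries + 1)) / (1 - p_link)
        return base_rtt + (expected_tx - 1) * per_retry_delay

    def mathis_throughput(mss, rtt, p):
        """Mathis et al. steady-state TCP approximation, bytes/sec.
        Only meaningful while p is non-negligible."""
        return mss / (rtt * math.sqrt(2.0 * p / 3.0))

    # A 10% lossy link, 100 ms base RTT, 20 ms per link-layer retry,
    # 1460-byte segments.
    for retries in range(5):
        p = residual_loss(0.1, retries)
        rtt = expected_rtt(0.1, 0.1, retries, 0.02)
        print(retries, round(p, 6), round(mathis_throughput(1460, rtt, p)))

The first retry or two recover most of the loss; beyond that, added
link-layer persistence mostly just inflates the RTT (and, per Ludwig's
argument, starts competing with TCP's own timers).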
Craig