[e2e] trading acks...TRACKS
Detlef Bosau
detlef.bosau at web.de
Mon Dec 4 05:39:52 PST 2006
O.k., let's carry coals to Newcastle :-)
(I know, I'd better arrange a trip to the north pole for the
next six weeks after sending this post because it's so stupid.)
(BTW: Kind regards from "Rockin' Rudy", the little McDonald's reindeer
I bought some years ago.)
L.Wood at surrey.ac.uk wrote:
>
> If a packet can't enter the network until one has left, how do
> you ever get started in an empty totally quiet network? Simple
> reductio ad absurdum suggests that the packet conservation
> principle as expressed below is bogus. Not so much isarithmic,
> as isacrock.
>
I personally compare this whole thing to the energy kept in a
dynamic system. So, the conservation principle basically means nothing
else than that the energy in this system should be kept constant.
So, you have two issues here:
1. Keep the amount of energy constant => don't add energy to the system
before the system has completed some work, i.e. before energy has left the system.
2. The question is: how much energy can the system hold?
The second issue is addressed by a) probing, which yields b) an estimator
for the path's capacity, i.e. CWND.
So you don't have a "strong" isarithmic system: you can add workload
(= energy) as long as it can be kept and is not dropped by some router.
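To make the picture concrete, here is a minimal sketch of my own (names like
Sender, try_send and on_ack are made up for illustration, not taken from any
real stack): the conservation idea is essentially ACK clocking, where a new
segment may enter the network only when an ACK signals that one has left it,
and CWND bounds how much the path is estimated to "keep".

    # Sketch only: ACK-clocked packet conservation with CWND as the
    # capacity estimate. All names are illustrative placeholders.
    class Sender:
        def __init__(self, cwnd):
            self.cwnd = cwnd        # estimate of the path's capacity
            self.in_flight = 0      # workload currently kept by the network

        def try_send(self):
            # inject only while the estimated capacity is not exceeded
            while self.in_flight < self.cwnd:
                self.send_segment()
                self.in_flight += 1

        def on_ack(self):
            # one segment has left the system, so one more may enter
            self.in_flight -= 1
            self.try_send()

        def send_segment(self):
            pass  # hand the segment to the network (placeholder)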
However, it's a problem to have an exact system-theoretic model of the
Internet or even of a single TCP connection. And I don't really know
what this would be good for. Perhaps for some interesting calculus
calisthenics, which are interesting for some papers or even some PhD theses.
But at least the models I know of are far too removed from a real
packet-switching network to be really useful.
In my opinion, the most basic reasons for the Internet to work
acceptably are in fact 1. the conservation principle, which ensures
that the workload in the net is not increased in an "unreasonable" way,
and 2. that there is some reasonable probing (basically the AIMD probing);
in particular, if the path's capacity estimate turns out to be too large,
it is decreased - and everything is fine.
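Again only as a sketch of the textbook AIMD rules (not of any particular
implementation; function names and constants are my own for illustration),
the probing and the decrease of the capacity estimate look roughly like this:

    # Textbook AIMD, congestion-avoidance style (illustrative only).
    def on_ack(cwnd):
        # additive increase: probe for more capacity,
        # roughly +1 segment per round-trip time
        return cwnd + 1.0 / cwnd

    def on_loss(cwnd):
        # multiplicative decrease: the capacity estimate was too large,
        # so halve it instead of letting the workload pile up
        return max(cwnd / 2.0, 1.0)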
So, the Internet works fine with no real congestion collapse. (The most
prominent oscillating system suffering from some special case of
congestion collapse is perhaps the Tacoma bridge disaster:
http://www.ketchum.org/bridgecollapse.html)
O.k., there's not much wisdom in what I write here.
Mainly, I doubt these extremely sophisticated models.
Personally, I think mostly of the Tacoma bridge - which would still be
there if only someone had limited the energy ;-) - and of Newton's cradle
when I try to understand stability issues in the Internet. The latter is
particularly descriptive, as the number of balls visualizes the workload.
You can imagine adding a ball as long as there is room in the cradle, or
taking one away; it's fun :-)
Detlef