I've read about Linux's token bucket filter (tbf) many times, and I still don't fully understand how I should calculate the burst and latency parameters, shame on me :(
I suppose that a reasonable latency is around 50 ms. OK, but what value should burst take?
The tc-tbf(8) manpage says, in the entry for limit or latency:
The latter calculation takes into account the size of the bucket, the rate and possibly the peakrate (if set). These two parameters are mutually exclusive.
So, how is latency related to the bucket size and the rate? Is there a formula to calculate it? Or is it simply a matter of, "OK, X bytes of burst and Y seconds of latency is good for me"?
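To show where I'm stuck, here is my current understanding as a back-of-the-envelope sketch. The numbers, the HZ value, and the limit formula are all guesses on my part, pieced together from the manpage and from reading around; this is exactly what I'd like confirmed or corrected:

    # Rough sketch of how I *think* the parameters relate, e.g. for a
    # command like:
    #   tc qdisc add dev eth0 root tbf rate 1mbit burst 10kb latency 50ms
    RATE = 1_000_000 // 8   # 1 mbit/s expressed in bytes per second
    LATENCY = 0.050         # the 50 ms I supposed above
    HZ = 1000               # kernel timer frequency -- an assumption on my part

    # The manpage hints that burst must be at least rate/HZ, otherwise
    # the shaper can't hand out tokens fast enough to sustain the rate.
    min_burst = RATE / HZ
    print(f"minimum burst: {min_burst:.0f} bytes")

    # My guess: the queue limit implied by latency is roughly
    #   limit = rate * latency + burst
    # Is that the actual calculation the manpage alludes to?
    burst = 10_000          # a made-up value
    limit = RATE * LATENCY + burst
    print(f"implied limit: {limit:.0f} bytes")

If that guessed formula is right, then with rate 1 mbit/s, latency 50 ms and a 10 KB burst, the queue would hold roughly 16 KB. I just want to know whether that reasoning is sound, or whether there's a proper formula I'm missing.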