At 10:57 AM 7/24/2003, Nic Roozeboom wrote:
>>>In my understanding, one only refers to latency when it involves a time
>>>interval during which there is uncertainty.
>>
>>no, that's wrong. what you are referring to is jitter, as Chris explained
>
>No, I was referring to latency (not jitter) as offset from delay. I didn't
>throw jitter into the discussion, I was making a distinction between
>latency and delay. Granted both jitter and latency are concepts involving
>timing uncertainty, however fundamentally different. Delay is far more
>simple, and the measurements under discussion were all about delay (not
>latency, not jitter). But, it's all just a matter of convention - if you
>have no use for the conventional distinction in terminology, by all means
>disregard it. Doing so however would not garner much credibility in an
>engineering context though.

Sorry if I got you all bent out of shape there somehow - wasn't intended.

The engineering definition of latency I've always heard is as a synonym for delay. Do a google search for "latency definition" and you will see it defined that way many times: "The length of time it takes to respond to an event." That's also how I've always heard the word used by my fellow engineers. Where is it defined otherwise?

Sometimes I hear it qualified with an adjective, as in "unpredictable latency" or "maximum latency", which is really the same as saying "unpredictable delay" or "maximum delay". The adjective implies the uncertainty, not the noun.

Jitter is defined as the uncertainty of when an event occurs around the time when it is expected to occur. It is often expressed as an absolute number, although more correctly it should be expressed as a probability.

I'm not sure what the point of this is. Mark is caught in a time warp and seeing apes and we're worried about shades of meaning in the word "latency".
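To make the distinction concrete, here's a rough sketch (in Python, with made-up function names and sample numbers, just for illustration): per-event delay is response time minus stimulus time, while jitter is the spread of arrivals around an ideal periodic clock.

```python
# Hypothetical sketch: delay (latency) vs. jitter for timestamped events.
from statistics import mean, pstdev

def delays(sent, received):
    """Per-event delay (latency): time from stimulus to response."""
    return [r - s for s, r in zip(sent, received)]

def jitter(arrivals, period):
    """Spread of arrival times around an ideal periodic clock.

    Often quoted as a single absolute number; here we use the standard
    deviation of each arrival's offset from its expected instant.
    """
    start = arrivals[0]
    offsets = [t - (start + i * period) for i, t in enumerate(arrivals)]
    return pstdev(offsets)

# Made-up measurements, in milliseconds:
sent     = [0.0, 10.0, 20.0, 30.0]
received = [3.1, 13.0, 23.2, 32.9]

print(mean(delays(sent, received)))  # average delay, about 3.05 ms
print(jitter(received, 10.0))        # jitter around the 10 ms period
```

Note that every event here has roughly a 3 ms delay whether or not the jitter is zero; the two numbers answer different questions.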
kim

______________________________________________________________________
Kim Flint                   | Looper's Delight
kflint@loopers-delight.com  | http://www.loopers-delight.com