> From: noone@jbro.us [mailto:noone@jbro.us]
> This is, in my opinion, the primary advantage hardware units have over
> software setups right now (aside from perhaps durability/stability issues
> to some extent). Personally I have never quite grasped why
> cheaper, slower hardware units measure their I/O latency in picoseconds,
> while our much more powerful computer systems have noticeable delays.

Latency in laptop systems is undeniably real, and greater than what you will experience with most dedicated hardware units (with the apparent exception of the RC-50 :-). That said, I don't know of any hardware units doing analog/digital conversion that measure their latency in picoseconds, unless we're talking about 1,000,000,000 picoseconds. Latency is normally measured in milliseconds or microseconds.

At a standard sampling rate of 44.1 kHz, each sample takes about 0.02268 milliseconds. With a typical buffer size of 16 samples, that works out to roughly a third of a millisecond of latency. Humans cannot reliably detect the difference between 1 ms and 0.3 ms of latency when it is held constant, and few can detect the difference between 1 ms and 5 ms. Ten milliseconds seems to be the magic number where most people start to sense that something is "off".

I find that most of the time, people who complain about laptop latency haven't tuned their systems properly. On a modern laptop with ASIO drivers, you can expect latency on the order of 5 milliseconds, and sometimes lower. This may bother some people, but there are obviously many thousands of soft-synth and Guitar Rig users who either don't notice it or can adapt to it.

Apologies for being OT.

Jeff
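P.S. For anyone who wants to check the buffer arithmetic above, here is a quick sketch in Python. It only models the time needed to fill one audio buffer; real-world figures will be higher because converter and driver overhead sit on top of the buffer time, and the function name is just my own shorthand:

```python
# Time to fill one audio buffer, in milliseconds.
# Note: this is only the buffer-fill component of latency;
# A/D-D/A converter delay and driver overhead add to it.
def buffer_latency_ms(sample_rate_hz: float, buffer_samples: int) -> float:
    return buffer_samples / sample_rate_hz * 1000.0

per_sample = buffer_latency_ms(44_100, 1)    # ~0.02268 ms per sample
small_buf = buffer_latency_ms(44_100, 16)    # ~0.363 ms, about 1/3 ms

print(f"{per_sample:.5f} ms per sample at 44.1 kHz")
print(f"{small_buf:.3f} ms for a 16-sample buffer")
```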