Dr. AIX's POSTS TECH TALK — 02 October 2013


There are a number of things that can affect the quality of a digitally recorded piece of music, but one of the most important of these is jitter. Simply put, jitter is an error that occurs when the timing of individual samples differs between the encoding and decoding processes. The resulting sonic degradation is sometimes severe, but most of the time it amounts to a slight loss of image depth and spatial accuracy. Let’s take a close look at jitter.

Analog-to-digital conversion involves slicing time into very short but equal-length slices. A clock determines the rate of the slicing. Sometimes the clock is built into the analog-to-digital converter, but it doesn’t have to be. In fact, professional facilities like my own don’t use the internal clocks supplied by the makers of the ADCs but instead have a highly accurate external clock that sends its tick-tock signal to ALL of the digital equipment in the studio. In my facility, we have an Isochrone 10M “Atomic Clock” made by Antelope Audio, a very expensive rubidium atomic reference generator. This device is accurate to .03 parts per billion. Clocks don’t get much better than this. Most clocks use crystals to establish their rate; the atomic clock uses rubidium. The makers of this piece of equipment claim: “A Rubidium atomic reference generator 100,000 times more accurate than the crystal oscillators, bringing vastly improved staging, transparency and imaging.”
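To get a feel for what a spec like .03 parts per billion means, here is a quick back-of-the-envelope sketch in Python. The rubidium figure comes from the spec quoted above; the crystal figure (about 50 parts per million) is a typical oscillator tolerance I am assuming for comparison, not a number from this post.

```python
# Worst-case accumulated timing error over one day for a clock of a
# given accuracy. Accuracy is expressed in parts per billion (ppb).

SECONDS_PER_DAY = 24 * 60 * 60

def drift_per_day(accuracy_ppb):
    """Worst-case accumulated timing error over one day, in seconds."""
    return SECONDS_PER_DAY * accuracy_ppb * 1e-9

rubidium = drift_per_day(0.03)       # the Isochrone 10M's quoted spec
crystal = drift_per_day(50_000)      # ~50 ppm, an assumed typical crystal spec

print(f"rubidium: {rubidium * 1e6:.3f} microseconds per day")
print(f"crystal:  {crystal:.2f} seconds per day")
```

At .03 ppb the clock drifts only a few microseconds per day, while a garden-variety crystal can drift by whole seconds.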


Figure 1 – A rubidium atomic clock.

It is the job of the master clock to establish the sample rate. Typical sample rates include 44.056, 44.1, 48, 88.2, 96, 176.4 and 192 kHz. (I included the first one because it was among the first to be used for audio. It is actually the native rate of a video deck but was adopted for audio when digital audio recorders were first introduced via the Sony PCM-F1/601 processors.) The master clock doesn’t actually run at these rates. It uses a very high rate (in the MHz range) that is then divided down to the relatively low rates that we use in audio.
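The divide-down idea can be sketched in a few lines of Python. The 22.5792 MHz and 24.576 MHz master rates below are the standard audio master-clock frequencies (512 × 44.1 kHz and 512 × 48 kHz); the specific divider values are illustrative.

```python
# A master oscillator in the MHz range is divided by an integer to
# produce each audio word-clock rate in its family.

MASTER_44K1_FAMILY = 22_579_200   # Hz: 512 * 44,100
MASTER_48K_FAMILY = 24_576_000    # Hz: 512 * 48,000

def word_clock(master_hz, divider):
    """Derive a sample rate by integer division of the master clock."""
    return master_hz / divider

for divider in (512, 256, 128):
    print(word_clock(MASTER_44K1_FAMILY, divider),
          word_clock(MASTER_48K_FAMILY, divider))
# 512 -> 44100.0 48000.0
# 256 -> 88200.0 96000.0
# 128 -> 176400.0 192000.0
```

One high-frequency reference thus yields the whole family of rates, which is why a single accurate master clock can synchronize every device in the studio.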

During a PCM digital recording, the amplitude of each sample is determined (within the discrete amplitude values allowed by the word length) and stored at each time slice. When it’s time to reconstruct the signal, each amplitude or sample is passed to a system that outputs the corresponding voltage at each of the time slices. If the time slices aren’t exactly the same as they were during the capture of the sound, the difference results in jitter errors. See Figure 2 below for the difference between the input and output signals when excessive jitter is present. The regular spacing of the source sampling rate is somewhat inconsistent in the top graphic.
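Here is a toy Python simulation of that timing mismatch: a 1 kHz sine wave sampled once at perfect instants and once at instants perturbed by random timing error. The 10 ns RMS jitter magnitude is an illustrative assumption on my part, not a figure from this article.

```python
import math
import random

random.seed(0)

FS = 96_000          # sample rate, Hz
FREQ = 1_000.0       # test tone frequency, Hz
JITTER_RMS = 10e-9   # assumed RMS timing error, seconds

def perfect(n):
    """Sample value at the exact intended instant n/FS."""
    return math.sin(2 * math.pi * FREQ * n / FS)

def jittered(n):
    """Sample value at a slightly wrong instant, n/FS plus random error."""
    t = n / FS + random.gauss(0.0, JITTER_RMS)
    return math.sin(2 * math.pi * FREQ * t)

# Compare 10 ms worth of samples and find the largest amplitude error.
worst = max(abs(perfect(n) - jittered(n)) for n in range(FS // 100))
print(f"worst-case amplitude error: {worst:.2e}")
```

The error is tiny for a low-frequency tone at nanosecond-scale jitter, but it grows with both signal frequency and jitter magnitude, since the waveform's slope determines how much a timing error shifts the sampled amplitude.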


Figure 2 – The bottom trace is the ADC process with a perfect clock source. The top trace is the DAC process with clock distortions. The middle is the comparison of the digital information…notice that the input and output do not match, resulting in a slightly different analog waveshape at the output. (Click to enlarge)

So just how crazy do you want to be in establishing accurate clocking when you play back the digital files that come from an optical disc or a download service? As accurate as you can afford. The internal clocks of reference players with high-quality DACs do a very good job. They often reclock the signals that come from the source with an internal clock of their own. The result is that jitter is a known problem that has essentially been eliminated in reasonable-quality gear.
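The reclocking idea described above can be sketched conceptually: incoming samples, which arrive with jittery timing, are buffered, then read out on the DAC's own stable local clock. The class and method names here are illustrative, not any real device's API.

```python
from collections import deque

class ReclockingBuffer:
    """Conceptual sketch: a FIFO that decouples output timing from
    input timing, as a reclocking DAC does."""

    def __init__(self):
        self._fifo = deque()

    def write(self, sample):
        """Called on the (jittery) source clock as samples arrive."""
        self._fifo.append(sample)

    def read(self):
        """Called on the DAC's clean local clock; emits samples in order."""
        return self._fifo.popleft() if self._fifo else 0.0

buf = ReclockingBuffer()
for s in (0.1, 0.2, 0.3):
    buf.write(s)
print([buf.read() for _ in range(3)])   # -> [0.1, 0.2, 0.3]
```

Because the output side never looks at when a sample arrived, only at its value and order, incoming timing errors do not reach the analog output; the output's timing accuracy depends only on the DAC's own clock.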

If I make a great sounding digital recording and you use care in playing it back, you won’t notice the “loss of transparency or imaging”.

Forward this post to a friend and help us spread the word about HD-Audio


About Author

Dr. AIX

Mark Waldrep, aka Dr. AIX, has been producing and engineering music for over 40 years. He learned electronics as a teenager from his HAM radio father while learning to play the guitar. Mark received the first doctorate in music composition from UCLA in 1986 for a "binaural" electronic music composition. Other advanced degrees include an MS in computer science, an MFA/MA in music, a BM in music and a BA in art. As an engineer and producer, Mark has worked on projects for the Rolling Stones, 311, Tool, KISS, Blink 182, Blues Traveler, Britney Spears, the San Francisco Symphony, The Dover Quartet, Willie Nelson, Paul Williams, The Allman Brothers, Bad Company and many more. Dr. Waldrep has been an innovator when it comes to multimedia and music. He created the first enhanced CDs in the 90s, the first DVD-Videos released in the U.S., the first web-connected DVD, the first DVD-Audio title, the first music Blu-ray disc and the first 3D Music Album. Additionally, he launched the first High Definition Music Download site in 2007, called iTrax.com. A frequent speaker at audio events and author of numerous articles, Dr. Waldrep is currently writing a book on the production and reproduction of high-end music called "High-End Audio: A Practical Guide to Production and Playback". The book should be completed in the fall of 2013.

(4) Readers Comments

  1. External clocks often degrade sound. I’d imagine them unnecessary for any small setup, particularly if the system is only for playback. The best bet is to use gear with solid clocks.
    http://www.soundonsound.com/sos/jun10/articles/masterclocks.htm

  2. I should rephrase that to “often have the capability to degrade sound.” People design their equipment to work with the clock inside.

  3. Simón…for consumer systems of a certain level, it’s true that the clocks have improved substantially and having a reference-quality clock will bring marginal improvements. However, in a professional facility where I have numerous digital devices (including my all-digital console) that must be clocked by the same master clock, having a great reference clock is a requirement.

    I don’t buy the “degrade” statement, however. If I clock my ADC and DAC through external means rather than by daisy-chaining the internal clock from the ADC through the rest of the chain, both will perform very well.

    • I agree that a professional facility benefits from a master clock (I work in digital audio recording myself), but for people interested in two-channel home playback (which as far as I can tell is the target audience for this blog) going from DAC –> pre –> amp, it does not seem terribly relevant. However, upon rereading I see that you did place some emphasis on your using the clock on the source end rather than the consumer needing a clock. People are of course free to experiment if they have the time and money to mess with it, but I’d think people are much better served sinking that several thousand dollars into speakers, amps and DACs.
