A couple of months ago, I was sitting with my youngest son in Mexico City and we started talking about digital audio. Michael received his graduate degree from MIT (I just happen to be wearing my MIT T-shirt today) and was finishing his year as a Fulbright Scholar there. About a month ago, he moved to Zurich, Switzerland to start his first full-time job (he’s a thoroughly international guy…and will now have to master another language). Our conversation steered toward the Nyquist/Shannon sampling theorem as we discussed how and why digital audio works.
As often happens, he pulled out his smartphone and looked up Claude Shannon on Wikipedia. I admit that I hadn’t investigated this amazing individual previously. But after hearing a few facts from the article, I was astounded at the importance of his contributions to information theory…and at where he grew up. Claude Shannon changed the course of the 20th century with his master’s thesis and his subsequent papers on information theory.
I’m from Michigan. I was born in Detroit and spent my youth in the nearby suburbs. During high school, my friends and I would head north to ski country almost every weekend. The route took us up I-75 to the exit at Gaylord, Michigan…the town where Claude Shannon lived with his father and mother. He was born about 20 miles north of there in a beautiful little town called Petoskey, another connection to my skiing past (the drive to Boyne Highlands resort passed right through Petoskey). I was very pleasantly surprised.
Shannon is known as “the father of information theory” for the landmark paper he published in 1948. Earlier, as a 21-year-old graduate student at the Massachusetts Institute of Technology (MIT), he wrote a thesis demonstrating that electrical circuits of relays and switches could implement Boolean algebra, and therefore any logical or numerical relationship. That remarkable insight is the foundation of both the digital computer and digital circuit design theory.
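Shannon’s insight can be illustrated in a few lines of code. This is only a sketch (with plain Python booleans standing in for relays, and hypothetical function names of my own choosing), but it shows the chain he identified: switch networks realize Boolean operations, Boolean operations build logic gates, and gates build arithmetic.

```python
# Relays/switches realize Boolean algebra: series contacts act as AND,
# parallel contacts as OR, a normally-closed contact as NOT.
def AND(a, b):   # two switches in series
    return a and b

def OR(a, b):    # two switches in parallel
    return a or b

def NOT(a):      # a normally-closed relay contact
    return not a

def XOR(a, b):   # exclusive-or composed from the primitives above
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

def full_adder(a, b, carry_in):
    """One-bit binary adder: returns (sum_bit, carry_out)."""
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

def add_bits(x_bits, y_bits):
    """Ripple-carry addition of two equal-length bit lists, least
    significant bit first -- a numerical relationship built entirely
    from Boolean switching logic."""
    result, carry = [], False
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)
    return result
```

For example, adding 3 (`[True, True]`, least significant bit first) and 1 (`[True, False]`) ripples a carry through both stages and yields 4 (`[False, False, True]`) using nothing but the switch-level operations.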
The digital world in all of its incarnations owes a tremendous debt of gratitude to Claude Shannon. In reality, much of the high-tech gadgetry that we enjoy today stems from his work. A paper extracted from his thesis, “A Symbolic Analysis of Relay and Switching Circuits,” earned Shannon the Alfred Noble Prize of the American Institute of Electrical Engineers in 1939. Howard Gardner called Shannon’s thesis “possibly the most important, and also the most famous, master’s thesis of the century.”
Not bad for a young man from the remote North Country of Michigan.
The Nyquist/Shannon sampling theorem sits at the center of sampling theory and is a frequent topic in the great analog vs. digital debate in high-end audio conversations. Here’s the opening of the Wikipedia article on the theorem: “In the field of digital signal processing, the sampling theorem is a fundamental bridge between continuous-time signals (often called ‘analog signals’) and discrete-time signals (often called ‘digital signals’). It establishes a sufficient condition for a sample rate that permits a discrete sequence of samples to capture all the information from a continuous-time signal of finite bandwidth.
“The theorem also leads to a formula for perfectly reconstructing the original continuous-time function from the samples.”
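That reconstruction formula is Whittaker–Shannon (sinc) interpolation, and it can be demonstrated numerically. The sketch below is a toy illustration under stated assumptions, not a proof: a 100 Hz sine sampled at 1,000 samples per second (well above the Nyquist rate of 200 Hz) is reconstructed between sample instants, with accuracy limited only by the finite number of samples used.

```python
# A numerical sketch of the sampling theorem: a bandlimited signal
# sampled above twice its highest frequency can be reconstructed
# between the samples via sinc (Whittaker-Shannon) interpolation.
import math

def reconstruct(samples, fs, t):
    """Whittaker-Shannon interpolation of uniform samples taken at
    rate fs, evaluated at an arbitrary time t (seconds)."""
    total = 0.0
    for n, x_n in enumerate(samples):
        arg = fs * t - n
        # sinc kernel: 1 at its own sample instant, 0 at every other one
        total += x_n * (1.0 if arg == 0 else
                        math.sin(math.pi * arg) / (math.pi * arg))
    return total

f = 100.0    # signal frequency, Hz (well below fs/2)
fs = 1000.0  # sampling rate, Hz -- five times the Nyquist rate
N = 2000     # two seconds of samples; a long window shrinks edge error

samples = [math.sin(2 * math.pi * f * n / fs) for n in range(N)]

# Evaluate mid-window at a time that falls BETWEEN sample instants
t = 1.0003
true_value = math.sin(2 * math.pi * f * t)
approx = reconstruct(samples, fs, t)
error = abs(approx - true_value)   # small; due only to the finite window
```

At a sample instant the interpolation is exact by construction (every sinc term but one vanishes); between instants the small residual error comes from truncating the infinite sum to a finite window, not from the sampling itself.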