Room A202 of La Corte Hall at California State University, Dominguez Hills was very crowded Tuesday evening. I held the first meeting of my “Introduction to Audio Recording” course for a packed house of 45 students (in a class the administration capped at 25). After handling the normal chores of reading the roster and explaining the nature of the course, I launched into the first lecture. This is a class that meets only once a week, so I have to pack in a lot of information and keep the group somewhat entertained for almost three hours. I think last evening went pretty well. I thought I would share the substance of my first lecture online because it focused on acoustics and the relationship between the physics of sound and the world of music. We tend to focus on the post-performance side of the music pipeline, but it’s important to understand a little about the raw materials…the acoustic energy that comes from voices and instruments.
Acoustics is somewhat broader than just the science of sound. It deals with vibration and sound, including both infrasound and ultrasound. The term acoustics is derived from the Greek word “akoustikos”, which means “of or for hearing, ready to hear”, but the field extends both higher and lower than human hearing.
Musical instruments, voices, mechanical devices and other natural phenomena produce oscillations, or periodic motions, that can excite our ears, resulting in electrical impulses to our brains. I have a great computer animation on “Auditory Transduction” produced by Brandon Pietsch on the anatomy of our ears and the process of hearing. I’m not sure if it’s available online but it is very well done.
Basic acoustics deals with three primary areas. These are:
Frequency – is the rate, or number of times per second, at which a signal or event repeats. Frequencies are bounded at DC, or direct current, on the low end and extend up through the audible spectrum into infinity…and beyond! The unit of frequency, one cycle per second (cps), is named the hertz (Hz) after the German physicist Heinrich Hertz (1857–1894), who made important contributions to the study of electromagnetism in the 19th century. If the frequency of a signal is in the range of roughly 20 Hz to 20 kilohertz (kHz), humans hear it as sound. As the frequency increases beyond our hearing range it manifests itself in other ways…including radio signals and light (see illustration below):
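To make the frequency idea concrete, here is a minimal Python sketch (the function names are my own, not from any particular library) that relates a frequency to the period of one cycle and checks whether a tone falls inside the nominal 20 Hz–20 kHz hearing range:

```python
def period_seconds(frequency_hz: float) -> float:
    """Period of one cycle, in seconds, for a frequency in hertz."""
    return 1.0 / frequency_hz

def is_audible(frequency_hz: float) -> bool:
    """Nominal human hearing range: roughly 20 Hz to 20 kHz."""
    return 20.0 <= frequency_hz <= 20_000.0

# Concert A (440 Hz) completes one cycle in about 2.27 milliseconds.
print(period_seconds(440.0))   # ~0.00227 s
print(is_audible(440.0))       # True
print(is_audible(40_000.0))    # False: ultrasound, above our hearing
```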
Figure 1 – A graphic showing the frequency ranges from DC to infrared.
The speed of sound at sea level and 70 degrees Fahrenheit is about 1125 feet per second.
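The speed of sound ties frequency to physical size: wavelength is simply speed divided by frequency. A small sketch using the article's 1125 ft/s figure (the function name is mine, for illustration):

```python
SPEED_OF_SOUND_FT_PER_S = 1125.0  # sea level, ~70 °F, per the figure above

def wavelength_ft(frequency_hz: float) -> float:
    """Wavelength in feet: speed of sound divided by frequency."""
    return SPEED_OF_SOUND_FT_PER_S / frequency_hz

# Low E on a bass guitar (~41 Hz) is a wave over 27 feet long,
# while a 10 kHz tone is only about an inch and a third.
print(wavelength_ft(41.0))            # ~27.4 ft
print(wavelength_ft(10_000.0) * 12)   # ~1.35 inches
```

This is why low frequencies are so hard to control in small rooms: the waves are physically longer than the room itself.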
Amplitude – is the acoustics term used to describe the strength of a sound, which we perceive as volume or loudness, and is measured in decibels (dB), a relative unit. A dB expresses the ratio between two amplitudes. The decibel scale is not linear; it is logarithmic. When we say one sound is 3 dB louder than another sound, it means the sound’s intensity (its power) is twice what it was previously; doubling the sound pressure corresponds to a 6 dB increase.
The range of sound amplitudes that can be measured extends from 0 dB of sound pressure (the nominal threshold of hearing, which you’ll rarely approach outside an anechoic chamber) to around 130 dB, a span of intensities covering a factor of around 10,000,000,000,000 (10 trillion times)! Our ears are amazing instruments.
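The decibel arithmetic above can be sketched in a few lines of Python. These are the standard textbook formulas (10·log₁₀ for intensity ratios, 20·log₁₀ for pressure ratios); the function names are my own:

```python
import math

def db_from_intensity_ratio(i2: float, i1: float) -> float:
    """Level difference in dB for a ratio of intensities (a power-like quantity)."""
    return 10.0 * math.log10(i2 / i1)

def db_from_pressure_ratio(p2: float, p1: float) -> float:
    """Level difference in dB for a ratio of sound pressures (an amplitude-like quantity)."""
    return 20.0 * math.log10(p2 / p1)

def intensity_ratio_from_db(db: float) -> float:
    """Invert the intensity formula: how many times more intense is a +db change?"""
    return 10.0 ** (db / 10.0)

print(db_from_intensity_ratio(2.0, 1.0))  # ~3.01 dB: twice the intensity
print(db_from_pressure_ratio(2.0, 1.0))   # ~6.02 dB: twice the pressure
print(intensity_ratio_from_db(130.0))     # 1e13: the 10-trillion-fold range above
```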
Figure 2 – The relationship between common levels of sound and their dB equivalents.
I’ll get to the third item, Harmonic Spectrum, tomorrow…