The fall semester starts next Monday, August 24, 2015. One of the courses I’ll be teaching is “Digital Media and The Arts”. It’s an introductory class intended to examine some of the major art forms and see how digital concepts and technology have influenced them. The basic idea is to look at how music, for example, was practiced before digital technology, what happened when digital technology started being used in music making and production, and what creative possibilities digital technology opened up that were impossible previously.
The course doesn’t get high marks on the graduating survey. I think the reason is that the students don’t see the practicality of studying art forms other than music, and they certainly don’t want to know anything about the development of counterpoint or early opera…these students are focused squarely on becoming recording engineers. Anything else is unnecessary fluff. I have to push hard against the administration to keep classes like this in the program. We are a university, not a vocational program. Learning about ancient Greek drama, the “music of the spheres”, or how Guido’s “monochord” helped establish our Western tuning system is not fluff, at least not in my classes.
In order to understand how digital technology has influenced the arts, I have to introduce what digital is and how it works. Just because this generation of students uses digital devices every hour of every day doesn’t mean they understand what goes on behind the touch screens or video monitors.
Computers are made up of a few basic components. There’s the brain or CPU (Central Processing Unit), digital storage (local or remote…fast or slow), a means of passing digital information around, and a system of input and output. That pretty much takes care of the hardware side of a computer system.
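For the curious, here’s a quick sketch in Python (not part of the class, just an illustration) that asks the operating system about those same basic components. The exact output depends entirely on the machine it runs on.

```python
import os
import platform
import shutil

# The "brain": processor type and the number of logical cores.
print("CPU:", platform.processor() or platform.machine())
print("Logical CPU cores:", os.cpu_count())

# Local digital storage, reported in bytes by the operating system.
total, used, free = shutil.disk_usage("/")
print("Storage: %.1f GB total, %.1f GB free" % (total / 1e9, free / 1e9))

# The software layer that manages input, output, and everything else.
print("Operating system:", platform.system(), platform.release())
```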
A computer doesn’t even require electricity. Building a mechanical computer is very doable. I remember a Waldrep Christmas in the early 60s when my older brother got a computer from Santa. This was at least 15 years before the first electronic personal computers were available. His calculating machine consisted of a whole lot of plastic parts that moved and shifted around when you cycled a handle on one end. If you wanted to do a calculation, you had to convert any input numbers to binary, “mount” the input 1|0 values on small plastic extrusions, crank the handle, and then convert the resulting binary output back to decimal. Obviously, the process was very painstaking and difficult. But I was fascinated. It worked.
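For anyone who wants to see what that conversion chore looks like without the plastic parts, here’s a minimal Python sketch. The function names are mine; Python’s built-in bin() and int() do the actual work.

```python
def to_binary(n):
    """Convert a non-negative decimal integer to a string of 1s and 0s."""
    return bin(n)[2:]  # strip the '0b' prefix

def to_decimal(bits):
    """Convert a string of 1s and 0s back to a decimal integer."""
    return int(bits, 2)

# Adding 5 and 3 the way the plastic machine would see it:
a, b = to_binary(5), to_binary(3)   # '101' and '11' get "mounted" as input
print(a, b)
print(to_decimal("1000"))           # the machine's binary output, 8 in decimal
```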
Of course, computing hardware can’t accomplish anything unless there is software written to make it work. And there are various levels of software: from the lowest-level microcode operating at the chip level, to machine code asking the CPU to store, shift, and add values, to assembly language, operating systems, abstraction layers, and finally applications. Users interact with the hardware at the application and operating system level. It gets very tedious if you have to write a program in assembly language or machine code (yes, I’ve had to do both as a graduate student in computer science).
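To give a feel for how much sits beneath a single line of application-level code, here’s a small Python example using the standard dis module. The bytecode it prints isn’t machine code, but it hints at the kind of lower-level instructions one innocent-looking statement turns into.

```python
import dis

# One application-level statement...
def add(x, y):
    return x + y

# ...is translated into several lower-level instructions (load a value,
# load another, add them, return the result) before the CPU ever sees it.
dis.dis(add)
```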
To be continued…