Tuesday 6 September 2011

Together as one? A Response to Eric Schmidt

I read with great interest, and increasing enthusiasm, a transcript of Eric Schmidt’s MacTaggart lecture, given to the television industry’s great and good in Edinburgh a couple of weeks ago. I don’t watch a huge amount of telly but I was intrigued and excited by what he was saying about the relationship between science, engineering, humanities and the arts. He made one central point very clear: “you need to bring arts and science back together”. The good news that I have for Mr Schmidt is that there have been trends towards just this in higher (and other areas of) education; the bad news is that they’re often denigrated as impure and sometimes dismissed as confetti. Sometimes the denigrators and the dismissers are right, often they are wrong. However, it strikes me that a key question at the centre of this debate is not whether but when you should bring the two together.

I work in an Electronics Department which has been turning out successful Music Technology Engineers for over two decades and generating research output in this area for just as long. On the other side of campus from me stands a Music Department which established its first electronic music studio almost half a century ago and with whom we have collaborated in teaching and research. I gained my first degree from a Music Department which brought the Tonmeister (‘master of sound’ for those whose knowledge of German is, like mine, non-existent) concept to the UK for the first time. Music Technology has many applications – from the recording studio to the doctor’s surgery (look up ‘sonification’ if you’re wondering about the latter) and I do not see its importance diminishing. The term ‘music technology’ is a recent one, but the use of science to create, distribute and understand music is hardly new, nor is the application to different fields of the knowledge and technique acquired in this endeavour. We have witnessed the ability of music to enact (or at least powerfully symbolise) social and political change and, conversely, we have also seen how quickly our notions of what music is and how it is consumed can be affected by technological developments. In many ways these have presaged what Schmidt talks about in relation to TV.

There are some who are suspicious of multidisciplines such as music technology, concerned that any generalist who sits across many fields of study cannot be a master of any of them. But these generalists are just the people who can make connections in knowledge and technique which cross traditional boundaries of study. It fascinates me that the ongoing search for mathematically ‘sparse’ representations of audio seems to be leading us back to that centuries-old musical storage device, the score. It was a music technologist who recognised how a method to squeeze more simultaneous conversations out of mobile telephone networks could also improve the sound of concert halls and it was another who made the connection between (then rather abstract) digital filters and the physical musical instruments they could be connected together to intuitively simulate.
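The kind of connection described above between abstract digital filters and physical instruments is often illustrated with the Karplus-Strong plucked-string algorithm, a close relative of the digital waveguide techniques that grew from that insight. Here is a minimal sketch (not from the post itself; the 0.996 damping factor and other parameter values are illustrative choices): a delay line seeded with noise, where averaging adjacent samples acts as a gentle low-pass filter, so high harmonics die away faster, just as they do on a real string.

```python
import random

def karplus_strong(frequency, sample_rate=44100, duration=1.0):
    """Minimal Karplus-Strong plucked-string sketch.

    The delay-line length sets the pitch; the averaging in the
    feedback loop low-passes the signal, so the tone 'rings down'
    like a plucked string.
    """
    random.seed(0)  # fixed seed so the example is reproducible
    n = int(sample_rate / frequency)  # delay-line length ~ one period
    buf = [random.uniform(-1.0, 1.0) for _ in range(n)]  # the 'pluck'
    out = []
    for _ in range(int(sample_rate * duration)):
        out.append(buf[0])
        # feedback: average the two oldest samples, slightly damped
        avg = 0.996 * 0.5 * (buf[0] + buf[1])
        buf = buf[1:] + [avg]
    return out

tone = karplus_strong(440.0, duration=0.5)  # half a second of A4
```

Everything about the string (its pitch, its decay, its brightness) emerges from the structure of one small filter loop, which is exactly why the filters-to-instruments connection proved so fruitful.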

But what makes a music technologist (or a media engineer or an archaeometrist for that matter)? In order to combine disciplines there must first be ‘a knowledge’ of them. I learned to read, had piano lessons, sang in the choir, created havoc in school science labs, murdered Shakespeare, fiddled with fractions (which are key to understanding why we usually divide the octave into twelve notes, by the way), floundered in languages (hence the limited knowledge of German) and finally arrived at the A-levels in Music, Maths and Physics that I needed to study Music and Sound Recording. This was at a time before all but the wealthiest schools had access to recording equipment and there was no Music Technology A-level, and I’m not sure that it would have been as beneficial to me as the subjects that I did study. Even when I did put these sets of knowledge together, in subjects such as Recording Techniques and Electroacoustics, I also continued in my first year at University to study Mathematics, Harmony and Counterpoint and the piano separately. On the courses that I teach here at York we do the same: Maths, Programming and Analogue and Digital Electronics are taught as core disciplines whether to aspiring music or nano technologists, but we also begin to wrap elements of different disciplines together soon after our students first arrive. Music Technology Creation and Perception encompasses the Physics and Psychophysics of Sound, Analogue and Digital Sound Synthesis and Spectral Analysis, and the work that students undertake reflects this multidisciplinarity (if you want to investigate human responses to sound, you have to design and create sound first).
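The parenthetical point about fractions and the twelve-note octave can be made concrete with a few lines of arithmetic (my illustration, not the post's): dividing the octave's 2:1 frequency ratio into twelve equal steps lands remarkably close to the simple fractions our ears favour, which is a large part of why twelve won out.

```python
import math

# Each equal-tempered semitone multiplies frequency by 2**(1/12).
# Compare the tempered intervals with the simple 'just' fractions
# they approximate; errors are given in cents (1/100 of a semitone).
simple_ratios = {
    "perfect fifth (3/2)": (3 / 2, 7),   # 7 semitones
    "perfect fourth (4/3)": (4 / 3, 5),  # 5 semitones
    "major third (5/4)": (5 / 4, 4),     # 4 semitones
}

for name, (just, steps) in simple_ratios.items():
    tempered = 2 ** (steps / 12)
    error_cents = 1200 * math.log2(tempered / just)
    print(f"{name}: tempered {tempered:.4f}, error {error_cents:+.1f} cents")
```

The tempered fifth, for instance, misses the pure 3/2 by under two cents, far below what most listeners can detect, while dividing the octave into, say, eleven or thirteen steps produces no such happy coincidences.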

The potential students we are looking for have an understanding of music and the needs of musicians and can demonstrate a practical aspect to their interests as well as an intellectual one. However, the formal qualification that we place most emphasis on is mathematics, because it is such a common thread running through Music Technology. We prize this more than previous study of Music Technology itself, at school or college, because mathematics equips our students to seek out and understand the connections between subjects such as Music and Physics for themselves. When I see courses in Music Technology that focus heavily on learning and using a single piece of software I can’t help feeling that this is like running a Creative Writing course whose primary focus is current technology for word processing. Schoolchildren can, and do, use ‘professional’ music and audio sequencing software so I’m not sure that such a strong focus at higher level is worthwhile; training is one thing, education is another. Those who can use, but do not fully understand the workings and context of, current technology will be forever constrained by it and will not be able to make the bold leaps and reveal the hidden connections that characterise the best of multidisciplinarity.

Yes, we “need to bring arts and science back together”, but we also need to consider how and when it is best to keep them distinct so that symbioses between them can continue to flourish.

Thursday 1 September 2011

And we're off!

Today is the official start date of the Fellowship, which will be running for the next 11 months. Dave Beer and I have already been doing some preliminary research looking at how those who refer to themselves as recording engineers influence the sound of the music that we consume, and how recording as an activity/career positions and aligns itself with related disciplines such as engineering and music.

Planning is underway for the first consultation event, which will take place in York (with others to follow, including at the Royal Academy of Engineering in London). As soon as details are available they will be posted on this blog and elsewhere on the web. If you're interested in attending then please drop me an email (jez[dot]wells[at]york[dot]ac[dot]uk).