22nd – 24th May 2003 :: McGill University, Montréal, Canada
Firstly, I’d like to say a very big thank you to ANAT for facilitating my attendance and participation in NIME 03, the 3rd International Conference on New Interfaces for Musical Expression, hosted by McGill University in Montréal. The conference presented the latest scientific and technological research in the field, along with several concerts showcasing the creative and artistic applications of the technology. It comprised three full days of papers, demos, posters and reports, along with three evening performances.
I was very fortunate to have both a joint paper with Ian Stevenson accepted for presentation and a solo performance using my new interface, the eMic (Extended Mic-Stand Interface Controller), developed in collaboration with Ian Stevenson. The eMic is a modified microphone stand that enables a vocal performer to carry out real-time computer processing of the voice. The stand is fitted with an array of sensors designed to capture the gestures most commonly used by vocal performers who work with microphone stands. The eMic was very well received at the conference, with delegates responding to the potentially broad applications of the device and to its bringing together of more ‘academic’ electroacoustic music with popular genres.

The opportunity to exchange ideas with other researchers proved invaluable: Ian and I received some excellent suggestions from other delegates on technical issues relating to the future development of the eMic prototype, including wireless systems and alternative sensing technologies. Audience feedback on the performance also helped to identify the successes and shortcomings of the eMic in a performative sense. Among the strengths were: 1) the novelty of a vocal interface, since very few dedicated vocal controllers exist; 2) the inherent visual interest generated by the use of typical mic-stand performance practices; and 3) the expressive possibilities of distance-based hand sensing. Some audience members expressed a desire for more live vocal input, which provides a more direct connection between gesture and sonic outcome. Overall, participants were very excited by the possibilities presented by the interface and keen to follow further developments. (A sketch of the kind of gesture-to-parameter mapping involved appears below.)
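To illustrate the general idea of mapping stand-mounted sensor readings to real-time voice-processing parameters, here is a minimal Python sketch. The sensor names, value ranges and effect parameters are hypothetical illustrations only and are not taken from the actual eMic design.

```python
# Minimal sketch: map raw mic-stand sensor readings to voice-processing
# parameters. All sensor names, ranges and targets are hypothetical.

def scale(value, in_min, in_max, out_min, out_max):
    """Linearly rescale a raw sensor reading into a parameter range."""
    value = max(in_min, min(in_max, value))       # clamp to the sensor range
    norm = (value - in_min) / (in_max - in_min)   # normalise to 0..1
    return out_min + norm * (out_max - out_min)

def map_gestures(sensors):
    """Turn one frame of raw sensor values into effect parameters."""
    return {
        # stand tilt (degrees) -> delay feedback amount
        "delay_feedback": scale(sensors["tilt"], 0.0, 45.0, 0.0, 0.9),
        # hand-distance sensor (cm) -> reverb wet/dry mix (closer = wetter)
        "reverb_mix": scale(sensors["hand_distance"], 5.0, 60.0, 1.0, 0.0),
        # fader on the stand (0..127) -> output level
        "output_level": scale(sensors["fader"], 0, 127, 0.0, 1.0),
    }

if __name__ == "__main__":
    frame = {"tilt": 12.0, "hand_distance": 20.0, "fader": 96}
    print(map_gestures(frame))
```

In a live setting the resulting parameter values would be sent on to an audio engine each time a new sensor frame arrives; the linear scaling shown here is only the simplest possible mapping strategy.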
One of the conference highlights for me was a performance by Michel Waisvisz from STEIM (Netherlands), whom I met and spoke with. He is renowned for his alternate controller, The Hands, which was one of the first MIDI controllers to break away from the keyboard. The Hands has been under development for twenty years and thus represents a highly developed performance practice with an alternate controller. The interface consists of a number of sensors and keys mounted on two small keyboards attached to the player’s hands; the combination of sensors captures the movements of the hands, the fingers and the arms. His performance was particularly relevant to my work with voice, since a small microphone mounted on The Hands allowed him to sample his voice during the performance. He used his voice as a sound source, processing and shaping the material through his hand gestures. I found the performance inspiring and informative: it allowed me to observe effective strategies for mapping gestures to sonic outcomes.
The concert series offered a concentration of high-level work in the field and was a fantastic opportunity both to attend and to participate in. Another highlight for me was the first-night concert by the Wireless Duo, who used alternate controllers (including the Buchla Lightning and the Theremin) to perform their score for F. W. Murnau’s silent film Faust (1926). A further highlight was a workshop by Curtis Bahn and Tomie Hahn on their collaboration known as Pikapika. The workshop included a very beneficial discussion of performance and mapping strategies, along with a performance demonstration by the dancer Tomie Hahn, who adopts the cyborg persona Pikapika. She wears a wireless MIDI control interface as well as a small wireless stereo amplifier and arm-mounted speakers. This was a highly rehearsed, impressive performance, again with very effective mapping of bodily gestures to sonic outcomes, and it provided me with much inspiration for future compositions and performance strategies with the eMic.
The research presented was diverse, covering both hardware and software developments. There was an array of alternate controllers based on existing instruments as well as purpose-built controllers, DJ interfaces, motion-capture systems, musical creation using chemical reactions, improvisational devices for children, and a very interesting interactive system called Sonic City, which enables users to create music interactively by walking through a city, with factors such as noise and air pollution used as musical parameters.
The conference overall was very stimulating, and I had the opportunity to participate in discussions with some of the key figures in the new interfaces field. As usual, the informal contact over lunches, dinners and in the corridors was as valuable and significant as the formal sessions, allowing me to build links and contacts and to discuss specific aspects of the eMic project with high-profile colleagues. One such contact was Professor Joel Chadabe, an eminent figure in the field, who has invited me to collaborate with him on a work utilising the eMic.
Thanks again to ANAT for the opportunity to attend and present at this unique international event.
Tags: performance practices, sensing technologies, sonic interface, wireless systems