July 2007, Amsterdam, Netherlands + Sheffield, UK

In July 2007 I traveled from Australia to the Netherlands and the UK with the support of ANAT’s professional development travel fund. The purpose of the trip was twofold: (1) to undertake a residency at STEIM, Amsterdam in collaboration with Australian artists Somaya Langley and Danielle Wilde; and (2) to attend the LOSS Livecode Festival in Sheffield, UK. Both activities were successfully undertaken and have been of great value to the ongoing development of my practice, as evidenced in the following reports.
Many thanks to ANAT and the Australia Council for assisting with my travel costs, to STEIM for providing facilities and accommodation for our residency, and to Access Space for supporting my participation in the LOSS Livecode Festival.
STEIM Residency
STEIM is an independent organisation established to support performance practice in the electronic arts. Based in Amsterdam, STEIM’s facilities include two recording spaces with an adjoining performance/rehearsal space, specialised workshops staffed by expert technicians in electronics, mechanical engineering and software development, and multi-purpose artists’ workshops available to visiting artists. STEIM also maintains a guest house which provides accommodation for resident artists and visiting performers.
From July 9 to 30 this year, together with Australian artists Somaya Langley and Danielle Wilde, I undertook a residency at STEIM. The goal of the residency was to explore and experiment with new methods for controlling and performing computerised sound using whole-body gesture. Each of the collaborators came to the project with different expertise, approaches and expectations. In my own case I am interested in creating systems which support a kind of kinaesthetic-auditory synaesthesia, where human body motion is mapped into sound in such a way that sound production becomes an inherent and unavoidable consequence of moving the body.
From the outset we were clear that the goal of the three week residency was to develop, research and experiment with new techniques and to document our experiments, rather than to produce a completed performance work. This goal was set because the approach and technologies which we intended to employ were new to all of us, as was the collaboration, and we didn’t want to constrain our experiments with the requirement of producing a performance outcome in such a short space of time.
Our approach was multifaceted and reflected the interests of the collaborators. Considerations included: physicality in the space, sonic and compositional form, structure and aesthetics, conceptual semantics, and sensor technologies and their applications. These concerns were used as the basis for devising experiments, some of which were undertaken without interactive technology. For example, in the early phases of the residency we experimented with movement-only composition, and later, some of the sound mappings were prototyped by improvising movement to pre-recorded sound.
The residency focused on two sensor technologies: 3-axis accelerometers (deployed using seven Nintendo Wiimotes), and a custom wireless ultrasonic range-finding system which we developed to measure the distance between performers. Both sensor systems fed a computer running Max/MSP, which output Open Sound Control (OSC) messages; these were processed further in a custom version of AudioMulch, using the Lua scripting language to specify the mappings between sensor data and sound. This approach was quite successful, although we encountered some technical limitations (for example, we had some difficulty using six Wiimotes at once, and three stopped working over the course of the residency).
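To give a flavour of the mapping layer described above, the following is a minimal sketch in Python rather than the Lua-in-AudioMulch scripting we actually used. The resting-gravity value, scaling range and pitch rule are hypothetical illustrative choices, not the mappings developed during the residency.

```python
import math

def accel_magnitude(x, y, z):
    """Overall acceleration magnitude from a 3-axis accelerometer (in g)."""
    return math.sqrt(x * x + y * y + z * z)

def map_to_sound(x, y, z, rest=1.0, max_g=3.0):
    """Map raw accelerometer data to (amplitude, pitch_hz).

    Motion beyond the resting 1 g reading drives amplitude;
    tilt on the x axis bends pitch around 440 Hz. Both rules are
    placeholders for the kind of mapping a performer might script.
    """
    motion = min(abs(accel_magnitude(x, y, z) - rest) / (max_g - rest), 1.0)
    amplitude = motion               # 0.0 (still) .. 1.0 (vigorous motion)
    pitch_hz = 440.0 * 2 ** x        # one octave of bend per g of x-axis tilt
    return amplitude, pitch_hz

# A Wiimote lying still reads roughly (0, 0, 1) in g: gravity only.
print(map_to_sound(0.0, 0.0, 1.0))  # -> (0.0, 440.0): silent, unbent
```

In the actual system the left-hand side of such a mapping arrived as OSC messages from Max/MSP; the point of scripting the mappings (in our case in Lua) was that rules like these could be rewritten quickly between experiments.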
The residency had a duration of three weeks. The intended schedule was to complete the development of hardware and software systems in the first week in an artist’s workshop, and then to work in the large performance studio experimenting with the system for the following two weeks. In reality it was necessary to continue to work on technology development concurrently with performance and movement work throughout the duration of the residency. Overall we could have used more time, since the ultrasound system only became functional in the final week.
On the final Friday of the residency we invited peers to a work-in-progress presentation. Approximately eight people attended the session, including sound artists from the Amsterdam experimental music scene, Australian sound artists/producers, and visitors from the USA and Berlin. During the presentation we demonstrated a representative sample of the range of experiments we had undertaken. Feedback was positive, and people were generally impressed with the musicality of the experiments. Overall we found that the musicians in the audience were less interested in performances with a strong conceptual or theatrical basis if the sound itself was too simple or too predictable. It is clear that further work remains to fully absorb the feedback and assess the outcomes we produced.
I believe that the experiments undertaken during our residency at STEIM have provided me with a valuable foundation for future work in terms of raw ideas and a framework with which to proceed creatively. The process has also given me significant insight into the limitations of my present capabilities as a physical performer, which is something I am interested in addressing in the future.
Further information about this project will be made available at
www.audiomulch.com/~rossb/steim2007
LOSS Livecode Festival
The LOSS Livecode festival was organised by Access Space, a community arts computing lab in Sheffield, UK, in conjunction with the loosely confederated league of programmer/performers known as TOPLAP: the “(Temporary|Transnational|Terrestrial|Transdimensional) Organisation for the (Promotion|Proliferation|Permanence|Purity) of Live (Algorithm|Audio|Art|Artistic) Programming.” Access Space’s mission is to provide infrastructure for creative people to interface with technology – they do this using only recycled hardware and free and open source software, hence the LOSS (Linux & Open Source Software) moniker for the festival. That said, the range and scope of software used by festival participants also included commercial Apple and Microsoft offerings, open source tools such as Pure Data, SuperCollider (audiosynth.com) and Dave Griffiths’ Fluxus (www.pawfal.org/nebogeo/), and closed source software written by participants such as Andrew Sorenson’s Impromptu (impromptu.moso.com.au).
The Livecode festival grew out of a previous LOSS music CD project, which exposed Access Space to “live coding” – the performance of electronic art (sound, music and graphics) by writing and modifying programs. The Livecode festival brought together an international group of approximately thirty live coding practitioners and related artists for two days of workshops, lecture-style presentations and performances.
The festival began on Friday afternoon with a workshop on live coding techniques. Dave Griffiths and Julian Rohrhuber gave a comparative exposition of two live coding environments: Dave’s Fluxus and the popular open-source SuperCollider, to which Julian has made a significant contribution over the years. The workshop provided participants with an overview of how these environments can be used for live coding and gave some insight into reading the code and understanding how it behaves. The importance of understanding how live coding environments work should not be underestimated, since reading and following code as it is “performed” is one of the main features of live coding performances; in fact, some adherents refuse to perform unless their code is projected during the performance. In addition to providing an induction into the practicalities of live coding, Friday’s workshop also gave participants an opportunity to get their hands dirty with Fluxus as a real-time computer graphics live coding environment. I made a number of computer animations. Overall I find the experience of interacting with code in a spontaneous and performative way to be a great way to stimulate creativity and generate new ideas.
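The mechanism that makes this kind of performance possible is late binding: the running piece looks up its functions by name on every beat, so redefining a function changes the music mid-performance. The toy Python sketch below (the names and patterns are entirely hypothetical) illustrates the idea outside any particular environment; Fluxus and SuperCollider do the equivalent with real schedulers and audio.

```python
# A registry of named "instruments". Because the scheduler looks up the
# current binding on every tick, replacing an entry changes the output
# of a piece that is already "running".
instruments = {}

def schedule(name, ticks):
    """Call whatever function is currently bound to `name` on each tick."""
    return [instruments[name](t) for t in range(ticks)]

# First version of the pattern...
instruments["beat"] = lambda t: "kick" if t % 4 == 0 else "hat"
first = schedule("beat", 4)

# ...then "live code" a replacement while the piece keeps playing.
instruments["beat"] = lambda t: "snare" if t % 2 == 0 else "hat"
second = schedule("beat", 4)

print(first)   # ['kick', 'hat', 'hat', 'hat']
print(second)  # ['snare', 'hat', 'snare', 'hat']
```

Projecting the code during performance, as many live coders insist on, lets the audience watch exactly this kind of redefinition happen and connect it to the change they hear.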
Friday night’s performance was held at the pub across the road from Access Space. This was a traditional UK pub with pints of beer and a lot of locals talking about cricket – not what you might usually imagine as an ideal venue for an audio visual performance, but interestingly it worked quite well. The main reason was that it was a “headphone” performance: everyone brought their own headphones, and the organisers provided a large number of headphone distribution amplifiers along with looms of tens of headphone sockets which were distributed around the audience space. Everyone plugged in, and even some of the locals had a chance to listen to the parallel acoustic world being performed alongside the Friday night cricket chatter. Performances ranged from livecoded techno dance by Fredrik Olofsson to abstract sound processing of George W. Bush speeches by myself.
For many of us, Saturday began with a traditional English breakfast at the main festival accommodation. The topic of discussion over breakfast was the range and diversity of aesthetics and approaches to live coding. These approaches range from the creation of chaotic self-modifying systems, which are very difficult to control and must be fought with in performance, through quite “pure” programming systems, where the purpose of the performance is to sonify some abstract algorithmic behaviour, to systems designed with the goal of facilitating specific musical outcomes. For me this was probably the most interesting discussion to come out of the festival.
Saturday daytime was taken up by an extended talkfest, with many festival attendees giving 20 minute presentations related to live coding. Topics ranged from explanations of live coding systems (which are often quite personal and used by few people beyond their creators), such as Thor Magnusson’s ixiQuarks (www.ixi-audio.net/content/download/ixiquarks/) and Andrew Sorenson’s Impromptu, to discussions of more abstract concepts, such as Julian Rohrhuber’s presentation of ideas about interacting with the history of executed code during performance; Fredrik Olofsson’s experience of “practicing” live coding for an hour a day, every day for a month; and Click Nilson’s discussion of live coding for humans executing written instructions, which included a concurrent demonstration of this idea by the audience.
Saturday night’s festival gig included performances by many of the festival attendees. Highlights for me included a minimalist electronica performance by Andrew Sorenson from Brisbane, using his Impromptu system; a performance by the Slub trio which included Dave Griffiths performing with his gamepad-controlled animated robot music sequencer; and Yee-King live coding SuperCollider while simultaneously performing on electronic drums.
The LOSS Livecode festival was a great success and a valuable contribution to the evolution of the relatively young practice of live coding. Over the course of the festival I had an opportunity to make contact with most of the festival participants and organisers, many of whom I had not met before. Through attending this festival I have significantly increased my understanding of live coding theory and practice, and I feel that it has been a great way to participate in a community which has many adherents in Europe and far fewer in Australia.
Tags: electronic arts, kinaesthetic-auditory synaesthesia, sensor technologies