In-situ live coding

— code as instrument, environment as performer

Till Bovermann

One of the most important characteristics of live coding is indirect sound generation: instead of generating sound directly by manipulating the physical elements of an instrument, to live code means implementing and adapting parts of a programme, which then in turn generates sound.

This is fundamentally different from traditional instruments, where the physical properties (and their manipulation) directly determine which sound is produced and when. One could say that live coding is closer to building an instrument than to playing one.
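
To give a concrete sense of this, here is a minimal SuperCollider sketch (illustrative only, not a transcript of one of my sessions, and assuming the server is already booted): a pattern is started, and its description is then replaced while it keeps playing. The coder edits the programme; the programme makes the sound.

    // start a simple pattern using SuperCollider's built-in default instrument
    Pdef(\piece, Pbind(\degree, Pseq([0, 2, 4, 7], inf), \dur, 0.25)).play;

    // later, while the pattern is still playing, replace its description;
    // the running sound changes without being stopped and restarted
    Pdef(\piece, Pbind(\degree, Pxrand([0, 2, 3, 7, 9], inf), \dur, Prand([0.125, 0.25], inf)));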

This feature makes it possible to decide who plays the instrument: The live coder can either play it themselves, automate it, or leave the playing to someone else. That other player can be as specifically unspecified as “the environment around me”. All that is needed for this is a sensor that detects one or more environmental properties and feeds them into the live coding system.
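
A minimal sketch of such a sensor mapping in SuperCollider, assuming the sensor's microcontroller sends its readings as OSC messages to an address such as /sensor/light with a single value between 0 and 1 (both the address and the range are illustrative, not a documentation of my actual setup):

    // keep the latest sensor reading in a variable
    ~light = 0.5;

    // update it whenever the microcontroller sends a new value
    OSCdef(\lightIn, { |msg| ~light = msg[1].clip(0, 1) }, '/sensor/light');

    // a pattern that the environment "plays": the sensor value
    // chooses both the register and the density of the events
    Pdef(\envPlayer,
        Pbind(
            \degree, Pfunc { (~light * 14).round } + Pwhite(-2, 2),
            \dur, Pfunc { ~light.linexp(0, 1, 1.5, 0.15) },
            \amp, 0.1
        )
    ).play;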

My setup for live coding as of 2023.

What makes this interesting in terms of artistic fieldwork for a sound artist who is interested in their environment and its contexts? For one thing, the explicit distinction between making and playing an instrument makes it possible to assign roles to different entities: while I can be the one who makes the instrument, I can have it played by my environment. I thereby give the environment a voice with which it can express itself. For another, the sometimes arduous process of live coding is an opportunity to slow down and take time to listen and observe.

My live coding sessions on location often last for hours. I engage with the environment in an intense way, something I don’t do when hiking or walking.

Learnings

  • Distinguishing between making and playing: Live coding makes it possible to distinguish between modifying an instrument and playing it, even when both happen at the same time.
  • Giving the environment a voice: The environment can be an actor in a live coding setup if it is equipped with sensors that introduce environmental characteristics into the system.
  • Deceleration: Live coding can help you slow down and take time to listen and observe.

System setup

As I often work in remote locations without access to mains power, it was important to have a setup that is portable and runs on batteries.

This setup may be summarised in three parts:

  1. For the actual live coding, I had a laptop running SuperCollider, a battery-powered speaker, a multi-touch controller and a MIDI controller (see the mapping sketch after this list).
  2. For the environmental sensing, I used a portable device that measures environmental properties such as temperature, humidity, light or sound. The sensor elements were connected to a microcontroller that sent the data to my laptop wirelessly.
  3. I also brought a portable audio recorder with multiple microphones to record the environment and my live coding session.
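
As a sketch of how the MIDI controller mentioned above might be tied into such a session (the CC number and the mapped parameter are placeholders, not the actual patch):

    // connect to all available MIDI sources
    MIDIClient.init;
    MIDIIn.connectAll;

    // a sustained tone using the built-in default instrument
    ~drone = Synth(\default, [\freq, 220, \amp, 0.1]);

    // map one knob (here CC 1) onto the drone's pitch
    MIDIdef.cc(\pitchKnob, { |val, num, chan|
        ~drone.set(\freq, val.linexp(0, 127, 110, 880));
    }, ccNum: 1);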

Spatial setup

An example setup for live coding at Moss and water. Note the relative closeness of speaker and microphones.

The spatial arrangement was crucial for the live coding sessions. I often placed the setup close to a sonically interesting place, such as near a body of water or inside a forest. The speaker was placed so that the sound would interact with the environment, for example by pointing it at a crevice or such that it would be reflected off the surface of the water. I was playing in relatively quiet environments, which made it important to place the speaker and microphones in close proximity to each other and play at a volume that was audible but blended in with the environment. This ensured that the microphones picked up both the broadband signal of the live coding sounds and the quiet ambient sounds.