We worked towards exploring digital and analog material via sound and interfaces. Being at the very start, we happily explored the two parts separately without actually combining them. I am nevertheless quite happy with the outcome, and especially that research into the digital part of the process has started.
The next two paragraphs give a rough overview of what we did today. Please excuse the somewhat rough outline; I wanted to write about it while it is still fresh… (I realized that I did not write anything about the last research day with Erich, a pity since we did interesting things; that's why I tried to integrate it into this post.)
For the physical part, we decided to stick for now with the idea of hanging long sheets of vellum from the ceiling, as shown in this [intlink id=”768″ type=”post”]post[/intlink]. This time, however, we did not use sonic sweeps but only a feedback loop. We started this in a first meeting last week and continued in this direction today.
We started to think about possible directions of research and came up with this list:
- How might a variation of a printed or embossed pattern sound?
- How does the raw (physical) material used affect the sound?
- How do microphone and speaker placement influence the system's response?
- What about the length of the material sheet?
- How to visually amplify the appearing structures on the sheet? How to integrate visual projection?
- How can several of these systems be connected to integrate them into a larger installation?
The image on the right is from the earlier investigation, as are the sounds.
Touch-based audio livecoding in assembler, for the Nintendo DS.
It basically consists of a stylus-based interface, a graphical representation, a sound engine, and a core (virtual) machine. The latter has several threads, each with its own stack for temporary data storage. They share a common heap in which data and code reside side by side. Since there is no distinction between data and code in the heap, as is characteristic of von Neumann machines, it is possible to alter the code while it is running. To get a glimpse of the complete system, a look at the video of Dave performing is highly advisable. Dave also performed with it at the TAI Studio Opening.
Today, however, we tried to get the core machine running on my laptop. We managed to do this, and I can proudly present my very first program here:
// begin LFSaw
u32 threadID = m.add_thread(23);
m.poke(23, ORG);
m.poke(24, PIP);
m.poke(25, 6);
m.poke(26, JMP);
m.poke(27, 1);
// end
It creates an 8-bit low-frequency sawtooth. The idea now is to integrate it into SuperCollider as a UGen. We'll see how far we get with this.