Manual/Interactive Performance
25/05/23
Performing the modular - WITH the Kinect
Experiments following the outlines for onset detection from Tim and the aforementioned article turned out to be generally fruitless. Working mostly on deriving onset detection from the latent data coming out of Wekinator, I was unable to create an environment that reliably generated triggers at replicable places in space. While there may be more opportunities to explore in onset detection of specific body parts using raw data from the Kinect, my past experiences with those and alternative methods (such as thresholds in space) have not resulted in effective embodied experiences.
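For reference, this is a minimal sketch of the kind of threshold-with-hysteresis trigger logic I was testing on Wekinator's continuous output - assuming Wekinator's default OSC routing (/wek/outputs on port 12000) and python-osc on the receiving end, with the trigger itself simply printed:

```python
from pythonosc import dispatcher, osc_server

RISE, FALL = 0.7, 0.5   # hysteresis band: fire above RISE, re-arm below FALL
armed = True

def on_wek_output(address, *values):
    # Fire a one-shot trigger when the first Wekinator output crosses RISE,
    # then stay silent until it drops back below FALL.
    global armed
    x = values[0]
    if armed and x > RISE:
        armed = False
        print("trigger")   # stand-in for sending a gate to the modular
    elif not armed and x < FALL:
        armed = True

d = dispatcher.Dispatcher()
d.map("/wek/outputs", on_wek_output)
osc_server.BlockingOSCUDPServer(("127.0.0.1", 12000), d).serve_forever()
```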
What has produced such an experience, however, is a shift in thinking about how the Kinect might fit into performance with a modular from the outset. Until now, I have generally viewed using the Kinect and operating the modular as separate tasks - I should leave the modular alone in order to use the Kinect. This has meant compromising the controllability of the system, as inevitably there will be operations that need to be completed by hand. Are these to be left out of the performance?
Last night, while using the system in my small studio at home, the tight space forced me to provide Wekinator with examples in which my body was positioned to operate the modular. Mapping these operational body positions revealed a new value in the system: it provides an extremely convenient avenue for expressively controlling a number of parameters at once. While this has been obvious from the outset, I can see that I have been focussed on the physicality of the system rather than its immediate musical benefits, namely the ability to involve another medium, i.e. physical expression/dance. The session resulted in an interactive sonic space based around melodic plucks generated by a Resonator, an FM bass line and drum samples. The timbral and temporal qualities of the plucks are largely movement-controlled, whereas the tonality is achieved through a keyboard and arpeggiator.
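As a rough illustration of the software side of this many-at-once mapping - a hedged sketch, assuming Wekinator's default OSC output and a MIDI interface reachable via mido; the CC numbers are placeholders for whichever parameters happen to be patched:

```python
import mido
from pythonosc import dispatcher, osc_server

CCS = [20, 21, 22]             # placeholder CC numbers, one per Wekinator output
midi_out = mido.open_output()  # system default MIDI port; name yours if needed

def on_wek_outputs(address, *values):
    # Scale each continuous 0..1 Wekinator output to 0..127 and send it
    # as its own CC, so one body position moves several parameters at once.
    for cc, v in zip(CCS, values):
        v = max(0.0, min(1.0, v))
        midi_out.send(mido.Message('control_change', control=cc, value=int(v * 127)))

d = dispatcher.Dispatcher()
d.map("/wek/outputs", on_wek_outputs)
osc_server.BlockingOSCUDPServer(("127.0.0.1", 12000), d).serve_forever()
```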
Performing the modular as I normally would, controlling parameters and changing the patch, but with the added dimension of movement interaction, was a very rewarding experience. Even the subtle movements made while operating the synth created pleasantly sized sonic effects, like a light generative modulation. More dramatic effects could generally be achieved without needing to touch the synth, allowing me to alter my body position more drastically and find deeper parts of the space. Recording examples mapped to dramatic changes in reverb, delay feedback and filters worked well for these more distant reaches of the space.
As for trigger generation, a change in thinking encouraged me to explore a new approach here as well. Considering my use of the modular, I realised that I rarely provide 'triggers' on command anyway, and that the modular format is often designed precisely to escape having to direct events to happen on demand. One of the things I love about the format is how it's geared towards creating environments in which triggers occur in patterns, with continuous control input (manual or otherwise) used to change the nature of those environments. I first experimented with manipulating body data to control the frequency of LFOs in TouchDesigner (sketched below), adding a layer of complex mapping that produced LFO-generated pulses that could become denser or sparser depending on Wekinator's output. Even this didn't feel totally elegant, however, and another pursuit resulted in a configuration with different strengths: controlling parameters of Pachinko (Marbles) with CC data to generate triggers. The key parameter here was Rate, which handed a high degree of control to the body and created a considerably embodied experience.
This supports my findings from designing the bass voice, in which temporal changes achieved through movement seem to be the most effective for embodiment. While timbral changes are certainly noticeable and rewarding as well, the changes in intensity created by temporal changes are much more effective at inspiring movement reactions - for me, at least. When things speed up, you feel as though you're squeezing the sound; when they slow down, you feel like the space has expanded. Exploring these areas is effective both musically and as an engaging user experience. In the recording above, timbral control is evident in the dampening, filtering and delay feedback of the Resonator-generated plucking sound, while temporal changes are apparent in the sound's rhythm.
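The TouchDesigner side of the LFO experiment can be sketched as a CHOP Execute DAT watching the channel that carries a Wekinator output - the operator name ('lfo1') and the frequency range are assumptions from my own patch, and the Marbles configuration is the same idea with the scaled value sent out as a CC through a MIDI-to-CV interface instead:

```python
# CHOP Execute DAT attached to the channel carrying a Wekinator output.
def onValueChange(channel, sampleIndex, val, prev):
    # Map the 0..1 body value onto an LFO rate of roughly 0.25..8 Hz,
    # exponentially, so pulses get denser as the value rises.
    lo, hi = 0.25, 8.0
    x = max(0.0, min(1.0, val))
    op('lfo1').par.frequency = lo * (hi / lo) ** x
    return
```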
This session resulted in a handful of other notable discoveries as well:
Musical bass lines can be achieved generatively by patching the keyboard pitch output to the ‘Listen’ input of Marbles.
To temporally affect the Resonator independently, modulating a clock divider presents an effective solution, provided bipolar CV is possible.
To create more rhythmic variation, a clock probability device can be applied to the clock divider output. This also opens up another possibility for modulation.
All in all, this week's lightbulb moments have reinvigorated progress on the project. I'm pleased to be moving towards musicality; when experimentation is seeded by an artistic rather than a theoretical desire, it has a far higher chance of being relevant experimentation.
Until then,