Introduction and Workflow

Hi there, and welcome to my research journal. This is the structured data generation process for my current Honours research project at the Queensland Conservatorium of Music, Griffith University, which investigates movement interaction in audiovisual performance with hardware electronic music equipment. I am conducting artistic research through my own practice as an audiovisual performer, using a Microsoft Kinect v2, TouchDesigner and a modular synthesiser to design and evaluate applications for movement interaction in performance. To summarise, my project seeks to answer the following questions:

To what extent may movement interaction be used in audiovisual performance with a modular synthesiser?
What effective methods exist for the translation of movement to parameter control?
How might movement interaction benefit the experiences of the performer and the audience in the audiovisual discipline?

To answer these questions, I am investigating techniques for movement capture, interpretation and mapping through an iterative action/reflection process, where techniques are evaluated in terms of their embodied experience and artistic benefits.

Particularly influential on the project so far have been the academics Louise Harris and Tim Murray-Browne. Louise’s book ‘Composing Audiovisually’ has given me the vocabulary and frameworks to discuss my work as being ‘audiovisual’, while providing several heuristics that encourage an egalitarian approach to sonic and visual material, which I am excited to engage with by designing both mediums from a common source of data: the human body. While not audiovisual, Murray-Browne’s work ‘Sonified Body’ is among the cutting edge of movement interaction and is the most effective movement-based sound control system I have encountered in the discourse. Tim was kind enough to speak with me about my project and has greatly informed my use of Interactive Machine Learning (IML), with especially beneficial insight into best practices for recording examples and preparing data.

This blog will be updated weekly with new ideas, experiments, evaluations and technical demonstrations of the system as I proceed with the study. So far, I have conducted technical trials to land on a reliable workflow that enables robust experimentation with specific musical applications. The workflow is as follows:

  1. Full skeleton interpretation from the Kinect in TouchDesigner: 75 values in total (25 body points across the x, y and z axes).

  2. The 75 values are processed in TouchDesigner to calculate each body point's position relative to the hips, with the hips retaining their absolute (world-space) coordinates (see the first sketch after this list).

  3. Data is sent via OSC to Wekinator, an interactive machine learning application developed by Rebecca Fiebrink.

  4. Wekinator is used to train 16 models based on input data from the Kinect. Each model is intended to control a single parameter in the modular (or visual) systems.

  5. OSC output from the 16 models in Wekinator is sent back to TouchDesigner to be converted to MIDI, with additional processing where necessary (e.g. scaling, rounding to integers); see the second sketch below.

  6. MIDI is sent to the Endorphin.es Shuttle Control module to be converted to CV.

  7. CV controls key parameters in the modular, e.g. LFO frequencies, clock divisions, generative or effect controls.

  8. Models from Wekinator are also used to control visual system parameters; however, it is intended that the visuals will have their own dedicated models in future.
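For readers who want to try this, here is roughly what steps 2 and 3 do, expressed as a standalone Python sketch using the python-osc library rather than my TouchDesigner network. The joint names and ordering here are illustrative stand-ins; what matters is a fixed 25-joint by 3-axis layout and Wekinator's default input port and address.

```python
from pythonosc import udp_client

client = udp_client.SimpleUDPClient("127.0.0.1", 6448)  # Wekinator's default input port

# A fixed joint ordering matters: Wekinator expects the same 75-float layout
# every frame. These names are illustrative stand-ins for the 25 Kinect joints.
JOINT_ORDER = ["hips", "spine", "neck", "head", "shoulder_l"]  # ...continue to 25 joints

def send_frame(joints):
    """joints: dict mapping joint name -> (x, y, z) world coordinates from the Kinect."""
    hx, hy, hz = joints["hips"]
    frame = [hx, hy, hz]                        # the hips keep their absolute coordinates
    for name in JOINT_ORDER[1:]:
        x, y, z = joints[name]
        frame.extend((x - hx, y - hy, z - hz))  # every other joint is made hip-relative
    client.send_message("/wek/inputs", frame)   # one 75-float input vector per frame
```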

A simple in-studio test of synthesis parameter control, highlighting the expressiveness of timbral and temporal control on the bass oscillator.
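The return path (steps 5 and 6) can be sketched the same way: listen on Wekinator's default output port, scale each 0-1 model output to an integer CC value, and forward it as MIDI. This version uses python-osc and mido; the MIDI port name and the one-CC-per-model numbering are assumptions for illustration.

```python
import mido
from pythonosc.dispatcher import Dispatcher
from pythonosc import osc_server

midi_out = mido.open_output('Shuttle Control')       # hypothetical MIDI port name

def forward(address, *values):
    # Wekinator sends all model outputs together in one /wek/outputs message.
    for i, v in enumerate(values):
        cc_value = max(0, min(127, round(v * 127)))  # scale a 0..1 float to a 0..127 integer
        midi_out.send(mido.Message('control_change', control=i + 1, value=cc_value))

disp = Dispatcher()
disp.map("/wek/outputs", forward)                    # Wekinator's default output address
osc_server.BlockingOSCUDPServer(("127.0.0.1", 12000), disp).serve_forever()
```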


Key discoveries that have informed this workflow include:

  1. IML solutions provide easily programmed, explorative and interesting mappings (e.g. Murray-Browne).

  2. Wekinator is a free and easy-to-use IML solution.

  3. Wekinator needs single-sample data sets for examples (use the Hold CHOP in TouchDesigner).

  4. VCV Rack was initially used for CV conversion - this is not a stable solution. 

  5. VCV Rack does not receive OSC reliably; phantom signals were present.

  6. The MOTU UltraLite mk3 interface was initially used for CV output; this is not an effective solution.

  7. The UltraLite only outputs approx. 4.8 V.

  8. The Endorphin.es Shuttle Control is a very effective solution, easily converting MIDI from TouchDesigner to CV.

  9. Wekinator example training can be conducted effectively in a number of ways. Currently I prefer letting Wekinator generate a random output configuration, interpreting it with a pose, recording an example and repeating. This technique is most effectively completed in layers of sound sources, so that meaningful positions may articulate themselves similarly across the movement spectrum. A one-shot recording helper is sketched below.
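Discoveries 3 and 9 combine into a small one-shot recording helper. A minimal sketch, assuming Wekinator's documented OSC control messages (startRecording/stopRecording/train, sent to the input port) and an arbitrary 0.05-second window; the pose vector itself would be frozen upstream by the Hold CHOP:

```python
import time
from pythonosc import udp_client

wek = udp_client.SimpleUDPClient("127.0.0.1", 6448)   # control messages share the input port

def record_one_example(frame):
    """frame: the current 75-float pose vector (e.g. frozen by a Hold CHOP)."""
    wek.send_message("/wekinator/control/startRecording", [])
    wek.send_message("/wek/inputs", frame)             # a single frame becomes one example
    time.sleep(0.05)                                   # brief window so the frame is registered
    wek.send_message("/wekinator/control/stopRecording", [])

# After recording examples against several randomised output configurations:
# wek.send_message("/wekinator/control/train", [])
# wek.send_message("/wekinator/control/startRunning", [])
```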

While the applications explored in this study will inevitably be specific to my modular synthesiser, I am actively trying to consider the parameters for what they are in principle - LFOs, envelopes, filters, granular processors, resonators etc. I hope that through this approach my findings can assist anyone setting out to integrate movement interaction into a musical system, regardless of format. So far, my notes on parameter controls include:

  • Using the 4MS Spectral Multiband Resonator (SMR) as a harmonic base is effective, especially if chordal movement fits the desired piece. The resonance has also proven to be a very valuable parameter, as shifting between field recordings (bubbling water is very effective) and chords creates an immersive-feeling soundscape. Finalising the chain through the MS20 stereo filter creates potential for topographic change as well, which is particularly effective with the field recordings. 

  • Then, running the SMR through the uClouds granular processor set to ‘Spectral Chaos’ gives interesting, abstract timbral control over the sound. Arbitrary mappings of the Size, Decay and Filter parameters create squelchy, sharp areas to move through and can create moving sounds as the grains are modulated. This is a very effective textural device, especially in contrast with a harmonic sound source. 

  • On the oscillator, which often fills a bass role, interesting timbral control has been achieved by modulating the ‘Timbre’ input on Mutable Instruments Braids. As I have generally used the FM wavetable, this has given a nice drive/richness to the tone.

  • Additionally, a very effective topographical change in the sound has been achieved by modulating the frequency of an LFO on the LP filter of the oscillator. The changes in the frequency of this LFO are a highly engaging sonic effect that adds tactility to movement within the system. 

  • Control of the size, repeats and mix of the Qu-Bit Data Bender buffer-effect module has also added feelings of tactility to the system, especially as the probability algorithm in the effect results in pitch or percussive envelopes.

With these prior developments in mind, we can begin the journey! To wrap up this first post I’ll provide insight into the beginnings of the visual design process.


Visuals - First Attempts with an Old Particle System

Today, after preliminary experimentation at KEPK studio with TOP-based silhouette processing, I introduced the visual element to the project for the first time. Rather than beginning with the body silhouette, which provides a very simplistic avenue for movement visualisation, I began by copying the IML values coming from Wekinator. Building a SOP network from scratch based on iterative geometry, hoping to control rotation and manipulation of the geometry, quickly proved fruitless. I instead repurposed an old interactive particle system and re-mapped the Wekinator models to what were previously MIDI controls (see the sketch below). The results were instantly effective. Mappings of the wind, turbulence, drag and TOP feedback system enabled intuitive and exploratory control over chaos, order and impactful structure. 
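For the record, the re-mapping itself is simple in TouchDesigner. Here is a sketch of the kind of CHOP Execute DAT involved, with illustrative names throughout: I assume the model outputs arrive as channels wek1-wek16 (renamed after an OSC In CHOP), a legacy Particle SOP called particle1, and a Level TOP inside the feedback loop.

```python
# CHOP Execute DAT: route Wekinator model outputs to the old particle system.
def onValueChange(channel, sampleIndex, val, prev):
    p = op('particle1')                    # the legacy Particle SOP
    if channel.name == 'wek1':
        p.par.windx = val * 2 - 1          # wind: remap 0..1 to -1..1
    elif channel.name == 'wek2':
        p.par.turbx = val                  # turbulence along x
    elif channel.name == 'wek3':
        p.par.drag = val * 0.2             # drag: kept subtle
    elif channel.name == 'wek4':
        op('level1').par.opacity = val     # feedback amount via a Level TOP
    return
```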

The one immediately apparent drawback is that ‘sweet spots’ in the visual interactions don’t always or inherently match up with ‘sweet spots’ in the sonic system. This can only be remedied by adding an extra layer of models from Wekinator that can be trained for semantic congruence. For the next session, I would like to build a new particle system from the ground up with the updated ParticlesGPU component and pay more attention to the mapping practice. For now, this initial experiment with the older particle system has proven very beneficial for indicating the next areas of exploration.

Until next time,

Fin

A demonstration of particle system control using models from Wekinator.
