BLOOMING MODULATORS

Using the front-facing TrueDepth (Face ID) depth camera on an iPad and the ZIG SIM PRO app, I'm tracking detailed movements of my own face. The data is sent to TouchDesigner via OSC over IP. Using Math CHOPs with custom-defined ranges in TouchDesigner, I can translate the input data of each feature point on my face into a range that MIDI instruments and effects can recognize. The remapped data is then sent to Ableton Live via the recently released TDAbleton component. In Ableton Live, I can map each number range to anything that reads MIDI: essentially any button, knob, fader, trigger or key on any instrument or effect. Additionally, I can use Live's CV Tools devices to convert those ranges to control voltage and run a modular synth rig on the side.
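
As a rough illustration of that remapping step, here is a minimal Python sketch outside of TouchDesigner: it receives one face-tracking value over OSC (using the python-osc library) and scales it into the 0–127 MIDI range, the same From Range/To Range conversion a Math CHOP performs. The OSC address, port, and the 0.2–0.8 m input range are placeholder assumptions for the example, not the actual patch; ZIG SIM PRO's real addresses include the device ID.

```python
from pythonosc import dispatcher, osc_server

def remap(value, in_lo, in_hi, out_lo, out_hi):
    """Linear range conversion: the core of a Math CHOP's From/To Range."""
    value = min(max(value, in_lo), in_hi)      # clamp to the input range
    t = (value - in_lo) / (in_hi - in_lo)      # normalize to 0..1
    return out_lo + t * (out_hi - out_lo)

def on_face_depth(address, depth_m):
    # Hypothetical example: face distance 0.2-0.8 m -> MIDI CC 0-127.
    cc_value = round(remap(depth_m, 0.2, 0.8, 0, 127))
    print(f"{address}: {depth_m:.3f} m -> CC {cc_value}")

disp = dispatcher.Dispatcher()
disp.map("/ZIGSIM/facetracking/depth", on_face_depth)  # placeholder address

server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 9000), disp)
server.serve_forever()  # listen for incoming OSC from the iPad
```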

In the video below, I'm using an ambient mic and a wavetable synth to generate sound. I've mapped the wave and filter positions to my face's distance from the camera, the panning of the synth to my head's left/right rotation, and the level of the mic to my head's up/down tilt. There's also a bit-reduction effect in the chain that activates when I open my mouth; the reduction gets more extreme the wider my mouth opens, as in the sketch below.
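
Here's a small sketch of that mouth-driven logic, assuming a normalized jaw-open value between 0 and 1. The 0.1 on/off threshold and the CC scaling are illustrative assumptions, not the actual Ableton mapping.

```python
def mouth_to_bit_reduction(jaw_open: float) -> tuple[int, int]:
    """Return (effect_on_cc, depth_cc): an on/off toggle plus a depth amount."""
    if jaw_open < 0.1:
        return 0, 0                       # mouth closed: effect bypassed
    # Wider mouth -> more extreme reduction, scaled over the remaining range.
    t = min((jaw_open - 0.1) / 0.9, 1.0)
    return 127, round(t * 127)

for jaw in (0.0, 0.1, 0.5, 1.0):
    on, depth = mouth_to_bit_reduction(jaw)
    print(f"jawOpen={jaw:.1f} -> on/off CC={on}, depth CC={depth}")
```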


Built during the Micro Residency program at New Media Gallery.

New Westminster, BC, Canada | June 2021
