Turning Vision into Music | TouchDesigner Experiment

A brief experiment in turning vision into sound.

Using TouchDesigner 2020.24520, this network transforms a standard webcam into a live MIDI controller: RGB values from the video feed trigger note events, while UV data from blob tracking modulates filter cutoff and resonance in Ableton Live 10 Lite.
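The mapping logic can be sketched in standalone Python. In the actual network this happens inside TouchDesigner CHOPs; the note range and CC numbers below are illustrative assumptions, not the exact values used:

```python
def rgb_to_note(r, g, b):
    """Map an averaged RGB sample (0.0-1.0 per channel) to a MIDI note.

    Hypothetical mapping: overall brightness selects a note within a
    two-octave range (C3-C5).
    """
    brightness = (r + g + b) / 3.0
    low, high = 48, 72  # MIDI notes C3..C5 (assumed range)
    return low + round(brightness * (high - low))


def uv_to_cc(u, v):
    """Map normalized blob-tracking UV coordinates (0.0-1.0) to
    MIDI CC values: CC 74 (cutoff) and CC 71 (resonance) are the
    conventional assignments, assumed here."""
    def clamp(x):
        return max(0, min(127, round(x * 127)))
    return {74: clamp(u), 71: clamp(v)}
```

In TouchDesigner, values like these would typically be fed to a MIDI Out CHOP pointed at the loopMIDI port, with Ableton Live listening on the other end.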

The result is an interactive visual instrument — wave your hand, move through space, and the music responds in real time.

All data runs through a loopback MIDI connection, exploring the evolving relationship between motion, colour, and sound. This experiment is an early step in a larger project aiming to convert any signal into expressive MIDI control.

⚙️ Tools used:
• TouchDesigner 2020.24520
• Ableton Live 10 Lite
• loopMIDI (virtual MIDI loopback port)
• Standard webcam input

Part of an ongoing series of experiments connecting vision, generative control, and live sound performance.

🎧 Reality Studios Inc. — Experimental Media Lab
