Audio Reactive Mask in TouchDesigner
In this project, I created an audio-reactive mask using TouchDesigner and MediaPipe, with real-time face tracking driven by my own facial expressions, captured with a standard webcam.
The facial landmark tracking comes from the MediaPipe-TouchDesigner integration by Torin Blankensmith, which uses machine learning to detect and track face geometry. I’m using my head movements and expressions to animate the mask, while audio input drives reactive noise effects over the mesh.
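To make the face-driven animation concrete, here is a minimal Python sketch of the kind of per-frame mapping I mean, assuming the MediaPipe component exposes head rotation as CHOP channels. The operator and channel names are placeholders for my own network, not the plugin's actual names:

```python
# Execute DAT sketch: drive the mask's orientation from the face-tracking
# CHOP output every frame. Names like 'face_tracking', 'geo_mask' and
# 'head_rx' are placeholders; the real names depend on the
# MediaPipe-TouchDesigner component and your own network.

def onFrameStart(frame):
    face = op('face_tracking')      # CHOP carrying head rotation channels
    mask = op('geo_mask')           # Geometry COMP holding the mask mesh
    mask.par.rx = face['head_rx'].eval()
    mask.par.ry = face['head_ry'].eval()
    mask.par.rz = face['head_rz'].eval()
    return
```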
This is a real-time experiment combining:
Facial expression tracking (via webcam)
Audio input to control visual noise (see the sketch after this list)
Procedural animation in TouchDesigner
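For the audio side, a similarly rough sketch: assuming an Analyze CHOP reduces the microphone input to a single level channel, a CHOP Execute DAT can push that level into the Noise SOP's amplitude. Operator names and scaling values here are placeholders for this particular network:

```python
# CHOP Execute DAT sketch, attached to an Analyze CHOP that reduces the
# Audio Device In CHOP to one level channel. 'noise1' and the scaling
# values are placeholders.

def onValueChange(channel, sampleIndex, val, prev):
    # val is the current audio level; remap it onto the Noise SOP amplitude
    noise = op('noise1')                 # Noise SOP displacing the mask mesh
    noise.par.amp = 0.05 + val * 0.5     # louder input, stronger displacement
    return
```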
Tools Used:
– TouchDesigner
– MediaPipe Face Mesh
– Audio CHOPs & Noise SOP
– Standard webcam
– Real-time input & reactive visuals
GitHub Link (MediaPipe-TouchDesigner): to be added once my account is verified.