I’m developing a motion capture system (compatible with TouchDesigner via WebSockets)

Interactive art has evolved dramatically, with real-time sensor data now driving a growing share of installations.

This shift has opened up new possibilities for immersive environments, interactive performances, and even experimental IoT prototypes.

At the core of these experiences is the ability to seamlessly transmit sensor data from microcontrollers to creative software like TouchDesigner.
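To make that transmission concrete, here is a minimal sketch of what a WebSocket payload could look like. The JSON schema (field names like `accel`, `gyro`, `haptic`) is purely illustrative and is an assumption, not the format used in the actual repo; the ESP32 would send strings like these, and TouchDesigner would parse them on receipt.

```python
import json

# Hypothetical payload schema -- these field names are illustrative,
# not taken from the project's repository.
def encode_imu_sample(ax, ay, az, gx, gy, gz, haptic=0):
    """Pack one MPU6050 reading into a compact JSON string
    suitable for sending over a WebSocket to TouchDesigner."""
    return json.dumps({
        "accel": [ax, ay, az],   # acceleration in g
        "gyro": [gx, gy, gz],    # angular rate in deg/s
        "haptic": haptic,        # vibration-motor intensity (0-255 feedback channel)
    })

def decode_imu_sample(message):
    """Parse an incoming JSON message back into a dict, e.g. inside a
    script attached to TouchDesigner's WebSocket DAT callbacks."""
    return json.loads(message)

msg = encode_imu_sample(0.01, -0.02, 0.98, 1.5, -0.3, 0.0)
sample = decode_imu_sample(msg)
```

Keeping the wire format this simple means the same stream can feed TouchDesigner, a browser, or any other WebSocket client without changes on the microcontroller side.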

In this project I’m building a fully open-source motion capture system using an ESP32, an MPU6050 IMU sensor, and a vibration motor (for haptic feedback). The system is modular, so you can adapt it to fit your use case. Soon I’ll publish a full tutorial on YouTube and a complete guide on my blog; until then, you can find all the code, project files, and more on my GitHub. Keep in mind I’m still pushing updates to the repo.
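One practical detail when working with the MPU6050: the sensor reports raw 16-bit integers, which have to be scaled into physical units before they are useful in a patch. At its default full-scale ranges the datasheet gives 16384 LSB per g (±2 g) and 131 LSB per °/s (±250 °/s); the sketch below applies those factors (the function names are mine, not from the repo).

```python
# MPU6050 datasheet sensitivities at the default full-scale ranges:
# +/-2 g  -> 16384 LSB per g; +/-250 deg/s -> 131 LSB per deg/s.
ACCEL_LSB_PER_G = 16384.0
GYRO_LSB_PER_DPS = 131.0

def raw_to_g(raw):
    """Convert a raw 16-bit accelerometer reading to acceleration in g."""
    return raw / ACCEL_LSB_PER_G

def raw_to_dps(raw):
    """Convert a raw 16-bit gyroscope reading to degrees per second."""
    return raw / GYRO_LSB_PER_DPS

# A sensor at rest reads roughly 16384 on its vertical axis: exactly 1 g.
print(raw_to_g(16384))   # -> 1.0
print(raw_to_dps(131))   # -> 1.0
```

If you switch the sensor to a wider range (e.g. ±8 g for fast dance movement), these divisors change accordingly, which is exactly the kind of per-use-case tuning the modular design is meant to allow.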

✅ Why does this matter?
By bridging physical inputs and digital outputs, creators can design installations that feel alive—whether it’s a projection that shifts based on a viewer’s movement or an interactive sculpture that responds to sound.

✅ Modular Design = Limitless Applications
A well-structured modular system makes it easier to adapt the technology to different contexts.

Whether it’s a live dance performance, a museum installation, or a Blender animation rig, the same foundational setup can be repurposed with minimal friction.

The real challenge isn’t just implementing the technology—it’s thinking about how interactivity enhances storytelling.

The best interactive experiences don’t just respond to movement; they create meaningful connections between the audience and the artwork.

How are you integrating interactivity into your creative projects?
