Steve Zafeiriou

I built a motion tracking system for TouchDesigner

A tutorial and walkthrough of how I built a custom motion tracking controller for TouchDesigner. 👉 Written in-depth guide: https://stevezafeiriou.com/how-to-send-sensor-data-to-touchdesigner/ 👉 Setup guide & functionality of the MPU6050: https://stevezafeiriou.com/mpu6050-sensor-setup/ 🥳 Project files on GitHub (code, schematics, 3D models): https://github.com/stevezafeiriou/sensor-data-td (You can use this system for multiple use cases, from IoT to VR controllers […]

LilyGo T-Display S3 – ESP32 motion capture system for TouchDesigner and IoT projects #motionsensor

Interactive media isn’t just about visuals. It’s about creating a real-time dialogue between the digital and physical worlds. Traditional setups force you into messy wiring, but what if one sleek board could replace the need for super-expensive equipment? I’m experimenting with the ESP32-S3, which has a built-in display, dual-core processing power, […]

I’m developing a motion capture system (compatible with TouchDesigner via WebSockets)

Interactive art has evolved dramatically, with real-time sensor data now driving approximately 75% of installations. This shift has opened up new possibilities for immersive environments, interactive performances, and even experimental IoT prototypes. At the core of these experiences is the ability to seamlessly transmit sensor data from microcontrollers to creative software like […]

LilyGo T-Display S3: sensor data to TouchDesigner using WebSockets

The ESP32’s built-in wireless communication capabilities are highly suitable for interactive media projects that require fast, bidirectional data transfer. When paired with TouchDesigner, which excels at real-time visual programming, the ESP32 can stream motion and environmental data for visualization or audio generation. This project is designed with a modular architecture, enabling adaptation […]
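To illustrate the streaming pattern the excerpt describes, here is a minimal Python sketch of the receiving side: parsing one JSON WebSocket message into flat channel values, the kind of payload a TouchDesigner WebSocket DAT callback could hand to a CHOP. The field names (`ax`/`ay`/`az`, `gx`/`gy`/`gz`) are an assumption for illustration, not the project’s actual schema.

```python
import json

def parse_motion_message(raw):
    """Parse a JSON sensor payload (hypothetical schema: ax/ay/az for
    accelerometer, gx/gy/gz for gyroscope) into a flat dict of
    channel-name -> float, ready to feed into a TouchDesigner CHOP."""
    data = json.loads(raw)
    channels = {}
    for axis in ("x", "y", "z"):
        channels[f"accel_{axis}"] = float(data[f"a{axis}"])
        channels[f"gyro_{axis}"] = float(data[f"g{axis}"])
    return channels

msg = '{"ax": 0.02, "ay": -0.98, "az": 0.1, "gx": 1.5, "gy": 0.0, "gz": -0.3}'
print(parse_motion_message(msg)["accel_y"])  # -0.98
```

Keeping the parse step in one small function makes it easy to swap the payload schema without touching the rest of the network.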

How to Send Sensor Data to TouchDesigner Using ESP32: A Step-by-Step Guide

I’m preparing a full guide on how to send data wirelessly to @touchdesigner using microcontrollers and WebSockets. If you’re ready to bridge the physical and digital worlds, this guide will show you how to send sensor data to TouchDesigner using an ESP32 and an MPU6050 gyroscope. Whether you’re creating kinetic sculptures, immersive environments, […]
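One step any MPU6050 pipeline needs is converting the sensor’s signed 16-bit register values into physical units. The scale factors below are the MPU6050 datasheet defaults (±2 g full-scale → 16384 LSB/g, ±250 °/s → 131 LSB per °/s); the guide’s actual configuration may use different ranges, so treat this as a sketch:

```python
ACCEL_LSB_PER_G = 16384.0   # MPU6050 default ±2 g full-scale range
GYRO_LSB_PER_DPS = 131.0    # MPU6050 default ±250 deg/s full-scale range

def convert_raw(accel_raw, gyro_raw):
    """Convert signed 16-bit MPU6050 readings to g and degrees/second."""
    accel = [v / ACCEL_LSB_PER_G for v in accel_raw]
    gyro = [v / GYRO_LSB_PER_DPS for v in gyro_raw]
    return accel, gyro

# A device lying flat should read roughly +1 g on the z axis:
accel, gyro = convert_raw([0, 0, 16384], [131, 0, 0])
print(accel[2], gyro[0])  # 1.0 1.0
```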

Integrate Arduino with TouchDesigner – Demo

Are you seeking to integrate the physical interactivity of Arduino, or any other microcontroller, with the creative capabilities of TouchDesigner? Read my full guide, which provides the foundational steps to integrate these tools for developing dynamic and interactive projects. From Arduino sensors and actuators to TouchDesigner interface design, this resource covers the […]

How to integrate Arduino with TouchDesigner

In interactive art, combining hardware and software opens new possibilities. Arduino and TouchDesigner are powerful tools for creators seeking to explore this intersection. Arduino, a versatile microcontroller platform, allows you to collect data from physical sensors—temperature, light, sound, motion, and more. This data can be seamlessly transmitted to TouchDesigner using serial communication. […]
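On the TouchDesigner side of a serial link, each incoming line typically has to be split into numeric channels. A minimal Python sketch, assuming a hypothetical comma-separated format like `ax,ay,az\n` written by the Arduino’s `Serial.println`:

```python
def parse_serial_line(line):
    """Split one CSV-style serial line (bytes, e.g. b'0.01,-0.98,9.81\\n')
    into a list of floats. The three-value accelerometer layout here is
    an assumption for illustration, not the guide's fixed format."""
    return [float(v) for v in line.decode("ascii").strip().split(",")]

values = parse_serial_line(b"0.01,-0.98,9.81\n")
print(values)  # [0.01, -0.98, 9.81]
```

In practice this would run inside a Serial DAT callback, with the parsed floats written into a Constant CHOP or a Table DAT.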

Interactive Installation using #Touchdesigner and #reactjs

Presenting Sensorify v2.0. In this update, the interactive installation artwork uses machine vision to detect and determine human emotions through facial expressions. The installation uses this data to showcase the ability of machines to digitize human sensations and experiences. This artwork reflects a dystopian scenario where human communication takes place entirely through […]

Data-Driven Art Installation: Sensorify #artinstallation #react #touchdesigner #websockets

Sensorify: Data-Driven Art Installation – Part 2 In Part 2 of Sensorify, I explain the algorithms behind my data-driven art installation. This project combines software engineering with artistic expression, creating an interactive experience through real-time data. 1. Facial Expression Recognition: Using the React framework and the face-api.js library, pre-trained models recognize and […]
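The expression-recognition step the excerpt mentions boils down to picking the highest-confidence label from a per-frame score dictionary. A Python sketch of that reduction, using face-api.js’s standard seven expression labels (the scores shown are made up for illustration):

```python
def dominant_emotion(scores):
    """Return the expression label with the highest confidence.
    face-api.js emits one probability per label each frame; the
    labels below are its standard seven expression classes."""
    return max(scores, key=scores.get)

scores = {"neutral": 0.10, "happy": 0.75, "sad": 0.05, "angry": 0.02,
          "fearful": 0.02, "disgusted": 0.01, "surprised": 0.05}
print(dominant_emotion(scores))  # happy
```

In the installation, a value like this could then be forwarded over WebSockets to TouchDesigner to drive the visuals.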