
Sensorify: Data-Driven Art Installation – Part 2

In Part 2 of Sensorify, I explain the technology behind my data-driven art installation: facial expression recognition, real-time data transmission, generative visuals, and light control. The project combines software engineering with artistic expression, creating an interactive experience driven by real-time data.

1. Facial Expression Recognition:
The front end is a React application that uses the face-api.js library and its pre-trained models to detect faces and classify their expressions. This lets the installation respond dynamically to the audience’s emotions, making the experience more engaging.
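
Here is a minimal sketch of how the detection step might look, assuming the face-api.js weight files are served from a `/models` directory and a `<video>` element already shows the webcam feed (both are assumptions for illustration, not the project's exact code):

```typescript
import * as faceapi from 'face-api.js';

// Load the pre-trained detector and expression models.
// '/models' is an assumption: point it at wherever the
// face-api.js weight files are actually served from.
export async function loadModels(): Promise<void> {
  await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
  await faceapi.nets.faceExpressionNet.loadFromUri('/models');
}

// Detect one face in the current video frame and return its dominant
// expression, e.g. { expression: 'happy', probability: 0.93 }.
export async function detectExpression(
  video: HTMLVideoElement
): Promise<{ expression: string; probability: number } | null> {
  const result = await faceapi
    .detectSingleFace(video, new faceapi.TinyFaceDetectorOptions())
    .withFaceExpressions();
  if (!result) return null; // no face in frame

  // result.expressions maps each label (neutral, happy, sad, angry,
  // fearful, disgusted, surprised) to a probability; pick the largest.
  const sorted = Object.entries(result.expressions).sort((a, b) => b[1] - a[1]);
  const [expression, probability] = sorted[0];
  return { expression, probability };
}
```

In the React component this would run on an interval (for example with `requestAnimationFrame`), pushing each result onto the WebSocket described next.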

2. WebSocket Server:
A WebSocket server relays data between the applications in real time. Because the connection stays open, each component can push updates the moment they happen, keeping the React app, the visuals, and the lights in sync for a smooth, responsive interaction.
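
The server itself can be very small. A sketch using the `ws` package for Node.js (the port number is an assumption):

```typescript
import { WebSocketServer, WebSocket } from 'ws';

// Tiny relay server: whatever one client sends (e.g. expression data
// from the React app) is re-broadcast to every other connected client
// (e.g. TouchDesigner and the light controller).
const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', (socket) => {
  socket.on('message', (data) => {
    for (const client of wss.clients) {
      // Skip the sender and any half-closed connections.
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(data.toString());
      }
    }
  });
});
```

A relay like this keeps the components decoupled: the browser app never needs to know how many consumers are listening.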

3. TouchDesigner:
TouchDesigner generates and drives the visuals and 3D models. It maps the incoming data to visual parameters, transforming the data stream into striking effects that react to the audience in real time.
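
Inside TouchDesigner, a WebSocket DAT can receive the messages and route the parsed values into the network. What matters on the sending side is a stable message shape; here is a hypothetical schema (the field names are my own, for illustration):

```typescript
// Hypothetical message shape sent from the browser. TouchDesigner's
// WebSocket DAT can parse this JSON and route `probability` to a
// visual parameter such as bloom intensity or geometry scale.
interface ExpressionMessage {
  expression: string;  // dominant label, e.g. 'happy'
  probability: number; // confidence in [0, 1]
  timestamp: number;   // ms since epoch, useful for smoothing
}

export function sendExpression(
  ws: WebSocket,
  expression: string,
  probability: number
): void {
  const msg: ExpressionMessage = { expression, probability, timestamp: Date.now() };
  ws.send(JSON.stringify(msg));
}
```

Including a timestamp makes it easy to interpolate between readings on the TouchDesigner side, so the visuals change smoothly rather than jumping with every frame of detection.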

4. Arduino Light Control:
An Arduino controls the external lights, adding a physical lighting layer to the installation. The lights respond to the same data-driven cues as the visuals, extending the experience beyond the screen.
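
One way to wire this up is a small Node.js bridge that listens to the same WebSocket feed and writes a brightness byte to the Arduino over serial; the firmware can then map that byte to a PWM pin with `analogWrite()`. A sketch using the `serialport` package (the device path, server URL, and one-byte protocol are all assumptions):

```typescript
import { SerialPort } from 'serialport';
import WebSocket from 'ws';

// Bridge between the WebSocket feed and the Arduino. The device path,
// baud rate, and one-byte brightness protocol are assumptions for
// illustration; adjust them to match the actual hardware setup.
const arduino = new SerialPort({ path: '/dev/ttyUSB0', baudRate: 9600 });
const feed = new WebSocket('ws://localhost:8080');

feed.on('message', (data) => {
  const { probability } = JSON.parse(data.toString());
  // Clamp to [0, 1] and scale to a single 0-255 brightness byte.
  const brightness = Math.round(Math.min(Math.max(probability, 0), 1) * 255);
  arduino.write(Buffer.from([brightness]));
});
```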

Sensorify demonstrates the power of combining art and software engineering. By integrating facial recognition, real-time data transmission, interactive visuals, and light control, this art installation offers a unique and immersive experience.
