Depth Sensor Diffusion – TouchDesigner, Kinect v2, and DayDream API for Transforming Real 3D Spaces

One of the limits of running StreamDiffusion locally is that it demands more VRAM and compute than most GPUs can spare, which makes it hard to run other TouchDesigner processes in parallel with real-time image generation. Using DayDream’s API and cloud compute finally opens up those possibilities.

My Depth_SensorDiffusion experiment involves:
– Kinect v2 for the depth point cloud and RGB camera feed
– StreamDiffusion XL output remapped over 3D instancing
– An Xbox-style controller mapped for drone-like exploration of the 3D scene
– Audio-reactive camera travel, 3D displacement, and image-prompting logic
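The audio-reactive part of the list above boils down to one recurring pattern: take a per-frame audio level, smooth it so the camera doesn’t jitter, and remap it onto travel speed and displacement scale. Here is a minimal standalone sketch of that idea; in TouchDesigner the level would come from something like an Audio Analysis CHOP, but all the names and ranges below are my assumptions, not the project’s actual network:

```python
# Hedged sketch of audio-reactive control mapping. Assumes a per-frame
# audio level normalized to [0, 1]; in TD this would be read from a CHOP
# channel each frame, here it is just a float argument.

from dataclasses import dataclass


@dataclass
class AudioReactiveState:
    """Smoothed control values derived from the incoming audio level."""
    camera_speed: float = 0.0   # forward travel speed (scene units / sec)
    displacement: float = 0.0   # scale applied to 3D depth displacement


def update(state, level, smoothing=0.9,
           base_speed=0.2, max_speed=2.0, max_displacement=0.5):
    """One frame of the control loop: clamp the level, map it to target
    values, then exponentially smooth toward those targets so loud
    transients ease in rather than snapping the camera around."""
    level = max(0.0, min(1.0, level))
    target_speed = base_speed + level * (max_speed - base_speed)
    target_disp = level * max_displacement
    # Exponential smoothing: each frame we close (1 - smoothing) of the gap.
    state.camera_speed += (1.0 - smoothing) * (target_speed - state.camera_speed)
    state.displacement += (1.0 - smoothing) * (target_disp - state.displacement)
    return state
```

Calling `update(state, level)` once per frame gives values you can wire into a camera transform and a displacement parameter; the same clamped-target-plus-smoothing shape also works for driving prompt-weight crossfades from the audio.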

I’m currently in a small competitive TD cohort at DayDream exploring what can be done with this new frontier. Right now I’m cleaning up the project so it’s easier to navigate and shareable on DayDream’s project hub. Even if you don’t have a Kinect or another TD-recognized depth sensor, I’ve included a demo mode that lets you explore a few still images in 3D.

You can learn more about this project (and engage if you’d like to see me win) here: https://app.daydream.live/creators/basement_vibes/depth-sensor-diffusion
