Transforming DALL-E 2 Generated Images Into Sound [MIDI Events] – [TouchDesigner + Ableton Live]

Using OpenAI’s DALL-E 2 API and TouchDesigner, I’ve built a, let’s say, MIDI sequencer: it captures RGB data from AI-generated images in real time and uses it to trigger MIDI events in Ableton Live. Those MIDI signals then drive, among other things, two wavetable synths in the first two examples, and my lovely new Prophet 6 in the rest.
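For context, fetching an image from the DALL-E 2 API boils down to a single request. Here's a minimal sketch using the openai Python library (the v0.x API that shipped alongside DALL-E 2); the prompt and output filename are placeholders, not the ones from the video:

```python
import openai
import requests

openai.api_key = "sk-..."  # your OpenAI API key

def generate_image(prompt, out_path="dalle_frame.png"):
    # Request one 512x512 image from the DALL-E 2 endpoint
    response = openai.Image.create(prompt=prompt, n=1, size="512x512")
    url = response["data"][0]["url"]
    # Save it to disk so TouchDesigner can pick it up
    # (e.g., via a Movie File In TOP pointed at out_path)
    with open(out_path, "wb") as f:
        f.write(requests.get(url).content)
    return out_path

generate_image("a deep blue ocean under a violet sky")
```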

In short: the idea was to build a system that, given certain concepts (a.k.a. prompts), generates chord progressions from the RGB data of the resulting DALL-E images.

Concept [prompt] ➨ AI-generated image [DALL-E 2] ➨ capturing RGB data in real time [TouchDesigner] ➨ using that data to trigger MIDI events [Ableton Live]
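The RGB-to-MIDI step could look something like the sketch below: an Execute DAT callback in TouchDesigner samples a pixel from the image TOP and turns it into a simple triad on a MIDI Out CHOP routed to Ableton Live. The operator names ('dalle_image', 'midiout1'), the once-per-second trigger rate, and the color-to-chord mapping are all assumptions for illustration, not the exact patch in the video:

```python
# Execute DAT callback -- a sketch, not the exact network from the video.
# Assumed operators: a TOP named 'dalle_image' holding the generated image,
# and a MIDI Out CHOP named 'midiout1' routed to Ableton Live.

prev_notes = []

def rgb_to_chord(r, g, b):
    # Red picks the root (roughly C2..C6), green picks major vs. minor,
    # blue sets velocity -- an arbitrary mapping chosen for illustration.
    root = 36 + int(r * 48)
    third = root + (4 if g > 0.5 else 3)
    notes = [root, third, root + 7]
    velocity = max(1, int(b * 127))
    return notes, velocity

def onFrameEnd(frame):
    global prev_notes
    if frame % 60 != 0:  # fire roughly once per second at 60 fps
        return
    midi = op('midiout1')
    r, g, b, a = op('dalle_image').sample(u=0.5, v=0.5)  # center pixel, values in 0-1
    for n in prev_notes:  # release the previous chord before playing the next
        midi.sendNoteOff(1, n)
    notes, vel = rgb_to_chord(r, g, b)
    for n in notes:
        midi.sendNoteOn(1, n, vel)
    prev_notes = notes
```

From there, Ableton Live just sees an ordinary MIDI input, so the same chords can feed an MPE wavetable patch or a hardware synth like the Prophet 6.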

To access these project files, plus tutorials, and more experiments, you can head over to: https://linktr.ee/uisato

_
00:00 – Brief Walkthrough
01:20 – Example 1 – MPE wavetables
02:06 – Example 2 – MPE wavetables
02:35 – Example 3 – Prophet 6
02:58 – Example 4 – Prophet 6
03:34 – Example 5 – Prophet 6

#touchdesigner #ableton #modularsynth #ai #openai
