This project showcases an interactive 2D particle system built in TouchDesigner.
When the user clicks on the canvas, particles spray outward — their movement driven by directional noise.
The rendered output is then processed through StreamDiffusionTD, transforming the particle motion into rich, evolving AI-generated visuals.
The interaction layer can be switched from mouse control to alternative input methods such as MediaPipe, Leap Motion, or Kinect, enabling real-time interactive installations.
StreamDiffusionTD excels in this setup, as the abstract nature of the input lets the AI adapt fluidly to multiple visual prompts and interpretations. The result bridges human interaction and AI imagination, where every click becomes a burst of motion and meaning.
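The click-to-burst behavior above can be sketched outside TouchDesigner as well. The following is a minimal NumPy sketch, not the project's actual network: in TouchDesigner the directional noise would come from a Noise TOP/CHOP, so a cheap sinusoidal pseudo-noise field stands in for it here, and the function names (`spawn_burst`, `step`) are illustrative, not from the project file.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def spawn_burst(origin, n=200):
    """Spawn n particles at the click origin with random outward velocities."""
    angles = rng.uniform(0.0, 2.0 * np.pi, n)
    speeds = rng.uniform(0.5, 2.0, n)
    pos = np.tile(np.asarray(origin, dtype=float), (n, 1))
    vel = np.stack([np.cos(angles), np.sin(angles)], axis=1) * speeds[:, None]
    return pos, vel

def step(pos, vel, t, dt=0.016, noise_scale=0.3):
    """Advance one frame, perturbing velocities with a directional noise field.

    A sinusoidal field varying with position and time stands in for
    TouchDesigner's noise operators."""
    nx = np.sin(pos[:, 0] * 1.7 + t) * np.cos(pos[:, 1] * 1.3 - t)
    ny = np.cos(pos[:, 0] * 1.1 - t) * np.sin(pos[:, 1] * 1.9 + t)
    vel = vel + noise_scale * np.stack([nx, ny], axis=1) * dt
    return pos + vel * dt, vel

# One simulated click at the canvas center, advanced for 60 frames.
pos, vel = spawn_burst((0.5, 0.5))
t = 0.0
for frame in range(60):
    pos, vel = step(pos, vel, t)
    t += 0.016
```

The resulting particle positions would then be rendered and fed to StreamDiffusionTD as the diffusion input.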

Turn simple contour lines into glossy obsidian and oozing magma! Press 1 to drive the noise movement. Hold 2 to move upwards. Hold 3 to lower contour line frequency.

This project file is a simple reaction-diffusion network that gets transformed by SDXL into graffiti on a wall. Bold colors and cool shapes! Press 1 to reset the reaction diffusion.
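For reference, the kind of pattern a reaction-diffusion network produces can be sketched with the classic Gray-Scott model. This is a generic NumPy illustration under assumed parameters (`f=0.035`, `k=0.065`), not the TouchDesigner network from the project file; the seeded square plays the role of the "press 1 to reset" state.

```python
import numpy as np

def laplacian(Z):
    # 5-point stencil with wrap-around (toroidal) edges
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0)
          + np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4.0 * Z)

def gray_scott_step(U, V, Du=0.16, Dv=0.08, f=0.035, k=0.065, dt=1.0):
    """One explicit-Euler update of the Gray-Scott reaction-diffusion system."""
    UVV = U * V * V
    U = U + dt * (Du * laplacian(U) - UVV + f * (1.0 - U))
    V = V + dt * (Dv * laplacian(V) + UVV - (f + k) * V)
    return U, V

N = 64
U = np.ones((N, N))
V = np.zeros((N, N))
# Reset state: seed a small perturbed square that the pattern grows from.
U[28:36, 28:36] = 0.50
V[28:36, 28:36] = 0.25
for _ in range(200):
    U, V = gray_scott_step(U, V)
```

In the actual project the equivalent feedback loop runs on the GPU in TOPs, and the resulting texture is what SDXL restyles into graffiti.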

Public launch bringing SDXL, the most advanced Stable Diffusion model, into a highly controllable, open-source, real-time workflow.