Motion Painting

流彩 (Flowing Colors)
Interactive real-time particle system responding to body movement
Category
Interactive digital media
Location
NYU ITP, New York, US
Year
2023/11

Statement

Tools

Credit

Project Description

Overview >

Motion Painting is a real-time interactive artwork that visualizes bodily movement as generative particle compositions. A webcam captures subtle motion, which is translated into dynamic particle systems that swirl, drift, and react across a projected frame. Every viewer becomes an active agent in the creation process—each gesture triggering colorful turbulence in an otherwise quiet field.

Inspiration >

The work draws conceptual and aesthetic inspiration from Refik Anadol’s Unsupervised (MoMA, 2023), where machine learning models created mesmerizing data-driven animations. Yet while Anadol’s work leans on opaque technologies and pre-rendered sequences, Motion Painting foregrounds accessibility and transparency. It invites curiosity not only in its visuals but also in how the system works—and how others might recreate or remix it.

Future Steps >

This project represents an early but critical step in my exploration of digital materiality and open-source methods for real-time generative art. Future iterations aim to incorporate depth sensors, custom-trained ML models, and alternative real-time environments such as TouchDesigner.

Technical Description

Processing + Spout >

Motion Capture (Processing): A custom Processing sketch uses optical flow analysis to detect movement through a webcam feed. Significant motion areas spawn vibrant particles in real time. The sketch produces a dynamic media texture (video frame) representing motion density.
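The motion-detection step can be sketched conceptually in Python. This is a simplified stand-in for the actual Processing sketch: it uses frame differencing rather than full optical flow, and all names (`motion_mask`, `spawn_particles`, the threshold value) are illustrative, not taken from the project's code.

```python
import numpy as np

def motion_mask(prev_frame, curr_frame, threshold=30):
    """Return a boolean mask of pixels whose brightness changed significantly."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

def spawn_particles(mask, max_particles=500, rng=None):
    """Pick up to max_particles (y, x) positions from the motion mask."""
    rng = np.random.default_rng(0) if rng is None else rng
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return np.empty((0, 2), dtype=int)  # no motion, nothing to spawn
    idx = rng.choice(len(ys), size=min(max_particles, len(ys)), replace=False)
    return np.stack([ys[idx], xs[idx]], axis=1)

# A still frame spawns nothing; a moved bright region spawns particles there.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[40:60, 50:70] = 255  # simulated moving hand
particles = spawn_particles(motion_mask(prev, curr))
```

The real sketch replaces the frame-difference step with per-pixel optical flow, which additionally yields motion direction and speed for driving particle velocity.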

Texture Transfer (Spout): The visual output from Processing is transmitted directly to Unreal Engine via Spout, a real-time GPU video sharing protocol. This enables seamless, low-latency texture streaming between applications.

UE + MadMapper >

Particle Simulation (Unreal Engine): The incoming video texture is mapped to a Niagara particle grid within Unreal Engine. Particles are influenced by turbulence, bounded physics collisions, and color parameters derived from the motion texture. This results in a richly rendered, physics-driven particle field.
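One update tick of that particle behavior can be sketched in Python. Note the actual simulation is authored as a Niagara graph inside Unreal, not as code; this analogue only illustrates the three forces named above, with random acceleration standing in for Niagara's turbulence module.

```python
import numpy as np

def update_particles(pos, vel, motion_tex, dt=1/60, turbulence=5.0, rng=None):
    """Advance particles one tick: turbulence kick, bounded bounce, color sample."""
    rng = np.random.default_rng(1) if rng is None else rng
    h, w, _ = motion_tex.shape
    # Turbulence: random per-particle acceleration (stand-in for curl noise).
    vel = vel + rng.normal(0.0, turbulence, size=vel.shape) * dt
    pos = pos + vel * dt
    # Bounded physics: reflect velocity at the frame edges.
    for axis, limit in ((0, w - 1), (1, h - 1)):
        out = (pos[:, axis] < 0) | (pos[:, axis] > limit)
        vel[out, axis] *= -1
        pos[:, axis] = np.clip(pos[:, axis], 0, limit)
    # Color: sample the incoming motion texture at each particle position.
    xi, yi = pos[:, 0].astype(int), pos[:, 1].astype(int)
    return pos, vel, motion_tex[yi, xi]

pos = np.array([[10.0, 10.0], [100.0, 50.0]])
vel = np.array([[300.0, 0.0], [-50.0, 20.0]])
tex = np.zeros((64, 128, 3), dtype=np.uint8)
tex[..., 0] = 200  # red-ish motion field
pos, vel, colors = update_particles(pos, vel, tex)
```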

Projection & Display: The Unreal Engine output is routed to MadMapper for projection mapping. In the exhibition, an ultra short-throw projector displays the scene onto a framed wall, creating an immersive and intimate surface for interaction.
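Under the hood, a corner-pin warp like MadMapper's is a planar homography: a 3×3 matrix mapping the rendered frame's corners onto the four measured corners of the wall surface. The sketch below (an illustration of the underlying math, not MadMapper's code, with hypothetical corner coordinates) solves for that matrix directly.

```python
import numpy as np

def corner_pin(src, dst):
    """Solve the 3x3 homography H mapping each src (x, y) to its dst (u, v)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # From u = (h1*x + h2*y + h3) / (h7*x + h8*y + 1), and likewise for v.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)  # fix h9 = 1

def apply_h(H, pt):
    """Apply H to a 2D point, dividing out the projective scale."""
    v = H @ np.array([pt[0], pt[1], 1.0])
    return v[:2] / v[2]

unit = [(0, 0), (1, 0), (1, 1), (0, 1)]                 # rendered frame corners
wall = [(12, 8), (630, 20), (618, 470), (5, 460)]       # hypothetical wall corners
H = corner_pin(unit, wall)
```

Mapping-software corner handles are, in effect, interactive controls over the `dst` points of exactly this kind of transform.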