
MediaPipe Finger Tracking


TouchDesigner




What is MediaPipe?
MediaPipe is an open-source machine learning framework developed by Google that detects and tracks the human body in real time through a camera feed. It can map the face, hands, and full-body skeleton as a live stream of data points, capturing everything from subtle facial expressions to the position and gestures of each finger.
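
For a concrete sense of the data MediaPipe produces, here is a minimal Python sketch that uses the library's Hands solution to read a webcam and print one fingertip position per frame. This uses the classic mp.solutions interface; exact APIs vary between MediaPipe releases, and the setup is illustrative rather than the one used in this project.

    import cv2
    import mediapipe as mp

    mp_hands = mp.solutions.hands

    # Stream webcam frames through the hand tracker.
    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV delivers BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                for hand in results.multi_hand_landmarks:
                    # Landmark 8 is the index fingertip; x and y are
                    # normalized to 0..1 relative to the frame.
                    tip = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
                    print(f"index tip: x={tip.x:.3f} y={tip.y:.3f}")
    cap.release()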

How does it work in TouchDesigner?
MediaPipe plugs into TouchDesigner as a source of body-tracking data. It reads the camera feed and outputs the positions of key points on the body as a continuous stream of coordinates. Inside TouchDesigner, those coordinates can be wired into almost anything: the scale of a shape, the color of a texture, the intensity of a light. Every movement of your hand or shift of your head is translated in real time into a visual response.
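
As a rough sketch of what that wiring can look like on the TouchDesigner side, the snippet below assumes a CHOP named 'mediapipe_out' carrying normalized fingertip channels; the operator and channel names are placeholders for whatever your MediaPipe network actually outputs.

    # Hypothetical names: 'mediapipe_out' is a CHOP carrying normalized
    # landmark channels; 'beam_transform' is a Transform TOP.
    tx = op('mediapipe_out')['index_tip_x'][0]   # fingertip x, normalized 0..1
    op('beam_transform').par.tx = tx - 0.5       # recenter into the TOP's UV space

    # The same value can also be dropped straight into a parameter expression,
    # e.g. op('mediapipe_out')['index_tip_x'] on a Circle TOP's radius.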


I created and tested this finger-tracking effect to explore how movement can trigger unique visual responses. The goal was to understand whether this interaction could offer a new way of perceiving the environment: specifically, how the beams might reveal and emphasize different textures in ways that wouldn't be noticed otherwise.



I used the output data from the hand-tracking system to isolate the coordinates of each fingertip, mapping a “beam” to every finger node. The beams themselves were created by designing a simple triangular form in Photoshop, exporting it as a transparent PNG, and importing it into TouchDesigner. Within TouchDesigner, each beam was given its own processing chain—including level, noise, threshold, and feedback operators—to control contrast, harmonics, black points, and other visual qualities. As a result, each beam develops a distinct visual style.
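
For anyone rebuilding something similar, here is a rough sketch of that fingertip-to-beam mapping written as a TouchDesigner Python callback. Every operator and channel name below is a stand-in for the actual network, and the remap ranges would need tuning to your own setup.

    # Hypothetical per-frame callback (e.g. in an Execute DAT) that positions
    # one beam per fingertip. All operator and channel names are assumptions.
    FINGERTIPS = ['thumb', 'index', 'middle', 'ring', 'pinky']

    def onFrameStart(frame):
        tracking = op('mediapipe_out')          # CHOP with normalized landmark channels
        for i, finger in enumerate(FINGERTIPS):
            beam = op(f'beam{i + 1}')           # Transform TOP at the head of each beam's chain
            # Remap normalized 0..1 coordinates into centered UV space.
            beam.par.tx = tracking[f'{finger}_tip_x'][0] - 0.5
            beam.par.ty = 0.5 - tracking[f'{finger}_tip_y'][0]  # flip y: image y runs downward
        return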