MediaPipe Finger Tracking
TouchDesigner
MediaPipe is an open-source machine learning framework developed by Google that detects and tracks the human body in real time through a camera feed. It can map the face, hands, and full-body skeleton as a live stream of data points, capturing everything from subtle facial expressions to the position and gestures of each finger.
How does it work, and how does it apply to TouchDesigner?
MediaPipe plugs into TouchDesigner as a source of body-tracking data. It reads the camera feed and outputs the positions of key points on the body as a continuous stream of coordinates. Inside TouchDesigner, those coordinates become something you can wire into anything: the scale of a shape, the color of a texture, the intensity of a light. Every movement of your hand or shift of your head is translated in real time into a visual response.
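As a concrete illustration of that wiring: MediaPipe outputs landmark coordinates normalized to the 0–1 range, so driving a visual parameter usually means remapping that range to whatever the parameter expects (in TouchDesigner this is often done with a Math CHOP's Range parameters or a small Python expression). The sketch below shows the remapping in plain Python; the `remap` function and the fingertip/light names are illustrative stand-ins, not part of the MediaPipe or TouchDesigner APIs.

```python
def remap(value, in_low, in_high, out_low, out_high):
    """Linearly remap value from [in_low, in_high] to [out_low, out_high]."""
    t = (value - in_low) / (in_high - in_low)
    return out_low + t * (out_high - out_low)

# MediaPipe hand landmarks arrive as normalized coordinates in 0..1.
# Hypothetical example: drive a light's intensity (0..4) from the
# index fingertip's horizontal position in the frame.
fingertip_x = 0.25  # stand-in for a live landmark value
light_intensity = remap(fingertip_x, 0.0, 1.0, 0.0, 4.0)
print(light_intensity)  # 1.0
```

The same one-line remap works for any of the mappings mentioned above (scale, color, intensity); only the output range changes.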
I created and tested this finger-tracking effect to explore how movement can trigger unique visual responses. The goal was to understand whether this interaction could offer a new way of perceiving the environment: specifically, whether the beams could reveal and emphasize textures that would otherwise go unnoticed.