Gesture Studies
Hand tracking and gesture recognition for musical performance
As the performer shapes gestures with their hands, each recognized shape is visually highlighted and stored along a timeline in the background, while simultaneously triggering a sample via MIDI. The project explores the intersection of computer vision and musical performance. The samples used in the performance were generated with the genPC app, creating a full-circle integration between generative audio and gesture-based control.
Highlights
- Hand Tracking: Computer vision algorithms track hand positions and movements in real time
- Gesture Recognition: Machine learning models identify specific hand shapes and gestures
- MIDI Triggering: Each recognized gesture triggers a corresponding sample via MIDI
- Score Generation: All gestures and timing are recorded to create a reproducible musical score
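The gesture-recognition step can be sketched in a few lines. This is a hypothetical illustration, not the project's actual model: it assumes 21 hand landmarks in the MediaPipe Hands indexing convention (wrist at 0, fingertips at 4, 8, 12, 16, 20, PIP joints at 3, 6, 10, 14, 18) and classifies a shape by counting extended fingers.

```python
# Hypothetical sketch: classify a hand shape from 21 (x, y) landmarks,
# as produced by a tracker such as MediaPipe Hands. The labels and the
# extension heuristic are illustrative assumptions.
from math import dist

FINGERTIPS = (4, 8, 12, 16, 20)
PIP_JOINTS = (3, 6, 10, 14, 18)

def classify(landmarks):
    """Return a gesture label from a list of 21 (x, y) points."""
    wrist = landmarks[0]
    # A finger counts as extended when its tip lies farther from the
    # wrist than its PIP joint does.
    extended = [dist(landmarks[t], wrist) > dist(landmarks[p], wrist)
                for t, p in zip(FINGERTIPS, PIP_JOINTS)]
    count = sum(extended)
    if count >= 4:
        return "open_palm"
    if count == 0:
        return "fist"
    if extended[1] and count == 1:  # index finger only
        return "point"
    return "other"
```

A real system would typically replace this rule with a trained classifier, but the input (normalized landmark coordinates per frame) stays the same.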
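The MIDI-triggering step amounts to mapping each gesture label to a note and emitting a note-on message. A minimal sketch, assuming an illustrative gesture-to-note table (the actual mapping and sample assignments in the project are not specified here):

```python
# Hypothetical sketch: map a recognized gesture label to a raw MIDI
# note-on message. The note numbers and channel are assumptions.
GESTURE_NOTES = {"open_palm": 60, "fist": 62, "point": 64}

def note_on(gesture, velocity=100, channel=0):
    """Return the 3-byte MIDI note-on message for a gesture, or None."""
    note = GESTURE_NOTES.get(gesture)
    if note is None:
        return None
    status = 0x90 | (channel & 0x0F)  # 0x90-0x9F: note-on per MIDI channel
    return bytes((status, note & 0x7F, velocity & 0x7F))
```

The resulting bytes would be sent to a MIDI output port (e.g. via a library such as `mido` or `python-rtmidi`), where the sampler maps each note to one of the genPC-generated samples.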