2025-09-14, Ballroom 3
In this presentation, we will demonstrate how we use Python as the bridge between physical objects and digital sound in a live music concert. Using computer vision techniques with libraries such as OpenCV and MediaPipe, we detect various animal plushies – including koalas, whales, otters, octopuses, and Blåhaj – each mapped to its own unique sonic identity.
We track the position and movement of these plushies in space to control sound parameters such as panning, pitch, reverb level, and tempo modulation, creating a playful, gesture-based system for musical expression that blends tangible interaction with digital sound synthesis.
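As a rough illustration of this kind of mapping, the sketch below converts a detected plushie's pixel position into normalised sound parameters. The value ranges and parameter names here are our own assumptions for illustration, not the presenters' actual implementation.

```python
def position_to_params(x: float, y: float,
                       frame_width: int, frame_height: int) -> dict:
    """Map a plushie's pixel position to sound parameters.

    x, y: centre of the detected plushie in pixels (origin top-left,
    as in OpenCV). The ranges below are illustrative assumptions:
    panning in [-1, 1] (left to right), pitch offset in semitones in
    [-12, 12] (lower on screen = lower pitch), reverb amount in [0, 1].
    """
    nx = x / frame_width   # 0.0 (left)  .. 1.0 (right)
    ny = y / frame_height  # 0.0 (top)   .. 1.0 (bottom)
    return {
        "pan": nx * 2.0 - 1.0,                 # -1 hard left, +1 hard right
        "pitch_semitones": (1.0 - ny) * 24.0 - 12.0,
        "reverb": ny,                          # more reverb lower in frame
    }
```

A plushie held dead-centre in a 640×480 frame would then yield centred panning, no pitch offset, and a half-open reverb.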
We will walk through how we’ve applied Python for both networking (via OSC) and computer vision, and share why Python’s readability and strong ecosystem made it the perfect fit for fast prototyping and real-time control. We’ll also touch on our creative process, including mapping physical toys to digital instruments, and ensuring smooth, low-latency performance.
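To give a flavour of the networking side, the sketch below encodes and sends a minimal OSC message using only the standard library (production code would more likely use a package such as python-osc). The address pattern "/plushie/koala" and the port are illustrative assumptions, not the presenters' actual setup.

```python
import socket
import struct


def osc_pad(data: bytes) -> bytes:
    """NUL-terminate and pad to a multiple of 4 bytes, per the OSC 1.0 spec."""
    return data + b"\x00" * (4 - len(data) % 4)


def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    msg = osc_pad(address.encode())                     # address pattern
    msg += osc_pad(("," + "f" * len(floats)).encode())  # type tag string
    for value in floats:
        msg += struct.pack(">f", value)                 # big-endian float32
    return msg


def send_position(sock: socket.socket, host: str, port: int,
                  name: str, x: float, y: float) -> None:
    """Fire tracking coordinates at a sound engine over UDP (hypothetical address scheme)."""
    sock.sendto(osc_message(f"/plushie/{name}", x, y), (host, port))
```

Because OSC rides on UDP, the tracking loop never blocks waiting for the synth to acknowledge a packet, which keeps the vision-to-sound path responsive.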
The presentation will close with a five-minute interactive concert, where the audience will experience how plushie placement and movement generate evolving soundscapes in real time. We hope to inspire others to explore Python as a tool not just for logic and data, but also for creative expression.
Anneysha is a Computer Science student at the Australian National University, currently in her Honours year. She is passionate about accessible and inclusive HCI design.