R&B R&D is a prototype for a gestural music-making experience that facilitates embodied learning, helping users understand the properties of electronic music. By combining computation, computer vision, motor memory, tacit knowledge, and a "learning by doing" approach to musical experience, R&B R&D shows potential as a powerful tool for understanding principles of electronic music. It is a platform for position- and gesture-based musical interaction; its iterations began as a collaborative dance party and evolved into a prototype for a computationally-mediated, kinesthetic, embodied learning experience.

The work explores the relationship between computationally-mediated experiences and meaning-making, using embodied learning to emphasize forms of meaning that are not solely machine-readable. R&B R&D is also an investigation of the learning process, both from the user's standpoint and reflexively, as a study of my own learning about the design of physical and gestural interfaces.

In working on R&B R&D, I learned that I could leverage computation and emerging technologies to facilitate embodied learning experiences that engage tacit knowledge and motor memory. I believe these musical learning experiences escape machine-readability in some way—that a "bodily way of being in sound" (Jaques-Dalcroze) is not algorithmically reproducible.