Experience a work-in-progress performance of new technology that probes the boundary where machine learning meets musical control and expression, created by Jordan Rudess and the MIT Media Lab Responsive Environments group. Rudess performs live with a machine learning model trained on his playing style and technique, joined on select pieces by guest violinist-vocalist Camilla Bäckman. Sometimes leading, sometimes following, the model and the human performers together create new music that interacts in real time with a kinetic sculpture, which in turn both responds to and influences the model's behavior.
Keyboardist/Technologist: Jordan Rudess, CAST Visiting Artist
AI Music System Designer: Lancelot Blanchard, Research Assistant, Responsive Environments Group, MIT Media Lab
Installation Artist/Designer: Perry Naseck, Research Assistant, Responsive Environments Group, MIT Media Lab
Faculty Advisor: Joe Paradiso, Alexander W. Dreyfoos (1954) Professor and Director of the Responsive Environments Group, MIT Media Lab
With special guest: Camilla Bäckman, violin/vocals
Additional project support from: Madhav Lavakare (visual mapping), Brian Mayton (mechanical design, fabrication, and installation), Carlo Mandolini (fabrication), Nathan Perry (embedded software support and installation)
Funded by the MIT Center for Art, Science & Technology (CAST), with support from the MIT Media Lab.