Guitar Machine

Sang-won Leigh

We often imagine a future world filled with robots and artificial intelligence agents, where humans share their civilization with sentient beings made of metal and plastic that function much like we do. But we could also imagine robots that take the form of existing objects, such as tools, furniture, or vehicles. Consider, for example, the driving assistance features in modern cars, which offload certain driving responsibilities and automate them, creating a more convenient experience for the human driver.

Guitar Machine investigates such a possibility in the context of musical instruments. It is a “smart,” or robotic, attachment for a guitar that allows a human artist and a robot to share the act of playing the instrument. Of course, automation has long been part of music history, from the classic player piano to the robotic heavy metal band Z-Machines. Most of these existing apparatuses, however, revolve around automation, relying on machines to take the place of human endeavor in one of the prime domains of human creativity.

That divisive view of humans versus technology has always bothered me. To me, the division seems arbitrary; the roles of a human artist and an instrument (a technology) have always been more synergistic, even symbiotic, than anything else.

“…I am more and more convinced that some of the essential structures of music are rooted in the body…the very close relationship which exists between ‘dancing’ with the body and producing musical sounds as a result of this activity…” —“Music and the Body,” John Baily

There is another fascinating thread to consider: the more free-form, avant-garde movements in music. Musicians have long tried new ways of using particular instruments, or invented new instruments and tools in search of unexplored inspiration. It is also well known in ethnomusicology that an instrument governs the shape of the music played on it: not only through its sonic characteristics, but through what musicians are inclined, or even able, to do with it, at a very basic motor level.

With that in mind, how can new instruments be built with today's technology, and how can they defamiliarize the act of playing the guitar, inventing new ways of interacting with it? Guitar Machine is a guitar loaded with a variety of robotic components. It negotiates between conventional ways of playing the guitar and newly afforded, unfamiliar ways of making sound. Unlike many other automated instruments, Guitar Machine is not designed to replace the musician. Its electrically driven actuators for hammering, bowing, and tremolo picking are controlled by the musician herself, as an extension of her fingertips, merging the very natural human process of playing music with the computational power of machines.
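
To make the idea of “an extension of her fingertips” concrete, here is a minimal sketch of how a performer's gesture might be mapped to such actuators. The post does not describe Guitar Machine's actual control stack, so the MIDI foot controller, the serial protocol, and the one-byte command codes below are assumptions for illustration only.

```python
# Hypothetical control loop: translate a performer's MIDI foot-controller
# gestures into commands for guitar-mounted actuators over a serial link.
# Port names, note mappings, and command bytes are assumed, not taken from
# the actual Guitar Machine implementation.
import mido     # MIDI input (pip install mido python-rtmidi)
import serial   # serial link to the actuator firmware (pip install pyserial)

# Assumed one-byte commands understood by the (hypothetical) actuator firmware.
ACTUATOR_COMMANDS = {
    60: b"H",  # MIDI note 60 -> trigger the hammering actuator
    62: b"T",  # MIDI note 62 -> toggle tremolo picking
    64: b"B",  # MIDI note 64 -> engage the bowing actuator
}

def run(midi_port_name="FootController", serial_port="/dev/ttyUSB0"):
    link = serial.Serial(serial_port, baudrate=115200, timeout=0.01)
    with mido.open_input(midi_port_name) as midi_in:
        for msg in midi_in:  # blocks until the performer sends a gesture
            # Only explicit note-on gestures drive the machine, so the
            # actuators act as an extension of the player, not a replacement.
            if msg.type == "note_on" and msg.velocity > 0:
                command = ACTUATOR_COMMANDS.get(msg.note)
                if command is not None:
                    link.write(command)

if __name__ == "__main__":
    run()
```

In this sketch, nothing moves unless the player asks for it, which is the relationship described above: the machine contributes speed and precision rather than taking over the performance.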

A number of musicians have tried Guitar Machine, from music professors and jazz students to sound effects designers. During those trials, my collaborator Abhinandan and I found that the robot helped them explore alternative sonic and musical spaces, practice complex musical patterns on the guitar more effectively, or even improvise along with the instrument. The musicians told us the sounds they created became more unexpected, helping them explore new avenues.

One jazz guitarist “improvised” with the guitar as it played pseudo-randomized patterns of beats. As their duet progressed, he shouted, “It pretty much feels like playing with another person!” Watching him, I saw much more than playing in tandem with a machine: he was seamlessly transitioning between giving the robot the lead, taking over control, and, once he understood what the robot was doing, synthesizing his own playing with the robot’s.

A sound effects designer, who is also a guitarist, tried “pushing the system to extremes,” turning on more actuators than it was designed to run at one time. As a result, the motors started moving at unexpected speeds and hitting at unintended moments. It sounded beautiful and, in a sense, poetic, as he simply let the machine do machine things.

Another guitarist tried practicing a polyrhythm, with two wildly different beat patterns going on at the same time. This kind of polyrhythm is unintuitive, since the musician has to divide the measure in two different ways and execute both rhythms concurrently. We automated the rhythm using Guitar Machine, and after a short, three-minute session of “feeling it,” he was able to play the pattern successfully. He said it would normally take at least 30 minutes to reach this level of flow; with the machine, he got there in a fraction of that time.
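
For readers unfamiliar with why this is hard, the short sketch below works out the timing of an assumed three-against-four polyrhythm: each voice divides the same measure evenly but into different numbers of beats, and the merged schedule is what the player would otherwise have to execute by hand. The specific rhythm and measure length are illustrative; the post does not give the exact pattern used in the session.

```python
# Illustrative timing of a 3-against-4 polyrhythm (assumed values; the actual
# pattern used with Guitar Machine is not documented in the post).

def onsets(beats_per_measure, measure_seconds=2.0):
    """Evenly spaced onset times (seconds) for one voice within a measure."""
    return [i * measure_seconds / beats_per_measure for i in range(beats_per_measure)]

def polyrhythm_schedule(voice_a=3, voice_b=4, measure_seconds=2.0):
    """Time-ordered, merged schedule labelling which voice plays at each onset."""
    schedule = [(t, "A") for t in onsets(voice_a, measure_seconds)]
    schedule += [(t, "B") for t in onsets(voice_b, measure_seconds)]
    return sorted(schedule)

if __name__ == "__main__":
    for t, voice in polyrhythm_schedule():
        print(f"{t:5.3f}s  voice {voice}")
```

Automating such a schedule, as Guitar Machine did in the session above, gives the player a reliable reference to feel before attempting both divisions unaided.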

Guitar Machine not only showcases future applications of computation and robotics in creative domains; it also highlights how the intimate relationship between tools and humans can be extended or reshaped. As mentioned earlier, the form of an instrument plays a critical role in shaping the structure of the music played on it. The robotic symbionts of Guitar Machine push this notion further by weaving computation into the equation.

We encounter new technology in every aspect of our lives. Just as the world constructs the basis of our thinking, and as we increasingly design that world for ourselves, we come to desire wondrous encounters with technology. There is tremendous potential, as well as fear, around how automation and artificial intelligence may change our societies. Although there have been numerous examples of synergy between human and machine, many of today's technological innovations come at the cost of machines claiming agency over what has always belonged to us. We boldly argue that a more organic and improvisational relationship with technology is necessary, and our design attempts to forge a visible path toward it. Rather than offering convenience at the expense of human agency, Guitar Machine proposes a handshake to its human colleagues, through which they can explore alternative spaces.

My research practice brings technology into creative domains. We are planning to continue the work on Guitar Machine, using it in musical performances and building new versions. An important next step will be to apply machine learning to understanding the musician's intention on the guitar, making the instrument smarter. I also work on machines for painting and drawing (visual expression), and more tools for artists will follow.

If you have questions, please contact Sang Leigh (leigh@sangww.net) or my colleague Abhinandan Jain (abyjain@media.mit.edu). We are open to any form of collaboration and tryouts in all creative domains. Please refer to my website (www.sangww.net) for more information on my work.
