Emonator

The Emonator project explores affective information browsing and exchange, using a novel gesture controller as the input interface and music and video as the media. Applications include real-time sound synthesis, MIDI control, and live video browsing. The device consists of 144 rods arranged in a 12x12 square grid; the positions of the rods are sensed by 12 optical boards and read out by a field-programmable gate array (FPGA), allowing hand gestures to interactively shape music and video. The Emonator is also the first input interface for the Emonic Environment project. The hardware was collaboratively designed by Dan Overholt, a master's candidate in the Lab's Opera of the Future group, and Andrew Yip, who constructed the interface.
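
To make the control path concrete, below is a minimal Python sketch of how 144 rod heights from a 12x12 grid like the Emonator's might be reduced to MIDI control-change messages on a host computer. The grid size and the FPGA readout follow the description above, but read_rod_heights(), the row-averaging mapping, and the controller numbers are illustrative assumptions, not the project's actual firmware.

    # Minimal sketch (hypothetical, not the Emonator's actual firmware):
    # map a 12x12 grid of rod heights to 12 MIDI control-change messages.

    import random

    GRID_SIZE = 12          # 144 rods in a 12x12 square pattern
    MIDI_CC_STATUS = 0xB0   # control-change status byte, MIDI channel 1

    def read_rod_heights():
        """Stand-in for the FPGA readout: 144 rod heights in [0.0, 1.0]."""
        return [[random.random() for _ in range(GRID_SIZE)]
                for _ in range(GRID_SIZE)]

    def row_averages(heights):
        """Collapse each row of rods to one average height: 12 control values."""
        return [sum(row) / GRID_SIZE for row in heights]

    def to_midi_cc(averages, first_controller=20):
        """Scale each average to 0-127 and wrap it in a 3-byte MIDI CC message."""
        messages = []
        for i, value in enumerate(averages):
            cc_value = max(0, min(127, int(value * 127)))
            messages.append(bytes([MIDI_CC_STATUS, first_controller + i, cc_value]))
        return messages

    if __name__ == "__main__":
        for msg in to_midi_cc(row_averages(read_rod_heights())):
            print(msg.hex(" "))

Averaging each row is just one plausible dimensionality reduction; a real mapping for sound synthesis would likely use the full 144-value field, or gesture features derived from it, rather than 12 per-row summaries.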