
Watch this device translate silent thoughts into speech

By Eillie Anzilotti

Arnav Kapur wants to be very clear on something: What you're about to see is not mind reading, even though it really looks like it.

AlterEgo is an AI-enabled headset that Kapur, an MIT Media Lab researcher, has been developing for the past several years. He demonstrated it on stage for the first time at TED 2019 in Vancouver last month; you can now watch the video of the device in action below.

The small headset uses sensitive sensors to detect the signals the brain sends to internal speech mechanisms, like the tongue or larynx, when you speak to yourself. Imagine asking yourself a question but not actually saying the words aloud. Even if you don't move your lips or your face, your internal speech system is still doing the work of forming that sentence. Internal speech muscles like the tongue vibrate in accordance with the words you're thinking, in ways so subtle they are almost undetectable. "It's one of the most complex motor tasks we do as humans," Kapur says. AlterEgo picks up on these internal signals and transmits them to an AI embedded in the device, which translates them into language.
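To make that pipeline concrete, here is a minimal, purely illustrative sketch in Python of the general idea the article describes: windows of multi-channel signal from the face and jaw are reduced to features and matched against a small vocabulary. The channel count, window size, vocabulary, feature choice, and nearest-centroid classifier are all assumptions for illustration; the article does not describe AlterEgo's actual sensors or model.

```python
import numpy as np

# Hypothetical parameters -- AlterEgo's real channel count, sampling rate,
# vocabulary, and model are not described in the article.
N_CHANNELS = 7        # electrodes along the jaw and chin (assumed)
WINDOW = 250          # samples per word-length window (assumed)
VOCAB = ["call", "home", "yes", "no", "time"]

def extract_features(window: np.ndarray) -> np.ndarray:
    """Reduce a (channels, samples) signal window to a small feature vector.
    Here: per-channel RMS energy, a stand-in for real signal processing."""
    return np.sqrt((window ** 2).mean(axis=1))

def train_centroids(examples: dict) -> dict:
    """Average the feature vectors of labelled example windows for each word."""
    return {word: np.mean([extract_features(w) for w in windows], axis=0)
            for word, windows in examples.items()}

def classify(window: np.ndarray, centroids: dict) -> str:
    """Pick the vocabulary word whose centroid is nearest to this window."""
    feats = extract_features(window)
    return min(centroids, key=lambda word: np.linalg.norm(feats - centroids[word]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake training data: each word gets a characteristic per-channel energy profile.
    profiles = {w: rng.uniform(0.5, 2.0, size=N_CHANNELS) for w in VOCAB}
    examples = {w: [p[:, None] * rng.normal(1.0, 0.1, size=(N_CHANNELS, WINDOW))
                    for _ in range(20)]
                for w, p in profiles.items()}
    centroids = train_centroids(examples)

    # A "silently spoken" test window generated from the profile for "time".
    test = profiles["time"][:, None] * rng.normal(1.0, 0.1, size=(N_CHANNELS, WINDOW))
    print(classify(test, centroids))  # expected output: time
```

The real system presumably uses far richer signal processing and a trained neural model rather than hand-built centroids; the sketch only shows the shape of the problem: continuous signals in, discrete words out.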
