Animations designed with the help of a pre-engineered GPT are then extended by the AI to connect with a wearable sensor. The result lets a naive user describe an animation to the GPT in English, then resubmit the generated animation code to receive a fully executable program whose motion tracks their skin conductance while they wear the sensor. Current experiments examine the efficacy of this approach versus standard biofeedback interfaces (such as skin conductance or photoplethysmography graphs) for increasing interoception and potentially treating alexithymia.
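A minimal sketch of how such a sensor-to-animation mapping might look. The sensor function, value ranges, and parameter names here are all hypothetical placeholders, not the project's actual code: a real integration would stream electrodermal activity (EDA) readings from the wearable rather than simulating them.

```python
import math
import random

def read_skin_conductance(t):
    """Hypothetical stand-in for a wearable EDA sensor reading, in
    microsiemens. A real system would read this over Bluetooth or serial."""
    return 5.0 + 2.0 * math.sin(t / 10.0) + random.uniform(-0.2, 0.2)

def conductance_to_param(eda_us, lo=2.0, hi=12.0):
    """Map a conductance reading onto a 0..1 animation parameter
    (e.g. pulse speed or color intensity), clamped to the range.
    The lo/hi calibration bounds are assumed values."""
    norm = (eda_us - lo) / (hi - lo)
    return max(0.0, min(1.0, norm))

def animation_frame(t):
    """One frame of a GPT-generated animation, modulated by the sensor:
    here, a circle whose radius swells with the user's arousal level."""
    param = conductance_to_param(read_skin_conductance(t))
    radius = 10 + 40 * param
    return radius
```

The key design point is the normalization step: the raw sensor value is rescaled into a bounded parameter before driving the visual, so the animation responds smoothly regardless of an individual user's baseline conductance.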