Project

Patchbots: Milli-fluidic Soft Robots Towards On-body Locomoting Swarm UIs

Ozgun Kilic Afsar

Patchbots are sheet-form robots that are programmable on the fly by direct manipulation through bodily movements. The integrated soft mechanism comprises sub-millimeter- and millimeter-scale hybrid fluidic fiber muscles actuated by miniature pumps. With multi-segmented resistive sensors patterned on each fiber module, the robots perceive their own dynamic geometry as well as the body they wrap around. Using this dynamic geometry data, patchbots respond to the user's movements in real time and can be applied to kinesthetic learning activities, locomoting on-body haptics, and dynamically relocating assistive technologies.


Contributors

Ozgun Kilic Afsar
Research Assistant
Past Member
Hila Mor
Past Member
Ken Nakagaki
Former Research Assistant
Hiroshi Ishii
Jerome B. Wiesner Professor of Media Arts and Sciences; Associate Director, MIT Media Lab

External Contributors

Seunghee Jeong, Wyss Institute for Biologically Inspired Engineering, Harvard
Martin Eric William Nisser, CSAIL, MIT

External Advisors 

Prof. Kristina Hook, KTH Royal Institute of Technology
Prof. Klas Hjort, Uppsala University

Vision

Our concept is inspired by two models in HCI: intimate correspondence (Ingold) and co-adaptivity (Mackay). These models suggest an immediate, synchronized feedback loop between user and technology, where control is shared and shifts dynamically between the two agents.

What would it be like to have a dress that keeps on dancing when we take it off? 

To realize our vision, we foresee an intuitive paradigm for programming robotic materials, one closer to a form of craft, like a chef kneading dough or a sculptor shaping clay.

We propose an embodied programming of our everyday objects and environments, achieved by interacting with them directly through our bodily movements.

Integrated Patch Design + Fabrication


As a first step toward realizing our concept, we started building a flexible, stretchable robotic patch that can adhere to arbitrary soft bodies thanks to its compliant form.

We leverage spray deposition to pattern custom, localized resistive sensor nodes on each fiber muscle that constitutes the patch. To fabricate the sensor, we prepared a carbon black solution and air-brushed layers of 30 μm thickness onto the surface of the elastomer tubing.
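To illustrate how such segmented resistive sensors are typically read out, here is a minimal sketch that converts voltage-divider ADC readings to segment resistance and an approximate strain. All parameters (supply voltage, reference resistor, nominal resistance, gauge factor) are illustrative assumptions, not measured values from the fabricated sensors.

```python
# Hypothetical readout for one multi-segment resistive strain sensor,
# modeled as a simple voltage divider per segment. Parameters below are
# assumed for illustration only.

V_SUPPLY = 3.3        # supply voltage (V), assumed
R_REF = 10_000.0      # fixed reference resistor in the divider (ohm), assumed
R_NOMINAL = 10_000.0  # segment resistance at rest (ohm), assumed
GAUGE_FACTOR = 2.0    # assumed sensitivity of relative resistance change to strain

def segment_resistance(v_out: float) -> float:
    """Solve the divider equation V_out = V_supply * R_seg / (R_seg + R_ref)."""
    return R_REF * v_out / (V_SUPPLY - v_out)

def segment_strain(v_out: float) -> float:
    """Estimate strain from the relative resistance change of one segment."""
    r = segment_resistance(v_out)
    return (r - R_NOMINAL) / (R_NOMINAL * GAUGE_FACTOR)

# Example: ADC voltages sampled along one fiber's sensor segments.
readings = [1.65, 1.80, 1.95]
strains = [segment_strain(v) for v in readings]
```

Reading each segment separately is what lets the patch resolve its bend shape along the fiber, rather than only an aggregate stretch.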


The main parameter we tune in the actuator design is the variable stiffness of the ultra-thin shell encapsulating the fiber muscle. "Stiffness patterning" on the shell allows fabricating pre-programmed movements such as single-curve bending, multiple-curve bending, and curling.

Stiffness patterning can be done by coating or embedding a non-elastic layer of material in the desired part of the soft shell. In our case, we inlaid a non-elastic thread along the axis of choice.

By combining fibers with different stiffness patterns, one can program complex dynamic behaviors into a hybrid patch assembly simply by performing the action with the robotic material.


Design Space

We are building the patchbot's mechanism so that it affords recording physical motion and responding to it dynamically. The reconfigurable design of the patch enables manifold morphing behaviors and locomotion, while the ultra-thin sheet form lets the robot adhere to different parts of the human body.
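The record-then-respond loop described above can be sketched as a simple sampling and playback routine. The sensor read and pump command callbacks here are hypothetical placeholders; a real implementation would talk to the patch's ADC and pump drivers.

```python
# Minimal record-and-replay sketch for one sensorized fiber.
# read_bend and set_bend are hypothetical callbacks standing in for the
# real sensor readout and pump control of the patch.
import time

def record(read_bend, duration_s: float, rate_hz: float = 50.0):
    """Sample the fiber's bend state while the user moves the patch."""
    samples, period = [], 1.0 / rate_hz
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        samples.append(read_bend())
        time.sleep(period)
    return samples

def replay(samples, set_bend, rate_hz: float = 50.0):
    """Drive the fiber muscle through the recorded trajectory."""
    period = 1.0 / rate_hz
    for target in samples:
        set_bend(target)  # e.g., map a target bend to a pump pressure
        time.sleep(period)
```

The same recorded trajectory could be replayed on a different patch, which is the basis of the kinesthetic teaching scenarios discussed later.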

Concept: An On-body Locomoting Fabric Patch Assembly

As a first implementation, we wanted to build a soft robot that navigates the human body through a sequential grip-and-extend motion, allowing it to locomote freely on human limbs.

The climbing patchbot consists of three modules: an extensible body and two pairs of semi-patterned limbs. We successfully fabricated the separate modules and assembled them with 3D-printed connector parts.
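The grip-and-extend gait of this three-module robot can be outlined as an inchworm-style phase sequence. This is a hedged sketch of the gait logic only; the actuate callback and module names are illustrative assumptions, and the real robot drives its fiber muscles through miniature pumps.

```python
# Sketch of the inchworm-style gait for the climbing patchbot: two limb
# pairs alternate gripping while the extensible body stretches and
# contracts. Module names and the actuate callback are hypothetical.

GAIT_CYCLE = [
    ("grip",     "front"),  # front limb pair grips the limb surface
    ("release",  "rear"),   # rear pair lets go
    ("contract", "body"),   # body contracts, pulling the rear forward
    ("grip",     "rear"),   # rear pair grips again
    ("release",  "front"),  # front pair lets go
    ("extend",   "body"),   # body extends, pushing the front forward
]

def step(actuate, cycles: int = 1):
    """Run the gait cycle, calling actuate(action, module) per phase."""
    for _ in range(cycles):
        for action, module in GAIT_CYCLE:
            actuate(action, module)
```

At least one limb pair grips at every phase, so the robot never loses its hold on the body while advancing.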

Current status: We are conducting an analytical study of the gripping force required for different parts of the human body. When larger gripping forces are needed, the number of limbs can be increased and the fiber assembly type varied (e.g., braided fiber muscles).

Possible Applications + Future work

We envision our programmable patches working collectively to record and reconstruct movement data, add dynamic capability to passive objects, and enhance co-located and remote haptic communication for collaborative work and play. 

Our larger vision is for robotic patches to become a seamless part of our everyday clothing, like a soft exoskeleton with a kinetic memory of our ever-changing bodies in motion. Here we illustrate potential application scenarios to discuss how our interface opens up possibilities for the future of distributed robotic material for full-body interactions.

Kinesthetic learning activity

In our previous studies with opera singers, we found that articulating the embodied, tacit knowledge of singing is challenging through symbolic language alone.

What would it be like to have a corset that dynamically fits the unique ways our breathing muscles move when we sing? 

We envision a kinesthetic teaching-learning scenario for vocal training: capturing the movement of the instructor's breathing muscles and physically replaying it on the learner's body via distributed robotic patches.

Co-located / remote haptic communication

Patchbots can be used in co-located and remote haptic interactions, such as augmenting inanimate objects with dynamic function or altering their existing autonomous motion. Our example application is wearable robotic patches for a drone tele-operation scenario, where the operator physically experiences the drone's flight conditions (e.g., wind, obstacles) directly on her body via kinesthetic feedback from the patches, enabling more intuitive control.

Another potential use case is haptic communication for the body in extreme environments where proprioceptive feedback can be crucial for safety: for example, a haptic interface embedded in a deep-diving suit for proprioceptive coordination in scientific ocean exploration, or a non-verbal communication channel for co-located groups of recreational divers.

Dynamic physical assistants

Our larger vision for the robotic fibers and patches that constitute patchbots is that they integrate seamlessly into our everyday wearables and surfaces. To demonstrate this vision, we will fabricate a collective of patchbots functioning as a swarm UI that assists its user in finding balance and posture during daily bodily activities such as yoga, tracks movement during sleep, and tracks micro facial movements (e.g., jaw clenching as a marker of stress).


Frequently Asked Questions

  1. What is the general state of this research?

    This is an early-stage research project, started in January 2020 at the Media Lab. The project PI is a visiting PhD student from KTH Royal Institute of Technology. The project continues her previous research on on-body and movement-based HCI at her home university and is currently being conducted with collaborators at the MIT Media Lab and the Harvard Wyss Institute.

  2. Where is this project being developed?

    Due to the COVID-19 lockdown, most of the hardware has been developed and fabricated in a home lab setup by the project PI. Sensor fabrication has been done in a separate, ventilated outdoor setup because of the risks of working with nano-powders and aerosols in a living environment.

  3. What are the dimensions of the robot?

    The robots have a modular construction in which each patch submodule is a 1 × 2 inch (~2.5 cm × 5 cm) multi-fiber assembly. Each fiber muscle that forms the patch has an outer diameter of 1/16 inch (~1.6 mm). The process is scalable, so the robots' dimensions can vary.

  4. How do you control the robot? Are they untethered?

    Our patchbots are currently controlled by an off-the-shelf platform connected to miniature pumps. Our plan, however, is to give them more mobility and autonomy. In fact, for our most recent on-body navigating robot, we coupled our robots with FlowIO, a wearable miniature pneumatic control board developed by Ali Shtarbanov at the Lab.

  5. How many robots can you control at the same time?

    Currently we have only two working patchbots, which are controlled separately. We aim to have ~10 functional robots controlled simultaneously.

  6. How is this different from other wearable soft robots?

    Our robots are distinguished by their versatile sheet form factor, which makes them applicable to arbitrary bodies. More importantly, we believe what separates them from most robotic-material applications is their on-the-fly "programmability" through the user's direct movements and performed actions.

  7. What are the next steps?

    We are currently working on the multichannel sensor circuit for the robot's dynamic shape awareness. Our next major steps are sensor-actuator integration for on-the-fly programming, and configuring a collective robot communication network.

  8. Who should I contact to get further information?

    For further questions, please send an email to Ozgun <email: ozgun@media.mit.edu>. You are also welcome to request a meeting with us through our Google Form.