Center for Bits and Atoms alum Rich Fletcher is tech lead for the team that won top prize from NIH for using AI to predict wound infection

Photo courtesy of Partners in Health.

Using AI to Predict Wound Infection Helps Win $500K Prize from NIH

Courtesy of the researchers

Technology accelerator award recognizes innovation for low-resource settings to support maternal health

The National Institutes of Health has awarded the $500K first prize in its 2022 Technology Accelerator Challenge (NTAC) to an interdisciplinary team of doctors and researchers from MIT, Harvard, and Partners in Health for their groundbreaking work in addressing surgical site infections (SSI) and maternal health.  Richard Ribon Fletcher, a research scientist in the MIT Department of Mechanical Engineering, and an alum of the MIT Center for Bits and Atoms, is the technology lead for the nine-person team, which has been developing an AI platform to detect infection in Rwandan women who give birth by Cesarean section.

“Early detection of infection is an important issue worldwide, but in low-resource areas such as rural Rwanda, the problem is even more dire due to a lack of trained doctors and the high prevalence of bacterial infections that are resistant to antibiotics,” says Dr. Fletcher. “Sadly, wound infection is one of the leading causes of maternal mortality in Rwanda. Our idea was to employ mobile phones that could be used by community health workers to visit new mothers in their homes and inspect their wounds to detect infection.”

The NTAC prize was established by the National Institute of Biomedical Imaging and Bioengineering (NIBIB) in 2020 to help accelerate the application of technology innovation in health diagnostics. This year’s prize was dedicated to maternal health, a major global health problem that results in the deaths of 800 women and 7,000 newborns each day.

In addition to Dr. Fletcher at MIT, the interdisciplinary nine-person team is led by Bethany Hedt-Gauthier, a professor at Harvard Medical School, Division of Global Health, and also includes Robert Riviello, at Brigham and Women’s Hospital; Adeline Boatin at Massachusetts General Hospital; and Anne Niyigena, Frederick Kateera, Laban Bikorimana, and Vincent Cubaka from Partners In Health (PIH) in Rwanda. The final member of the team is Audace Nakeshimana, one of Dr. Fletcher’s former students, who is helping to scale up the technology through the AI start-up company he founded, called Insightiv.ai.

Dr. Fletcher leads a student team within Professor Sanjay Sarma’s Auto-ID Lab that applies machine learning algorithms, smartphones, and other mobile technologies to global health as well as mental health. Fletcher has been working in global health at MIT for over 20 years, with academic appointments at Massachusetts General Hospital and Harvard Medical School.

Applying AI and Smart Phones for Detecting Wound Infection 

Dr. Fletcher and Dr. Gauthier first met in 2017 at an NIH investigator meeting, where each of them was presenting work applying mobile phone solutions for global health (mHealth). At the time, Dr. Gauthier was exploring the use of questionnaires administered on a mobile phone, but was interested in exploring the use of the phone camera as a diagnostic tool. Dr. Fletcher had experience implementing computer vision algorithms on smartphones, and offered to lead that work.  Dr. Fletcher and Dr. Gauthier’s team have since been collaborating on several NIH-funded grants to help address the problem of wound infection and maternal mortality in Rwanda.

The first research study on this topic, in 2018–2019, made use of wound images collected by community health workers in Rwanda using a mobile tablet in the patient’s home (Figure 1).

Courtesy of the researchers

“Our first research study was a huge learning experience,” says Dr. Fletcher. “The quality of wound images collected by the health workers was highly variable, and it required a large amount of manual labor to crop and re-sample the images. Furthermore, our early work required extracting a significant number of hand-crafted features from the images, which was also very time-consuming and was not practical for real-world clinical use.”

After studying the problem, Dr. Fletcher proposed the use of real-time computer vision and augmented reality, which he had been using for other health physiology projects.  His team developed a computer vision target pattern consisting of a thick flexible frame (Figure 2) that is placed over the wound.  With this inexpensive printed frame, the mobile phone software is able to track the wound, crop the image, and automatically correct for parallax, rotation, and geometric distortion.  The printed frame also includes a color chart, which the phone software uses to automatically balance the color of the image, correcting for variations in ambient lighting.
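The color-chart step can be sketched in a few lines. This is an illustrative reconstruction, not the team’s actual code: it assumes a simple per-channel linear gain, fitted by least squares between the chart patches as observed under the ambient lighting and their known reference values, then applied to the whole image.

```python
import numpy as np

# Hedged sketch of color-chart-based white balancing. All names and
# values are illustrative assumptions, not the team's implementation.

def fit_color_gains(observed, reference):
    """Least-squares gain per RGB channel mapping observed chart patches
    to their known reference values. Both arrays are (n_patches, 3)."""
    gains = np.empty(3)
    for c in range(3):
        o, r = observed[:, c], reference[:, c]
        gains[c] = (o @ r) / (o @ o)  # minimizes ||g*o - r||^2
    return gains

def apply_color_balance(image, gains):
    """Scale each channel of an HxWx3 float image and clip to [0, 1]."""
    return np.clip(image * gains, 0.0, 1.0)

# Example: three gray chart patches shot under warm light
# (red channel boosted, blue channel suppressed).
reference = np.array([[0.2, 0.2, 0.2],
                      [0.5, 0.5, 0.5],
                      [0.8, 0.8, 0.8]])
observed = reference * np.array([1.25, 1.0, 0.8])  # lighting distortion

gains = fit_color_gains(observed, reference)
corrected = apply_color_balance(observed[None, :, :], gains)
```

Because the chart patches have known colors, the same gains can be applied to the wound region of the frame, giving uniform color across captures taken in different homes and lighting conditions.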

Courtesy of the researchers

“By using real-time computer vision at the time of data collection, we are able to generate beautiful, clean, uniform, color-balanced images that can then be used to train our machine learning models, without any need for manual data cleaning or post processing,” says Dr. Fletcher. “With this mobile software, our latest machine learning models have been able to successfully predict infection with an accuracy approaching 90%, using mobile phone images taken 10 days after surgery.”

Addressing Algorithmic Bias and Skin Color

While planning how this solution could be scaled and translated to other parts of the world, Fletcher’s technical team recognized a limitation of the algorithm: its predictions depend in part on the patient’s skin color.  While the algorithm works well within a homogeneous population, such as rural Rwanda, it would be less effective in regions where patients’ skin color varies significantly.

To address this problem, Dr. Fletcher proposed the use of thermal imaging on the mobile phone, which he had developed previously for automated measurement of patient physiology such as respiration. Using a thermal camera attachment for mobile phones, now available for less than US $200, the thermal wound image can be analyzed with a custom machine learning algorithm to produce a prediction of infection.

Research by the study team using thermal images, published last year at the annual IEEE Engineering in Medicine and Biology Conference (EMBC), demonstrated a prediction accuracy over 90% using a convolutional neural network (CNN) and transfer learning.
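The transfer-learning idea can be illustrated with a minimal sketch: a pretrained feature extractor (“backbone”) is kept frozen, and only a small classifier head is trained on the new task, which is practical when labeled medical images are scarce. Everything below is an illustrative assumption, not the team’s model: the backbone is a stand-in random projection rather than a real CNN, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def frozen_backbone(images, W):
    """Stand-in feature extractor: fixed projection + ReLU, never updated.
    In real transfer learning this would be a pretrained CNN's layers."""
    return np.maximum(images @ W, 0.0)

def train_head(feats, labels, lr=0.1, steps=2000):
    """Logistic-regression head trained by gradient descent on the
    frozen features -- the only part that learns on the new task."""
    w = np.zeros(feats.shape[1])
    b = 0.0
    n = len(labels)
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-np.clip(feats @ w + b, -30, 30)))
        grad = p - labels
        w -= lr * feats.T @ grad / n
        b -= lr * grad.mean()
    return w, b

# Synthetic data: two classes of flattened "images" separated along
# one direction of the input space.
X = rng.normal(size=(200, 32))
y = (X[:, 0] > 0).astype(float)
W = rng.normal(size=(32, 64))          # frozen backbone weights

feats = frozen_backbone(X, W)
feats = (feats - feats.mean(axis=0)) / feats.std(axis=0)  # standardize

w, b = train_head(feats, y)
preds = (feats @ w + b > 0).astype(float)
accuracy = (preds == y).mean()
```

The design point is that the backbone’s weights are never touched: only the small head (here, 65 parameters) is fitted, which is what makes the approach feasible with modest amounts of clinical data.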

“For reasons of cost and simplicity, we will try using the built-in phone camera as much as possible,” says Dr. Fletcher. “But the thermal camera solution can be used in places where the patient community skin color is not homogeneous.”

Field Deployment and Scaling the Technology

As part of the latest research funded by NIH, Dr. Fletcher’s technical team has developed a fully integrated mobile app for Android that includes all the computer vision and image processing algorithms, as well as on-device machine learning models implemented using TensorFlow Lite.  This level of integration is needed in areas where Internet access is unavailable and it is not possible to connect to a remote server.

The latest version of the mHealth wound screening tool also includes real-time image quality feedback for the community health worker, which alerts the health worker to immediately retake the image if the quality of the image is too poor.
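One common way to implement such a quality gate, shown here as a hedged sketch rather than the app’s actual check, is the variance-of-Laplacian blur metric: a sharp capture retains high-frequency edge energy, while an out-of-focus one does not. The threshold and function names below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of a real-time image-quality gate. The variance of a
# Laplacian filter response is a standard focus/blur measure; the
# threshold here is illustrative, not the app's actual value.

def laplacian_variance(gray):
    """Variance of a 4-neighbour Laplacian over a 2-D grayscale image
    (computed on interior pixels only)."""
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return lap.var()

def image_ok(gray, threshold=0.01):
    """Accept the capture only if it is sharp enough to analyze;
    otherwise the health worker would be prompted to retake it."""
    return laplacian_variance(gray) >= threshold

# A checkerboard stands in for a sharp, detailed image; a constant
# patch stands in for a blurry, featureless one.
sharp = (np.indices((32, 32)).sum(axis=0) % 2) * 1.0
blurry = np.full((32, 32), 0.5)
```

Running such a check on-device at capture time is what lets the app reject a poor image immediately, while the health worker is still with the patient, rather than discovering the problem after the visit.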

In addition to the interdisciplinary team of doctors and researchers in Boston and Rwanda working to validate and field-test the project, the technology is further supported in Rwanda by Insightiv, a startup company founded by Audace Nakeshimana, one of Dr. Fletcher’s former students, that applies AI algorithms to the analysis of clinical images.  Insightiv was a top prize winner at the 2020 annual MIT IDEAS competition and is now one of the research partners in the current NIH-funded clinical studies. Nakeshimana is also part of the nine-person team receiving the NTAC award.

The use of phones for public health in Rwanda is further supported by the fact that the government of Rwanda has invested in its own Android phone manufacturing facility, which has produced Africa’s first smartphone, the “Mara phone.”

“The computational power of this little device in your pocket is truly amazing,” says Dr. Fletcher, “and I am very grateful and honored to be working with such a talented team of doctors and researchers both here in Boston and in Rwanda to fully bring this technology to the hands of the people who need it.”

======================

Project web site: https://mobilehealthlab.com/portfolio/predicting-infection/
