The world is facing a maternal health crisis. According to the World Health Organization, approximately 810 women die every day from preventable causes related to pregnancy and childbirth. Two-thirds of these deaths occur in sub-Saharan Africa. In Rwanda, one of the leading causes of maternal death is infected cesarean-section wounds.
A team of doctors and researchers from MIT, Harvard University, and Partners in Health (PIH) in Rwanda has proposed a solution to this problem. They have developed a mobile health (mHealth) platform that uses artificial intelligence and real-time computer vision to predict infection in cesarean-section wounds with roughly 90 percent accuracy.
“Early detection of infection is an important issue worldwide, but in low-resource areas such as rural Rwanda, the problem is even more dire due to a lack of trained doctors and the high prevalence of antibiotic-resistant bacterial infections,” says Richard Ribon Fletcher ’89, SM ’97, PhD ’02, a research scientist in mechanical engineering at MIT and technology lead for the team. “Our idea was to employ mobile phones that could be used by community health workers to visit new mothers in their homes and inspect their wounds to detect infection.”
This summer, the team, which is led by Bethany Hedt-Gauthier, a professor at Harvard Medical School, was awarded the $500,000 first-place prize in the NIH Technology Accelerator Challenge for Maternal Health.
PIH team member Frederick Katera adds: “In the developing world, the lives of women who deliver by cesarean section are affected by limited access to both quality surgery and postpartum care. The use of mobile health technologies for early identification of surgical-site infection in these communities is a scalable game changer in advancing women’s health.”
Training algorithms to identify infections
The project’s inception was the result of a series of coincidences. In 2017, Fletcher and Hedt-Gauthier crossed paths on the Washington Metro during an NIH investigator meeting. At the time, Hedt-Gauthier, who had been running research projects in Rwanda for five years, was searching for a solution to the gaps in cesarean care that she and her colleagues had encountered in their fieldwork. She was particularly interested in exploring mobile phone cameras as a screening tool.
Fletcher, who leads a team of students in Professor Sanjay Sarma’s AutoID Lab and has spent decades applying mobile phones, machine learning algorithms, and other mobile technologies to global health, was a natural fit for the project.
“After realizing that these types of image-based algorithms could support home care after cesarean delivery, we approached Dr. Fletcher as a collaborator based on his extensive experience in developing mHealth technologies in low- and middle-income settings,” says Hedt-Gauthier.
During the same trip, Hedt-Gauthier happened to sit next to Audace Nakeshimana ’20, then a first-year MIT student from Rwanda, who would later join Fletcher’s team at MIT. Mentored by Fletcher, Nakeshimana founded Insightiv in his senior year, a Rwandan startup applying AI algorithms to clinical imaging, and it was a top grant awardee in the 2020 annual MIT IDEAS competition.
The first step of the project was to collect a database of wound images taken by community health workers in rural Rwanda. The team collected more than 1,000 images of infected and uninfected wounds and then trained an algorithm on this data.
A central problem emerged with this first dataset, collected between 2018 and 2019: many of the photographs were of poor quality.
“The quality of the wound images collected by healthcare professionals was highly variable and required a significant amount of manual labor to crop and reframe the images. As these images were used to train the machine learning model, the image quality and variability fundamentally limited the algorithm’s performance,” says Fletcher.
To solve this problem, Fletcher turned to tools he had used in previous projects: real-time computer vision and augmented reality.
Real-time image processing to improve image quality
To encourage community health workers to take higher-quality images, Fletcher and his team developed a mobile app for photographing wounds and paired it with a simple paper frame. The frame contains a printed calibration color pattern and a second optical pattern that guides the app’s computer vision software.
Health workers are instructed to place the frame over the wound and open the app, which provides real-time feedback on the camera’s placement. The app uses augmented reality to display a green signal when the phone is within the proper range; once the phone is aligned, other computer vision software components automatically balance the color, crop the image, and apply transformations to correct for parallax.
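The parallax correction described above is typically done by estimating a homography from the frame’s detected corner markers to a canonical, face-on view. The team’s exact implementation is not described here; the sketch below shows the standard direct-linear-transform approach with hypothetical corner coordinates.

```python
import numpy as np

def homography_from_corners(src, dst):
    """Estimate the 3x3 homography mapping src corners to dst corners
    (direct linear transform, with h33 fixed to 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.asarray(A, float), np.asarray(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, pt):
    """Apply homography H to a 2-D point (homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# Four frame corners as detected in a tilted photo (hypothetical values)...
detected = [(12.0, 8.0), (205.0, 15.0), (198.0, 240.0), (5.0, 230.0)]
# ...mapped onto a canonical 200x200 face-on frame, removing the parallax.
canonical = [(0.0, 0.0), (200.0, 0.0), (200.0, 200.0), (0.0, 200.0)]

H = homography_from_corners(detected, canonical)
```

In a production app the same warp would be applied to every pixel (e.g., via a library routine) rather than to individual points.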
“Using real-time computer vision as data is collected, we can create beautiful, clean, uniformly colored images to train our machine learning models without any manual data cleaning and post-processing,” says Fletcher.
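The printed calibration color pattern enables the automatic color balancing Fletcher mentions. One common approach, sketched below with hypothetical patch values (the team’s actual chart and method are not specified), is a least-squares color-correction matrix fit from the observed patch colors to their known reference values.

```python
import numpy as np

# Known sRGB values of six patches on the printed calibration card
# (hypothetical values -- the actual chart is not specified).
reference = np.array([
    [255, 0, 0], [0, 255, 0], [0, 0, 255],
    [255, 255, 0], [128, 128, 128], [255, 255, 255],
], dtype=float)

# The same patches as seen by the phone camera, simulated here with
# per-channel gains mimicking a warm indoor lighting cast.
observed = reference * np.array([0.9, 0.8, 0.6])

# Least-squares 3x3 color-correction matrix: observed @ M ~= reference.
M, *_ = np.linalg.lstsq(observed, reference, rcond=None)

def correct(pixels):
    """Apply the calibration matrix to an (N, 3) array of RGB pixels."""
    return np.clip(pixels @ M, 0, 255)
```

Once fitted from the card’s patches, the same matrix is applied to the wound pixels, so images taken under different lighting end up uniformly colored for training.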
Using convolutional neural network (CNN) machine learning models, along with a technique called transfer learning, the software was able to successfully predict infection in C-section wounds within 10 days of childbirth. Women predicted to have an infection by the app are referred to a clinic, where they can receive diagnostic bacteriology and life-saving antibiotics can be prescribed as needed.
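The transfer-learning idea is to reuse a CNN backbone trained on a large image corpus and retrain only a small new head for the two-class infected/uninfected task. The team’s specific architecture is not stated here; this minimal PyTorch sketch uses a tiny stand-in backbone (pretrained weights omitted) purely to illustrate the freeze-and-replace pattern.

```python
import torch
import torch.nn as nn

# Stand-in feature extractor; in practice this would be a CNN backbone
# pretrained on a large image dataset.
backbone = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)

# Transfer learning: freeze the backbone's weights...
for p in backbone.parameters():
    p.requires_grad = False

# ...and train only a new two-class head (infected / not infected).
head = nn.Linear(8, 2)
model = nn.Sequential(backbone, head)

wound_batch = torch.randn(4, 3, 224, 224)  # dummy batch of wound photos
logits = model(wound_batch)                # shape: (4, 2)
probs = logits.softmax(dim=1)              # per-image class probabilities
```

Only `head.parameters()` would be passed to the optimizer, which is what makes training feasible on a dataset of roughly a thousand images.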
The app has been well received by women and community health workers in Rwanda.
“Women’s trust in community health workers, who are the main promoters of the app, means that the mHealth tool has been adopted by women in rural areas,” says Anne Nyigena of PIH.
Using thermal imaging to address algorithm bias
One of the biggest hurdles to bringing this AI-based technology to a wider global audience is algorithmic bias. When trained on a relatively homogeneous population, such as that of rural Rwanda, the algorithm performs as expected and can successfully predict infection. But when images of patients with different skin colors are introduced, the algorithm is less effective.
To tackle this problem, Fletcher turned to thermal imaging. Simple thermal camera modules, designed to attach to a cell phone, cost about $200 and can be used to capture infrared images of wounds. Algorithms can then be trained using the heat patterns in these infrared wound images to predict infection. In research published last year, the team showed more than 90 percent prediction accuracy when these thermal images were paired with the app’s CNN algorithm.
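Pairing the thermal images with the visible-light CNN suggests a two-stream fusion model. The published architecture is not detailed here; the sketch below shows one common pattern, late fusion, in which separate encoders for the RGB photo and the single-channel thermal image are concatenated before classification (all layer sizes are illustrative).

```python
import torch
import torch.nn as nn

class FusionClassifier(nn.Module):
    """Late-fusion sketch: separate encoders for the visible (RGB) photo
    and the single-channel thermal image, concatenated before the
    two-class infected/uninfected classifier."""
    def __init__(self):
        super().__init__()
        self.rgb_enc = nn.Sequential(
            nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.thermal_enc = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.classifier = nn.Linear(16, 2)

    def forward(self, rgb, thermal):
        feats = torch.cat(
            [self.rgb_enc(rgb), self.thermal_enc(thermal)], dim=1)
        return self.classifier(feats)

model = FusionClassifier()
logits = model(torch.randn(2, 3, 224, 224),   # dummy RGB photos
               torch.randn(2, 1, 224, 224))   # dummy thermal images
```

Because the thermal stream responds to heat rather than reflected light, it carries an infection signal that is largely independent of skin tone, which is what mitigates the bias.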
Although more expensive than simply using the phone’s camera, the thermal imaging approach could be used to extend the team’s mHealth technology to a more diverse global population.
“We’re giving health workers two options: in a homogeneous population, like rural Rwanda, they can use their standard phone camera with a model trained on local population data. Otherwise, they can use the more general model, which requires the thermal camera attachment,” says Fletcher.
While the current mobile app uses a cloud-based algorithm to run the infection-prediction model, the team is now working on a standalone mobile app that does not require internet access and addresses all aspects of maternal health, from pregnancy to postpartum.
In addition to developing the library of wound images used by the algorithm, Fletcher is working closely with former student Nakeshimana and his Insightiv team on the app’s development, using Android phones that are locally manufactured in Rwanda. PIH will then conduct user testing and field-based validation in Rwanda.
Privacy and data protection are top priorities as the team seeks to develop a comprehensive app for maternal health.
“As these tools are developed and tested, greater consideration should be given to patients’ data privacy. More data-security details should be incorporated to address the tool’s perceived vulnerabilities and strengthen users’ trust, which will ultimately benefit greater adoption,” says Nyigena.
Members of the award-winning team include: Bethany Hedt-Gauthier from Harvard Medical School; Richard Fletcher from MIT; Robert Riviello from Brigham and Women’s Hospital; Adeline Boatin from Massachusetts General Hospital; Anne Nyigena, Frederick Katera, Laban Bikorimana, and Vincent Kubaka from PIH in Rwanda; and Audace Nakeshimana ’20, founder of Insightiv.ai.