What If a Machine Could Read Your Brain Signals for Pain?

At some point or another, most of us have been asked to rate our pain on a scale of zero to 10 during a visit to the doctor or hospital. But what if a patient is unconscious or noncommunicative and unable to rate their own pain? What if there were a piece of technology capable of communicating a patient’s pain level for them by analyzing brain activity?
Researchers at MIT and elsewhere have developed just such a system. The portable device leverages an emerging neuroimaging technique called functional near-infrared spectroscopy (fNIRS), in which sensors placed around the head measure oxygenated-hemoglobin concentrations that indicate neuronal activity. The researchers describe the method, and how it could be used to quantify pain in patients, in a paper presented at the International Conference on Affective Computing and Intelligent Interaction.
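
The article does not spell out the signal-processing pipeline, but a standard first step in fNIRS is converting the raw optical measurements into hemoglobin-concentration changes via the modified Beer-Lambert law. The sketch below illustrates that conversion; the wavelengths, extinction coefficients, source-detector distance, and pathlength factor are illustrative assumptions, not values from the study.

```python
import numpy as np

# Minimal sketch: converting fNIRS optical-density changes at two wavelengths
# into changes in oxygenated (HbO) and deoxygenated (HbR) hemoglobin using the
# modified Beer-Lambert law. All numeric constants are illustrative
# placeholders, not values from the paper.

EXTINCTION = np.array([      # extinction coefficients [HbO, HbR] in 1/(mM*cm)
    [1.49, 3.84],            # ~760 nm: HbR absorbs more strongly
    [2.53, 1.80],            # ~850 nm: HbO absorbs more strongly
])
SOURCE_DETECTOR_CM = 3.0     # assumed source-detector separation
DPF = 6.0                    # assumed differential pathlength factor

def optical_density_to_hemoglobin(delta_od):
    """delta_od: (n_samples, 2) optical-density changes at the two wavelengths.
    Returns (n_samples, 2) concentration changes [dHbO, dHbR] in mM."""
    # Modified Beer-Lambert law: dOD = (E @ dC) * distance * DPF,
    # so solve the 2x2 linear system for dC at each sample.
    pathlength = SOURCE_DETECTOR_CM * DPF
    return np.linalg.solve(EXTINCTION, delta_od.T).T / pathlength

# Example with a couple of made-up optical-density samples
fake_od = np.array([[0.010, 0.012], [0.008, 0.015]])
print(optical_density_to_hemoglobin(fake_od))
```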

For their work, the researchers used only a few fNIRS sensors on a patient’s forehead to measure activity in the prefrontal cortex, which plays a major role in pain processing. Using the measured brain signals, the researchers developed personalized machine-learning models to detect patterns of oxygenated-hemoglobin levels associated with pain responses. When the sensors are in place, the models can detect whether a patient is experiencing pain with around 87% accuracy.
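
The paper’s actual model is not reproduced in the article. As a rough illustration of what a personalized pain classifier over windowed prefrontal HbO signals could look like, here is a sketch; the summary features, window shape, and logistic-regression classifier are all assumptions made for the example, not the authors’ method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Illustrative sketch of a *personalized* pain classifier: one model per
# participant, trained on simple summary statistics of windowed prefrontal
# HbO time series. Features and classifier are assumptions for illustration.

def window_features(hbo_windows):
    """hbo_windows: (n_windows, n_channels, n_samples) HbO time series.
    Returns a few summary statistics per channel as features."""
    mean = hbo_windows.mean(axis=2)
    std = hbo_windows.std(axis=2)
    slope = hbo_windows[:, :, -1] - hbo_windows[:, :, 0]
    return np.concatenate([mean, std, slope], axis=1)

def train_personal_model(hbo_windows, pain_labels):
    """Fit and evaluate a model for a single participant."""
    X = window_features(hbo_windows)
    y = pain_labels  # 1 = painful stimulus, 0 = no pain
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    return model, accuracy_score(y_te, model.predict(X_te))

# Example with synthetic data: 200 windows, 4 forehead channels, 100 samples each
rng = np.random.default_rng(0)
fake_windows = rng.normal(size=(200, 4, 100))
fake_labels = rng.integers(0, 2, size=200)
model, acc = train_personal_model(fake_windows, fake_labels)
print(f"held-out accuracy on synthetic data: {acc:.2f}")
```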

“The way we measure pain hasn’t changed over the years,” said Daniel Lopez-Martinez, a PhD student in the Harvard-MIT Program in Health Sciences and Technology and a researcher at the MIT Media Lab. “If we don’t have metrics for how much pain someone experiences, treating pain and running clinical trials becomes challenging. The motivation is to quantify pain in an objective manner that doesn’t require the cooperation of the patient, such as when a patient is unconscious during surgery.”

Traditionally, surgery patients receive anesthesia and medication based on their age, weight, previous diseases, and other factors. If they don’t move and their heart rate remains stable, they’re considered fine. But the brain may still be processing pain signals while they’re unconscious, which can lead to increased postoperative pain and long-term chronic pain. The system could provide surgeons with real-time information about an unconscious patient’s pain levels, so they can adjust anesthesia and medication dosages accordingly to stop those pain signals, the researchers noted.

Traditional fNIRS systems place sensors all around a patient’s head, which takes too long to set up and isn’t feasible for patients undergoing surgery, the researchers say. That’s why they adapted the fNIRS system to measure signals only from the prefrontal cortex. They further explained that even though pain processing involves information from multiple regions of the brain, studies have shown that the prefrontal cortex integrates all of that information, so it is sufficient to focus on the forehead.

Another problem with traditional fNIRS systems is they capture some signals from the skull and skin that contribute to noise. To fix that, the researchers installed additional sensors to capture and filter out those signals.
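
The article doesn’t say how those extra signals are filtered out. A common fNIRS approach is short-separation regression, in which a shallow channel that mostly sees skin and skull is regressed out of the deeper channel that also reaches the cortex; the sketch below shows that general idea, not necessarily the exact filtering used in this study.

```python
import numpy as np

# Sketch of short-separation regression, a common fNIRS technique for removing
# scalp/skin contributions: the best-fit scaled copy of the shallow "short"
# channel is subtracted from the "long" channel via ordinary least squares.

def regress_out_short_channel(long_signal, short_signal):
    """Both inputs: 1-D arrays of the same length (one fNIRS channel each).
    Returns the long-channel signal with the short-channel component removed."""
    short = short_signal - short_signal.mean()
    long_ = long_signal - long_signal.mean()
    beta = np.dot(short, long_) / np.dot(short, short)
    return long_ - beta * short

# Example: a cortical signal contaminated by a systemic/skin artifact
t = np.linspace(0, 60, 600)
artifact = 0.5 * np.sin(2 * np.pi * 0.25 * t)   # respiration-like noise
cortical = 0.2 * np.sin(2 * np.pi * 0.05 * t)   # slow hemodynamic response
long_channel = cortical + artifact
short_channel = artifact + 0.01 * np.random.randn(t.size)
cleaned = regress_out_short_channel(long_channel, short_channel)
```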

On the machine-learning side, the team trained and tested a model on a labeled pain-processing dataset they collected from 43 male participants. Going forward, the researchers plan to collect more data from diverse patient populations, including female patients (both during surgery and while conscious, and at a range of pain intensities), in order to better evaluate the accuracy of the system.

The paper describes in more detail how the researchers trained and tested the system. In addition to Lopez-Martinez, the team included: Ke Peng of Harvard Medical School, Boston Children’s Hospital, and the CHUM Research Centre in Montreal; Arielle Lee and David Borsook, both of Harvard Medical School, Boston Children’s Hospital, and Massachusetts General Hospital; and Rosalind Picard, a professor of media arts and sciences and director of affective computing research in the MIT Media Lab.

SOURCE: MDDI
