UK report reveals bias within medical tools and devices

Ethnic minorities, women and people from disadvantaged communities are at risk of receiving poorer healthcare due to bias within medical tools and devices, a new report has found.

Among other findings, ‘Equity in Medical Devices: An Independent Review’ raises concerns about devices that use artificial intelligence (AI) or measure oxygen levels.

The team responsible for the review said urgent action was needed.

Professor Frank Key, director of the Centre for Public Health at Queen’s University Belfast and co-author of the review, said: “We take an unbiased view of the entire medical device lifecycle, from initial testing, to patient recruitment in the hospital or community, to early-stage research participation, and post-approval deployment in the field.”

Deputy Health Minister Andrew Stevenson said: “Ensuring our health system works for everyone, regardless of ethnicity, is paramount to our values as a nation. This will support our wider work to create a simpler NHS.”

The government-commissioned review was set up by Sajid Javid in 2022, when he was health secretary, after concerns were raised about the accuracy of pulse oximeter readings in black and ethnic minority populations.

The widely used device gained attention due to its importance in medicine during the COVID-19 pandemic, where low oxygen levels are a key sign of serious illness.

The new report confirms concerns that pulse oximeters overestimate the amount of oxygen in the blood of dark-skinned people. It found no evidence that this has affected NHS care, but in the US such biases have led to delays in care for Black patients, who experienced later diagnosis and treatment, worsening organ function and even death.

The researchers stress that they are not asking people to avoid using the device. Instead, the review proposes a number of measures to improve the use of pulse oximeters in people of different skin tones, and offers advice on how to develop and test new devices to ensure they work for patients of all ethnicities.

The report also highlights concerns about AI-based devices, saying such technologies could exacerbate the underdiagnosis of heart disease in women, cause discrimination based on patients’ socio-economic status, and lead to the underdiagnosis of skin cancer in people with dark skin. The authors say the latter concern arises because AI devices are trained primarily on images of lighter skin tones.

The report also pointed to problems with polygenic risk scores, which are often used to provide a measure of an individual’s genetic risk of disease.

Enitan Carroll, professor of paediatric infectious diseases at the University of Liverpool and co-author of the review, said: “The main genetic datasets used by polygenic risk scores are overwhelmingly from people of European descent, meaning they may not be applicable to people of other ancestries.”

However, attempts to correct bias can themselves be problematic. Among the examples highlighted in the report are race-based corrections applied to readings from spirometers, devices used to assess lung function and diagnose respiratory conditions, which have been found to introduce bias.

Professor Habib Naqvi, chief executive of the NHS Race and Health Observatory, welcomed the findings and called for immediate fixes, fairness reviews and tougher guidance on pulse oximeters and other medical devices. He added that the review recognised the need for regulation.

“Access to better health should not be determined by ethnicity or skin color. Therefore, medical devices must be fit for purpose for all communities,” he said.

“It is clear that a lack of diverse representation in health research, a lack of robust equity considerations, and a dearth of co-production approaches contribute to racial bias in medical devices, clinical evaluations, and other medical interventions.”
