Review calls for action on bias with medical devices

News

Healthcare systems need to address the impact of racial and ethnic biases in the design and use of medical devices, says a new independent review conducted in the UK.

The review was commissioned by the government to investigate concerns that pulse oximeters, which estimate the level of oxygen in the blood and are widely used in the NHS, may be less accurate for patients with darker skin tones and could delay referral for more intensive care.

It found “extensive evidence” that pulse oximeters tend to overestimate oxygen levels in people with darker skin, a concern that first came to the fore during the COVID-19 crisis and may have contributed to worse outcomes for Black patients than for White patients.

The review also uncovered other examples of medical device bias, including that artificial intelligence algorithms used to support diagnosis from medical images may be unwittingly compromised by reliance on “standard patient” data – which for the NHS means “typically White, male, relatively affluent, and born in the UK.”

It found evidence that AI tools used to diagnose skin disorders and skin cancer are less accurate when used on people with darker skin tones.

Meanwhile, the review also explored the use of polygenic risk scores (PRS), which combine the results of genomic testing to estimate an individual’s risk of disease and are sold directly to the public, but have not yet been adopted by the NHS.

Here too, the review found substantial evidence that these tests are biased against groups with non-European genetic ancestry, creating a risk that results will be misinterpreted “by the public and health professionals alike,” according to the reviewers.
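To see why such scores inherit ancestry bias, it helps to know that a polygenic risk score is typically just a weighted sum: each genetic variant's risk-allele count is multiplied by an effect weight estimated from a genome-wide association study. The sketch below illustrates this with hypothetical variant names and weights (not drawn from any real study); because the weights are usually estimated in predominantly European cohorts, applying them to people with other genetic ancestries is where the accuracy gap arises.

```python
# Minimal sketch of a polygenic risk score (PRS) calculation.
# Variant IDs and effect weights here are hypothetical illustrations,
# not values from any real genome-wide association study.

def polygenic_risk_score(genotypes, effect_sizes):
    """Sum each variant's risk-allele count (0, 1, or 2) multiplied by
    its study-derived effect weight; variants with no weight are skipped."""
    return sum(
        count * effect_sizes[variant]
        for variant, count in genotypes.items()
        if variant in effect_sizes
    )

# Hypothetical effect weights, typically estimated in a largely
# European-ancestry cohort -- the source of the bias described above.
effect_sizes = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}

# One individual's risk-allele counts at those variants.
genotypes = {"rs0001": 2, "rs0002": 1, "rs0003": 0}

print(polygenic_risk_score(genotypes, effect_sizes))
```

The score itself is simple arithmetic; the bias enters through the effect weights, which transfer poorly across ancestry groups whose variant frequencies and correlation structure differ from the study population.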

“Our review reveals how existing biases and injustices in society can unwittingly be incorporated at every stage of the lifecycle of AI-enabled medical devices, and then magnified in algorithm development and machine learning,” said Professor Dame Margaret Whitehead, who chaired the review.

The Department of Health and Social Care (DHSC) said it accepted the findings of the review and has implemented an action plan, including ensuring that pulse oximeter devices used in the NHS can be used safely across a range of skin tones, and moves to remove racial bias from data sets used in clinical studies.

It added that the Medicines and Healthcare products Regulatory Agency (MHRA) now requests that approval applications for new medical devices describe how they will address bias, while the National Institute for Health Research (NIHR) is currently accepting funding applications for research into smarter oximeter devices.

“This important review reinforces what experts have long known about the role of health data poverty and resulting biased AI in worsening underlying systemic health inequities,” commented Dr Sara Khalid, associate professor of health informatics and biomedical data science at the University of Oxford.

“It will be important to monitor if and how these practical recommendations influence real clinical practice.”

Image by Freepik