Are pulse oximeters biased? Behind racial bias in health technology

A recent UK government-commissioned report found racial and ethnic bias in pulse oximeters — one of the most widely used devices to check blood oxygen levels during the Covid-19 pandemic.

Readings on pulse oximeters are influenced by a number of factors, including skin pigmentation, tattoos and henna. (Pixabay)

With the widespread use of pulse oximeters to measure blood oxygen levels during the Covid-19 pandemic, many raised concerns about their accuracy, especially with regard to potential inherent biases against racial and ethnic minorities.

As a result, Sajid Javid, the UK’s then Secretary of State for Health and Social Care, commissioned a study to investigate these claims in 2022. After 15 months of research, discussions with developers, regulators, healthcare workers and patients, and a review of evidence submitted by the public, a panel headed by Professor Dame Margaret Whitehead of the University of Liverpool submitted its report to the government in June 2023.

‘The Equity in Medical Devices: Independent Review’, published on March 11, examined three types of medical devices: 1) optical medical devices, including pulse oximeters; 2) artificial intelligence (AI)-assisted medical devices; and 3) polygenic risk scores (PRS), which estimate a person’s genetic predisposition to particular diseases.


Notably, the study found “considerable potential for racial bias that may lead to patient harm if it is not identified and addressed”.

What lies behind this bias?

According to the study, the overarching reason for the bias is the modelling, development and testing of devices on a “standard patient” who, in this case, was “typically White, male, relatively affluent and born in the UK”.

For optical medical devices like oximeters, which rely on light-based measurements taken through a patient’s skin, the report states that “the technology can be affected by several factors, including skin pigmentation, nail polish, motion, poor peripheral perfusion, fake tan, henna, tattoos, and sickle cell disease.”
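Pulse oximeters estimate blood oxygen saturation (SpO2) by shining red and infrared light through the skin and converting the ratio of absorbed light into a percentage using an empirically fitted calibration curve. The snippet below is a minimal, hypothetical sketch of that idea, not the review’s analysis or any manufacturer’s algorithm: the linear calibration formula and the “pigmentation offset” are invented constants, meant only to show how a curve fitted mostly to lighter-skinned volunteers could report a higher SpO2 than the true value for a darker-skinned patient.

```python
# Simplified, hypothetical illustration of pulse-oximeter calibration bias.
# The constants are invented for demonstration; real devices use calibration
# curves fitted to volunteer data, which the review found were historically
# dominated by light-skinned participants.

def spo2_from_ratio(r: float) -> float:
    """Map the red/infrared absorbance ratio R to an SpO2 percentage
    using a textbook-style linear approximation (illustrative only)."""
    return 110.0 - 25.0 * r

def measured_ratio(true_ratio: float, pigmentation_offset: float) -> float:
    """Stand-in for extra light absorption by melanin shifting the
    measured ratio; this is not a physiological model."""
    return true_ratio + pigmentation_offset

true_r = 0.8  # corresponds to roughly 90% SpO2 on this illustrative curve

for label, offset in [("light skin (calibration norm)", 0.0),
                      ("darker skin (not represented)", -0.08)]:
    r = measured_ratio(true_r, offset)
    print(f"{label}: reported SpO2 = {spo2_from_ratio(r):.1f}%")

# A shift in the measured ratio makes the device report a higher SpO2 than
# the true value, consistent with low oxygen levels being missed more often
# in Black patients, as described later in the article.
```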

Furthermore, the study points to multiple examples of incorrect or delayed diagnoses for women, racial minorities and people from deprived communities due to biased data from testing instruments. For instance, oximeters were tested only on white skin with those readings taken as “the norm” — leading to incorrect oxygen readings for Black people and other minorities.


Even for AI-assisted devices, the issue is similar: a lack of diverse, representative data sets when designing measurement scales. Moreover, ignoring the regional, socio-cultural and economic factors behind health data further reduces the chances of accurate interpretation.

“The data are often indirect measures, reflecting the patient’s interactions with the healthcare system as well as their health status… So analysing health records requires an awareness of the context in which they were generated. Without the context, data from health records are unsuitable for many research questions,” the study said.
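The representation problem can be made concrete with a generic, made-up example: a single aggregate accuracy figure for an AI-assisted tool can look healthy while hiding much worse performance on an under-represented group. Nothing below comes from the review; the groups and numbers are invented purely to show why subgroup reporting matters.

```python
# Hypothetical illustration: an aggregate accuracy figure can hide large
# differences between subgroups. All records here are invented.

records = [
    # (group, model_was_correct)
    ("group_A", True), ("group_A", True), ("group_A", True), ("group_A", True),
    ("group_A", True), ("group_A", True), ("group_A", True), ("group_A", False),
    ("group_B", True), ("group_B", False),
]

def accuracy(rows):
    """Fraction of rows where the model was correct."""
    return sum(ok for _, ok in rows) / len(rows)

print(f"overall accuracy: {accuracy(records):.0%}")
for group in ("group_A", "group_B"):
    subset = [r for r in records if r[0] == group]
    print(f"{group}: accuracy {accuracy(subset):.0%} on {len(subset)} records")

# With only 2 of 10 records from group_B, the overall 80% figure is driven by
# group_A; group_B's 50% accuracy stays invisible unless it is reported
# separately, which is the gap in representation the review highlights.
```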

For polygenic risk scores, Enitan Carrol, co-author of the review and professor of paediatric infection at the University of Liverpool, said: “Major genetic datasets that polygenic risk scores use are overwhelmingly on people of European ancestry, which means that they may not be applicable to people of other ancestries.”

Is this bias in technology new?

No, discrimination in technological devices and programs is not new. The fact that the performance of optical medical devices is worse for those with darker skin has been known since 1992.


But the true consequences of this difference for health and healthcare have only surfaced recently. The study found evidence of harm stemming from the poorer performance of health technology in the US healthcare system, “where there is a strong association between racial bias in the performance of the pulse oximeters and delayed recognition of disease, denied or delayed treatment, worse organ function and death in Black compared with White patients.”

Even in non-medical cases, from facial recognition software that misidentifies people with darker skin tones more often, to no-touch faucets and soap dispensers that do not recognise non-white hands and palms, there is plenty of evidence to suggest that racism has seeped into technology.

What are stakeholders doing about it?

The UK government has accepted all 18 recommendations, 51 sub-recommendations and three further calls to action laid out in the report. The recommendations largely urge the prioritisation of more diverse data in terms of different skin tones and genetics, lifestyles, and other cultural factors.

The report also asks developers to maintain transparency on the limitations of their devices; regulators to continually monitor the deployment of such tools; and the NHS and other healthcare units to comply with these regulations and routinely seek feedback from the public, experts and healthcare providers to plug the gaps.
