
Report tackles bias in medical electronics

Technology News | By Nick Flaherty



An independent review in the UK has highlighted areas of bias in medical electronics, particularly in AI.

The independent review was set up by the Secretary of State for Health and Social Care in the UK to look at the extent and impact of potential racial, ethnic and other factors leading to unfair biases in the design and use of medical devices, and to make recommendations for improvements.

The review focussed on ‘optical’ medical devices, such as pulse oximeters and those assisted by artificial intelligence (AI).

The initial stimulus for this review was growing concern about the pulse oximeter, which estimates the level of oxygen in the blood and is in common use throughout the NHS in the UK. The COVID-19 pandemic highlighted that the pulse oximeter may not be as accurate for patients with darker skin tones as for those with light skin tones.
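To see why calibration matters here, the sketch below illustrates the standard 'ratio of ratios' principle that pulse oximetry is built on. It is a minimal Python sketch, not any vendor's implementation: the coefficients a and b are hypothetical stand-ins for the empirical calibration curves that manufacturers derive from volunteer studies, and calibration cohorts that under-represent darker skin tones are one proposed mechanism for the bias the review describes.

# Minimal sketch of the "ratio of ratios" principle behind pulse oximetry.
# A pulse oximeter shines red (~660 nm) and infrared (~940 nm) light through
# tissue and compares the pulsatile (AC) and steady (DC) absorption at each
# wavelength. The linear coefficients below are hypothetical: real devices
# use empirical calibration curves derived from volunteer studies.

def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir, a=110.0, b=25.0):
    """Estimate SpO2 (%) from red/infrared photoplethysmography signals.

    R is the ratio of the normalised red and infrared pulsatile signals;
    SpO2 = a - b*R is a common first-order approximation.
    """
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    return a - b * r

# Example: R of 0.6 maps to roughly 95% with these illustrative coefficients
print(spo2_estimate(ac_red=0.012, dc_red=1.0, ac_ir=0.020, dc_ir=1.0))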

This mattered because an inaccurate reading could lead to harm by delaying the identification of dangerously low oxygen levels in patients with darker skin tones, levels that would normally trigger referral for more intensive care.

The review found extensive evidence of poorer performance of pulse oximeters for patients with darker skin tones. These devices over-estimate true oxygen levels in people with darker skin tones to a greater extent than with lighter skin.
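The consequence of such over-estimation can be sketched numerically. The Python snippet below simulates 'occult hypoxemia': a device reading that stays above a typical escalation threshold while true saturation is dangerously low. All numbers, including the +2 percentage-point bias and the 92%/88% cutoffs, are illustrative assumptions, not figures from the review.

import numpy as np

rng = np.random.default_rng(0)

# True arterial oxygen saturation for a simulated patient group, centred
# near the hypoxemia threshold.
true_sao2 = rng.normal(loc=90.0, scale=3.0, size=10_000)

# Hypothetical reading models: both add measurement noise, but the second
# also adds a systematic positive bias, standing in for the over-estimation
# the review reports for darker skin tones. The +2% figure is illustrative.
reading_unbiased = true_sao2 + rng.normal(0.0, 1.5, size=true_sao2.size)
reading_biased = true_sao2 + 2.0 + rng.normal(0.0, 1.5, size=true_sao2.size)

# "Occult hypoxemia": the device reads >= 92% while true saturation is < 88%,
# so the low reading that would normally trigger escalation never appears.
def occult_rate(reading, truth, device_cutoff=92.0, true_cutoff=88.0):
    hypoxemic = truth < true_cutoff
    return np.mean(reading[hypoxemic] >= device_cutoff)

print(f"missed hypoxemia, unbiased device: {occult_rate(reading_unbiased, true_sao2):.1%}")
print(f"missed hypoxemia, biased device:   {occult_rate(reading_biased, true_sao2):.1%}")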

Evidence of harm stemming from this poorer performance has been found in the US healthcare system, where there is a strong association between racial bias in the performance of pulse oximeters and delayed recognition of disease, denied or delayed treatment, worse organ function and death in Black compared with White patients. The review did not find any evidence from studies in the NHS of this differential performance affecting care, but the potential for harm is clearly present.

The recommendations start with immediate mitigation measures in the NHS to ensure existing pulse oximeters can perform to a high standard for all patient groups to avoid inequities in health outcomes. The report goes on to recommend actions to prevent potential bias in further optical devices in the longer term.

AI has become incorporated into every aspect of healthcare, from prevention and screening through to diagnostics and clinical decision-making, such as when to step up intensity of care.

Existing biases and injustices in society can unwittingly be incorporated at every stage of the lifecycle of AI-enabled medical devices, and then magnified in algorithm development and machine learning, says the report.
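As a concrete, if simplified, illustration of how that magnification can happen, the Python sketch below simulates a screening model trained mostly on one group: a single decision threshold tuned on the pooled data yields a much higher false negative rate for the under-represented group. The data, group sizes and score separations are entirely synthetic.

import numpy as np

rng = np.random.default_rng(1)

# Simulate a screening model's scores. The minority group is assumed to be
# under-represented in training, so its scores are less well separated.
def simulate_group(n, separation):
    labels = rng.integers(0, 2, size=n)                 # 1 = disease present
    scores = labels * separation + rng.normal(0, 1, n)  # model output
    return labels, scores

maj_labels, maj_scores = simulate_group(n=9_000, separation=2.0)
min_labels, min_scores = simulate_group(n=1_000, separation=1.0)

# A single decision threshold tuned on the pooled, majority-dominated data
threshold = 1.0

def false_negative_rate(labels, scores, threshold):
    diseased = labels == 1
    return np.mean(scores[diseased] < threshold)

print(f"FNR majority group: {false_negative_rate(maj_labels, maj_scores, threshold):.1%}")
print(f"FNR minority group: {false_negative_rate(min_labels, min_scores, threshold):.1%}")
# Auditing error rates per group, rather than in aggregate, is one way the
# kind of lifecycle bias the report describes can be caught before deployment.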

The problem is especially pressing in the medical field, as the use of AI-enabled medical devices is now widespread and built-in bias may lead to poorer healthcare for the affected population groups. Seven of the recommendations are focused on actions to enable the development of bias-free AI devices.

The advent of large language and foundation models (such as ChatGPT) brings heightened concerns about the potential for these latest developments in AI to disrupt clinical and public health practice in unpredictable ways.

The report calls for government action to initiate the thinking and planning that will be needed to face this inevitable disruption and potential unintended consequences arising from the AI revolution in healthcare.

www.gov.uk

 
