AI in the medical devices sector: Fostering innovation while navigating evolving regulations


Among all the changes happening in the healthcare landscape in the last few years, the advent of digital health technology stands out as one poised to fuel innovation and revolutionise healthcare delivery.

In fact, artificial intelligence (AI) and machine learning (ML) have the potential to revolutionise several areas of healthcare, from diagnosis and treatment to patient monitoring and management. Not to mention underfunded areas of medical research, where these technologies could prove life changing by helping detect comorbidities, or environmental or genetic factors that place some individuals at higher risk of disease.

AI/ML tools for data analysis and acceleration

While concerns around potential harm and bias are weighed when evaluating the scenarios in which AI/ML can be employed, and with what degree of human oversight, more and more AI/ML tools are being developed in the healthcare sector to manage vast amounts of data and interpret it quickly and accurately. Whether the data is text, video, or imagery, AI/ML can save hours of manual analysis and cross-checking and suggest interpretations that would otherwise take human reviewers years to complete.

As the industry embraces these advancements, regulatory bodies everywhere are grappling with the complexities of evaluating and approving medical devices that do not conform to traditional paradigms and do not have a physical presence in the traditional sense.

Shifting from traditional regulatory paradigms

Regulatory bodies in the US, UK, and EU are making progress in defining and regulating AI/ML-enabled medical devices, with a focus on ensuring safety and efficacy. This evolving regulatory landscape, with new regulations like the EU AI Act, inevitably poses challenges, especially as digital health solutions can blur the lines between medical devices and non-medical tools.

In fact, differentiating between digital health and digital medical devices is not always simple, as digital health encompasses a spectrum of technologies, ranging from non-medical devices designed to monitor well-being through to medical devices tailored for specific medical purposes. The International Medical Device Regulators Forum (IMDRF) gives a consensus, but not exhaustive, definition for Software as a Medical Device (SaMD) as "software intended to be used for one or more medical purposes that perform these purposes without being part of a hardware medical device."

Additionally, in its June 2023 Roadmap, the MHRA confirmed it is developing guidance to help clearly identify SaMDs and differentiate them from the wide and often confusing range of other tools, such as wellbeing and lifestyle software products, IVD software, and companion diagnostics.

This year, the EU published the final text of the AI Act, which will regulate AI systems across multiple industries, including medical devices. It sets out one of the core objectives for healthcare AI regulation: “in the health sector where the stakes for life and health are particularly high, increasingly sophisticated diagnostics systems and systems supporting human decisions should be reliable and accurate. The extent of the adverse impact caused by the AI system on the fundamental rights protected by the Charter is of particular relevance when classifying an AI system as high-risk.”

As a result of the AI Act’s broad definitions of high-risk AI uses, certain digital health products that are not considered medical devices may come under CE marking requirements for the first time, through the Act’s regulatory assessment of high-risk AI systems. These products will therefore require notified body assessment and CE marking as AI systems, based on criteria that differ from the existing requirements for medical devices.

Innovation and regulatory sandboxes: An evolving landscape

In the US, the FDA has significant experience of successfully regulating AI/ML-enabled devices and has compiled a publicly available list of AI-enabled tools with FDA marketing clearance.

The FDA has also recognised that some AI/ML systems need to be adaptively re-trained on new or context-specific data, and has introduced processes that allow certain pre-authorised software changes to be agreed between the manufacturer and the FDA and then deployed without further regulatory assessment. This is a significant milestone in regulatory innovation, as the traditional assessment methods still used in EU medical device evaluations can struggle to deliver similarly effective and proportionate regulation.

In the UK, a new regulatory sandbox, piloted from July 2024, is designed to provide a safe space for developers of healthcare AI tools to trial innovative products in view of regulators before they are implemented. Alongside the FDA and Health Canada, the MHRA has outlined 10 guiding principles to inform the development of Good Machine Learning Practice (GMLP) and promote AI/ML-enabled medical devices that are safe, effective, and high quality.

In 2023, the MHRA also updated the Software and AI as a Medical Device Change Programme, to ensure future regulatory requirements for software and AI are clear and patients are protected. The Change Programme specifically builds on the intention to make the UK a globally recognised home of responsible innovation for medical device software, by achieving safety assurances, defining clear guidance and processes for manufacturers and liaising with key partners, such as the National Institute for Health and Care Excellence (NICE) and NHS England, but also with international regulators through the International Medical Device Regulators Forum (IMDRF).

The digital health landscape is still evolving and, given the heightened complexity of AI-based medical devices and digital health products, collaborating with regulatory experts and embracing multidisciplinary approaches is crucial for navigating these challenges, ensuring compliance with evolving standards and requirements, and facilitating market entry.

Timothy Bubb