Regulations for AI/ML-enabled Medical Devices in US and EU
In the ever-evolving landscape of healthcare technology, the regulation of AI/ML-based medical devices is a critical concern. This blog post compares US and European policy regarding these innovative healthcare solutions, exploring the approval processes, transparency, and safety aspects that govern these devices on both sides of the Atlantic.
AI/ML-based Medical Devices: A Transatlantic Perspective
A cohort of 124 AI/ML-based medical devices revealed an interesting trend: many of these devices obtained both the CE mark in Europe and FDA clearance in the USA during the search period (2015-2020). These devices constituted a significant portion of the cohort, with 56% gaining FDA approval and 52% receiving the CE mark. Notably, larger companies appeared more successful than their smaller counterparts in obtaining both FDA approval and a CE mark.
Among the devices that secured both FDA clearance and a CE mark, 80 were initially CE marked in Europe before obtaining authorization in the USA. This aligns with our experience, suggesting that the EU evaluation of medical devices may be less rigorous and more decentralized than the FDA's approach. An FDA report even highlighted 12 devices that were exclusively approved in Europe but later deemed unsafe or ineffective. However, it's crucial to emphasize that further analyses are needed before drawing evidence-based conclusions.
Challenges in Assessing European Regulation
Despite this trend, evidence regarding the safety and performance of the European regulatory approach, particularly concerning AI/ML-based medical devices, remains scarce. Several factors contribute to the difficulty in studying CE-marked medical devices in Europe:
- Lack of Public Register: Unlike the USA, Europe lacks a publicly available register of approved devices.
- Confidentiality: The confidentiality of information submitted to Notified Bodies and regulators can hinder research efforts.
- Decentralized Pathway: CE-marking decisions follow a decentralized pathway, further complicating assessments.
However, there is hope on the horizon. In 2022, the European Commission introduced Eudamed2, a more comprehensive database. Although not entirely public, this repository offers some accessible information, including certificates of conformity and summaries of safety and clinical performance. This database is a significant step toward better understanding and evaluating the opportunities and risks of medical devices in Europe.
The Call for Transparency
In both the USA and Europe, transparency is vital to ensuring the safety and quality of AI/ML-based medical devices. Current summaries and statements of approved medical devices often lack sufficient information, making it challenging to identify AI/ML-based devices and to understand their capabilities fully.
For instance, consider the device SnoreSounds. While the FDA describes it as "software for snoring evaluation," a more detailed description identifies it as software that employs neural network algorithms to analyze sleep breathing sounds related to airway collapse during apnea. Increased transparency from regulatory agencies, such as the FDA, regarding the AI/ML nature of medical devices can lead to improvements in clinical practice and provide clarity on device limitations.
Transparency benefits all stakeholders, including agencies, regulators, researchers, and manufacturers, by fostering a culture of learning from device successes and failures. It aligns with the ethical responsibility of regulators and manufacturers to prioritize patient safety and healthcare efficacy.
Navigating the Challenges
Detecting approved and CE-marked AI/ML-based medical devices presents challenges, primarily due to the lack of transparency and the inconsistent use of terminology by manufacturers and news sources. Overcoming these challenges will be crucial in advancing the regulation and understanding of these innovative healthcare technologies.
The information cited in this post is based on the following publication:
- Lancet Digit Health 2021; 3: e195–203. DOI: https://doi.org/10.1016/S2589-7500(20)30292-2
Disclaimer - This post is intended for informational purposes only and does not constitute legal information or advice. The materials are provided in consultation with US federal law and may not encompass state or local law.