
Trust in Medical AI: New European Regulations Reshape Healthcare Innovation

With a deadline of August 2027, manufacturers must navigate complex certification procedures and safety requirements to maintain market access. Learn how these new regulations will impact AI-powered medical devices and reshape the future of healthcare innovation.

MedtecLIVE
Nürnberg, Germany

The healthcare industry stands at a critical juncture as new European regulations reshape how artificial intelligence (AI) can be deployed in medical applications. With the enforcement of Regulation (EU) 2024/1689, medical device manufacturers face stringent requirements to ensure AI systems are safe, reliable, and trustworthy. The stakes are particularly high given that medical AI applications are classified as high-risk systems, requiring careful oversight and validation.

The New Regulatory Landscape: Understanding the AI Act Timeline

The European AI Act introduces a comprehensive framework with specific deadlines that healthcare organizations must meet. According to Dr. Andreas Schwab, Global Head Medical Software at TÜV Rheinland, the most critical date for medical device manufacturers is August 2, 2027. By this deadline, all AI-powered medical devices must have completed a conformity assessment procedure with a notified body to maintain market access.

The regulation has already begun to take effect: the ban on prohibited AI practices under Art. 5 AIA has applied since February 2, 2025. This creates urgency for manufacturers to review their existing AI systems and ensure compliance with the new regulation. The implementation timeline is particularly challenging in some member states, such as Germany, where political transitions have complicated the establishment of competent authorities.

Manufacturers of medical AI systems are advised to begin their conformity assessment procedures early, given the potential for bottlenecks as deadlines approach. The regulation's scope encompasses all medical devices incorporating AI, regardless of their classification under existing medical device regulations.

Risk Classification and Safety Requirements

Under the AI Act, most medical AI systems are automatically categorized as high-risk, requiring stringent oversight. This classification applies even to devices that might be considered lower risk under the Medical Device Regulation (MDR). A key focus is on ensuring that AI systems' training data accurately represents their intended use and patient population.
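
To make this concrete, the following Python sketch shows one way a manufacturer might screen a training set for representativeness against the intended patient population. The age bands, intended-use shares, and tolerance threshold are illustrative assumptions, not values prescribed by the AI Act.

```python
from collections import Counter

# Hypothetical intended-use profile: share of patients per age band, e.g.
# taken from the device's intended-purpose statement. Values are illustrative.
INTENDED_POPULATION = {"18-40": 0.30, "41-65": 0.45, "66+": 0.25}
TOLERANCE = 0.10  # maximum tolerated absolute deviation per subgroup

def check_representativeness(training_ages: list[str]) -> list[str]:
    """Flag age bands whose share in the training data deviates from the
    intended patient population by more than TOLERANCE."""
    counts = Counter(training_ages)
    total = sum(counts.values())
    findings = []
    for group, expected in INTENDED_POPULATION.items():
        observed = counts.get(group, 0) / total
        if abs(observed - expected) > TOLERANCE:
            findings.append(f"{group}: {observed:.0%} of training data "
                            f"vs. {expected:.0%} of intended population")
    return findings

# A data set skewed toward younger patients triggers a finding per age band.
sample = ["18-40"] * 600 + ["41-65"] * 300 + ["66+"] * 100
for finding in check_representativeness(sample):
    print("Representativeness gap:", finding)
```

In practice the same comparison would cover every relevant stratification variable (sex, comorbidities, imaging equipment, and so on), with the findings feeding into the data governance records discussed below.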

The regulations address critical safety concerns, particularly regarding AI bias and reliability. Dr. Schwab points out that an AI system trained on insufficient data can deliver potentially dangerous results. In one case, a system generated artificial artifacts that could lead to misdiagnosis, demonstrating why robust validation is essential.

Key safety requirements include:

  • Verification that training data matches intended patient populations
  • Validation of AI system outputs against clinical requirements (see the sketch after this list)
  • Documentation of system limitations and potential risks
  • Regular monitoring and assessment of performance
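
One way to turn the second item in the list above into an executable gate is to check model outputs on a locked clinical test set against pre-specified acceptance criteria. This minimal sketch assumes binary classification; the sensitivity and specificity thresholds are illustrative and would in practice come from the device's clinical evaluation.

```python
def validate_outputs(y_true: list[int], y_pred: list[int],
                     min_sensitivity: float = 0.95,
                     min_specificity: float = 0.90) -> bool:
    """Return True only if performance on the locked clinical test set
    meets the pre-specified acceptance criteria."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    print(f"sensitivity={sensitivity:.3f}, specificity={specificity:.3f}")
    return sensitivity >= min_sensitivity and specificity >= min_specificity
```

A failed gate should block release and be recorded, which ties directly into the documentation requirements of the certification process discussed next.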

Navigating the Certification Process

The certification process integrates with existing quality management systems while adding AI-specific requirements. Organizations must comply with both the AI Act and relevant medical device regulations (MDR/IVDR), creating a comprehensive framework for safety and effectiveness.

Documentation requirements under Annex IV of the AI Act include detailed records of data governance, transparency measures, and human oversight protocols. The process also encompasses specific operational requirements outlined in Articles 10-15, covering aspects from data management to cybersecurity.
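
As a sketch of how such records might be organized internally, the following dataclass groups the Annex IV themes named above into one structure. The field names and the example device are hypothetical; they are not the official Annex IV layout.

```python
from dataclasses import dataclass, field

@dataclass
class AITechnicalDocumentation:
    """Internal record grouping data governance, transparency, and human
    oversight evidence for one AI-powered medical device."""
    system_name: str
    intended_purpose: str
    training_data_sources: list[str] = field(default_factory=list)
    data_governance_measures: list[str] = field(default_factory=list)
    transparency_measures: list[str] = field(default_factory=list)
    human_oversight_protocol: str = ""
    known_limitations: list[str] = field(default_factory=list)

doc = AITechnicalDocumentation(
    system_name="ExampleCAD",  # hypothetical device
    intended_purpose="Detection of lung nodules in chest CT",
    training_data_sources=["Hospital A, 2019-2023", "Public data set B"],
    data_governance_measures=["De-identification audit", "Label QA review"],
    transparency_measures=["AI use disclosed in IFU and user interface"],
    human_oversight_protocol="A radiologist confirms every AI finding",
    known_limitations=["Not validated for patients under 18"],
)
```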

Self-learning systems present unique challenges but can be certified if they operate within predetermined boundaries. Any learning or adaptation beyond these boundaries requires additional notification and approval from notified bodies through a significant change notification (SCN) process.
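
A minimal sketch of such a boundary check follows, assuming a performance envelope fixed during certification; the metric names and limits are illustrative. An adapted model ships automatically only while it stays inside the envelope, and anything outside is held back for the SCN process.

```python
# Pre-approved operating envelope, hypothetically fixed during certification.
APPROVED_ENVELOPE = {"auc": (0.92, 1.00), "false_positive_rate": (0.00, 0.05)}

def within_envelope(metrics: dict[str, float]) -> bool:
    """Check every monitored metric against its approved (low, high) range."""
    return all(lo <= metrics[name] <= hi
               for name, (lo, hi) in APPROVED_ENVELOPE.items())

def gate_update(metrics: dict[str, float]) -> str:
    """Decide whether an adapted model may ship or must be notified first."""
    return "deploy" if within_envelope(metrics) else "hold-for-SCN"

print(gate_update({"auc": 0.94, "false_positive_rate": 0.03}))  # deploy
print(gate_update({"auc": 0.89, "false_positive_rate": 0.03}))  # hold-for-SCN
```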

Implementation Strategies and Compliance Solutions

Organizations can prepare for compliance by following established guidelines and best practices. The IMDRF's Good Machine Learning Practice guideline, published on January 29, 2025, provides fundamental principles for training medical AI systems. Additionally, the questionnaire for artificial intelligence and medical devices, developed in collaboration with notified bodies, offers a practical framework for assessment preparation.

Transparency requirements apply to all AI systems, regardless of risk classification. While this doesn't necessarily mean systems must be fully explainable, organizations must clearly disclose the use of AI in their products. This requirement reflects a broader commitment to openness and trust in medical AI applications.

For practical implementation, organizations should focus on:

  • Documenting AI system specifications and limitations
  • Establishing robust validation procedures
  • Maintaining comprehensive training data records (see the record-keeping sketch after this list)
  • Preparing early for conformity assessments
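
For the record-keeping item above, one lightweight approach is to log every data-set version with a content hash, so the exact data behind any certified model version stays traceable. The JSON-lines log and field names in this sketch are illustrative assumptions.

```python
import datetime
import hashlib
import json

def record_dataset(path: str, description: str,
                   log_file: str = "data_log.jsonl") -> dict:
    """Append a hash-stamped record of a training data set to an audit log."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "path": path,
        "sha256": digest,
        "description": description,
    }
    with open(log_file, "a") as log:
        log.write(json.dumps(entry) + "\n")
    return entry

# Example with a hypothetical file name:
# record_dataset("ct_train_v3.tar", "Training set v3 for ExampleCAD")
```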

Conclusion

The European AI Act represents a significant shift in how medical AI systems are regulated and validated. While the requirements are demanding, they provide a clear framework for building trust in AI-powered healthcare solutions. Organizations that begin preparing now, following established guidelines and working with notified bodies, will be better positioned to meet the August 2027 deadline while maintaining their competitive edge in the evolving healthcare landscape.


Editorial notice:
This article is based on the corresponding presentation at MedtecLIVE Innovation Expo 2025 and was created with the support of AI. The supporting programme of MedtecLIVE 2026, which will take place from 5 to 7 May 2026 in Stuttgart, also offers numerous lectures. The trade fair brings together suppliers and providers from the development and production of medical technology, OEMs, distributors, and other players in the medical technology community.

Your contact person

Dr. Andreas Schwab

Global Head - Technical Competence Center Medical Software, TÜV Rheinland AG

