
EU AI Act: Rest of Act Now Applies (except Article 6(1)), Including for High-Risk AI Systems

August 2

Per Article 111(2), the AI Act applies to operators of high-risk AI systems as follows:

  • Systems placed on the market before the compliance date, and then significantly changed, must comply from the point of that change onward. If a high-risk AI system was placed on the market before the compliance date and undergoes no significant changes, its operators do not need to take any action.
  • Systems newly placed on the market on or after the compliance date must comply immediately.

 

Compliance requirements include:

  • Implement an AI quality management system (QMS) per Article 17 of the AI Act; this can be integrated with an ISO 13485 QMS
  • Conduct an additional conformity assessment under AI Act Title III, Chapter 4; this may be combined with the MDR notified body assessment
  • Prepare AI-specific documentation per Annex IV of the AI Act
  • Follow AI-specific lifecycle risk management (Article 9), distinct from ISO 14971
  • Provide information on AI system capabilities, limitations, and performance per Art. 13
  • Define and validate mechanisms for human oversight of the AI system (Article 14)
  • Ensure automatic logging of system events for traceability (Article 12)
  • Develop a post-market plan for monitoring system performance (Article 61)

 

Examples of high-risk AI systems include:

  • AI-Based Diagnostic Imaging Software
  • AI Triage Software for Emergency Departments
  • IVD Software Using AI to Predict Risk of Genetic Disorder

 

For a high-level overview of the high-risk AI requirements, see the European Commission's webpage on the AI Act; the Commission also publishes further information on the implementation deadlines.

Legislation: EU AI Act
Region: Europe