Bringing AI-enabled medical devices to market requires meeting overlapping FDA, EU MDR, and EU AI Act requirements. The EU AI Act classifies most medical AI as high-risk, adding expectations for conformity assessment, data governance, transparency, and human oversight alongside the MDR's clinical evaluation and post-market surveillance obligations. In the U.S., the FDA requires full traceability under design controls, risk management per ISO 14971, and documented verification and validation of software, algorithms, and data changes across the lifecycle.
The challenge is not understanding the regulations, but operationalizing them. This session explores how to align development and regulatory workflows around shared data, maintain traceability across requirements, risks, and evidence, and reduce manual effort across submissions so teams can move faster while staying audit and submission-ready.
Registration Fees & Deadlines
Free
Learning Objectives
- Understand how FDA, EU MDR, and the EU AI Act define and enforce requirements, and how AI adds complexity through transparency, human oversight, and data governance.
- Connect shared development artifacts to regulatory evidence, including MDR clinical evaluation and FDA design controls, without duplicating work.
- Automatically generate compliant documentation for FDA, MDR, and EU AI Act submissions from shared data to reduce manual effort and accelerate timelines.
Who Should Attend?
Quality and Regulatory teams responsible for multi-region submissions who want to reduce manual effort, improve consistency, and accelerate approvals across FDA, MDR, and EU AI Act requirements.
Quality leaders looking to scale their impact by moving from document-driven processes to connected, audit-ready systems that reduce errors and rework.
R&D, Regulatory, and cross-functional leaders involved in bringing products to market across multiple regions and aligning development with compliance requirements.
AI and digital innovation teams exploring how to bring AI-enabled devices to market while meeting evolving expectations for transparency, governance, and validation.
Audience Learning Level
Basic: Content is introductory in nature and requires no prior knowledge or experience to grasp the concepts and related exercises. Basic educational activities are meant to establish a foundation of knowledge and/or competence that will be expanded upon in practice or in higher-level activities.
Speakers
Jenn Dixon
Director of AI Quality & Regulatory Strategy, Ketryx
Jenn Dixon leads quality and regulatory client engagements at Ketryx, where she coaches and supports clients seeking to adopt best practices for medical device software. Dixon brings extensive expertise in quality assurance, regulatory affairs, and project management from previous roles in the MedTech industry, with a proven ability to develop and apply best practices that optimize both teams and processes.
Prior to Ketryx, she was director of QA/RA at Omniscient Neurotechnology, overseeing global regulatory submissions, maintaining ISO 13485 compliance, and managing cross-functional teams in areas such as post-market surveillance and Good Machine Learning Practices (GMLP). She also served as a design assurance specialist and technical project manager, where she excelled in risk management, audit readiness, and leading agile development teams to deliver innovative solutions.
Dixon began her career at Synaptive Medical, supporting ISO 13485 compliance transitions and leading UDI implementation. She holds a bachelor’s degree in engineering from the University of Toronto.
Kevin Brady
Client Operations Engineer, Ketryx
Proof of Attendance
A certificate of attendance can be downloaded from the RAPS Learning Portal following the event.
Questions
Contact the RAPS Support Center:
Call +1 301 770 2920, ext. 200 (8:30 am–5:30 pm EST, Monday–Friday) or email support@raps.org.