Annex VI of the EU AI Act establishes the internal conformity assessment procedure for high-risk AI systems.
Annex VI requires providers to verify compliance across three distinct workstreams: quality management system conformity with Article 17, technical documentation assessment against Articles 8 through 15, and verification that the design, development, and post-market monitoring processes are consistent with the technical documentation. Each workstream examines a different facet of the system's compliance posture, but together they form a complete internal conformity assessment.
The provider must demonstrate that the quality management system not only exists on paper but operates in practice. The technical documentation must accurately describe the system as built and deployed, with claims supported by traceable evidence. The development process itself must align with the documented methodology, and post-market monitoring must be producing the data described in the documentation. Conformity Assessment for High-Risk AI Systems covers the broader assessment framework within which these workstreams operate.
The QMS assessment verifies that every element required by Article 17 exists as documented policy, is version-controlled and accessible, and is operational with evidence of execution. Article 17 defines these elements as: a compliance strategy, design and development techniques and procedures, examination procedures prior to market placement, quality control procedures, data management procedures, record-keeping obligations, a resource management framework, an accountability framework, conformity assessment procedures, a corrective action process, and a post-market monitoring plan.
"Exists" means there is a written policy or procedure addressing the requirement. "Documented" means the documentation is current, version-controlled, and accessible to the persons who need it. "Operational" means the procedure is actually followed in practice, with evidence of execution such as audit trails, sign-off records, meeting minutes, and training records.
The assessor should work through the QMS requirements systematically, mapping each Article 17 sub-requirement to the specific policy, procedure, or system that satisfies it. Where a requirement is partially satisfied, the Conformity Assessment Coordinator records the gap in the Non-Conformity Register with a remediation plan and timeline. Where a requirement is not satisfied at all, the system cannot pass the internal conformity assessment until remediation is complete.
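The mapping exercise described above can be captured in a simple structure. The following is a minimal sketch, not a mandated format: the clause labels, field names, and register shape are all illustrative assumptions about how an assessor might track the exists/documented/operational test for each Article 17 sub-requirement.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class QMSRequirement:
    """One Article 17 sub-requirement mapped to the artefact intended to satisfy it."""
    clause: str                      # illustrative label, e.g. "compliance strategy"
    artefact: Optional[str] = None   # policy or procedure addressing it, if any
    exists: bool = False             # a written policy or procedure addresses it
    documented: bool = False         # current, version-controlled, accessible
    operational: bool = False        # evidence of execution (audit trails, sign-offs)

@dataclass
class NonConformity:
    clause: str
    gap: str
    remediation_plan: str = "to be set by the Conformity Assessment Coordinator"

def assess_qms(requirements: List[QMSRequirement]) -> List[NonConformity]:
    """Walk the requirement map and log a register entry for each gap found."""
    register = []
    for req in requirements:
        checks = [("exists", req.exists), ("documented", req.documented),
                  ("operational", req.operational)]
        failed = [name for name, ok in checks if not ok]
        if failed:
            register.append(NonConformity(req.clause, "fails: " + ", ".join(failed)))
    return register
```

A requirement that passes all three tests produces no register entry; anything less generates a gap record for remediation tracking.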
The technical documentation assessment examines the AI System Documentation Package against the requirements of Articles 8 through 15 and Annex IV. This is a technical verification exercise, not a stylistic review of the prose. The assessor must confirm that the documentation accurately describes the system as built and deployed, that the claims made are supported by evidence, and that the evidence is sufficient, authentic, and current.
For each module of the documentation package, the assessor should verify four dimensions. Content completeness asks whether the module addresses every sub-requirement of the corresponding Article and Annex IV section. Evidence sufficiency asks whether every material claim is supported by a traceable artefact: a test report, a configuration record, a signed attestation, or a design document. Consistency asks whether the information in the module aligns with other modules, the risk register, test results, and the deployed system's actual configuration. Currency asks whether the documentation reflects the current state of the system rather than a historical state. Technical Documentation Requirements provides detailed guidance on the documentation structures that support this assessment.
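The four-dimension check lends itself to a per-module scorecard. A minimal sketch, with dimension names taken from the text and an aggregation rule that is my assumption (all four must pass):

```python
# The four verification dimensions applied to each documentation module.
DIMENSIONS = ("content_completeness", "evidence_sufficiency", "consistency", "currency")

def module_verdict(checks: dict) -> str:
    """Aggregate per-module dimension results into a single verdict.
    `checks` maps each dimension name to True/False; a module passes
    only when every dimension is affirmed."""
    unassessed = [d for d in DIMENSIONS if d not in checks]
    if unassessed:
        raise ValueError("unassessed dimensions: " + ", ".join(unassessed))
    failed = [d for d in DIMENSIONS if not checks[d]]
    return "pass" if not failed else "fail: " + ", ".join(failed)
```

Forcing every dimension to be explicitly assessed (rather than defaulting missing ones to pass) mirrors the systematic, no-gaps character of the assessment.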
The third workstream verifies that the system was built in accordance with the documented design, that the development process followed the documented methodology, and that the post-market monitoring system is operational and producing the data described in the documentation. This requires the assessor to trace from the documentation package to the source artefacts.
If the documentation states that the model was trained on a specific dataset version using particular hyperparameters, the assessor must verify that the model registry confirms this, that the dataset version is retrievable from the data versioning system, and that the training pipeline logs corroborate the documented process. If the documentation states that post-market monitoring includes monthly fairness metric computation, the assessor must verify that the monitoring system is producing these reports and that they are being reviewed. The objective is to close any gap between what the documentation claims and what the system actually does.
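The tracing step for training claims amounts to a field-by-field comparison between the documentation and the registry of record. A sketch under stated assumptions: the field names (`dataset_version`, `hyperparameters`, `training_run_id`) are hypothetical, and real model registries expose their records through their own APIs rather than plain dicts.

```python
def trace_training_claims(documented: dict, registry: dict) -> list:
    """Compare documented training metadata with the model registry entry
    and report every field where the two disagree."""
    discrepancies = []
    for key in ("dataset_version", "hyperparameters", "training_run_id"):
        if documented.get(key) != registry.get(key):
            discrepancies.append(
                f"{key}: documentation says {documented.get(key)!r}, "
                f"registry says {registry.get(key)!r}")
    return discrepancies
```

An empty result means the documented claim is corroborated; any entry is a documentation-to-system gap to close.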
The three workstreams are conceptually distinct but in practice overlap and inform each other, so assessors need a structured methodology that sequences the assessment efficiently and catches inconsistencies early. The assessment proceeds through five phases.
Phase one is the desktop review: the assessor reads the complete documentation package and supporting materials without accessing the live system, identifying gaps, internal inconsistencies, and missing evidence references. This produces an initial findings list and clarification questions for the Technical SME and Business Owner, typically taking two to five days.
Phase two is evidence verification: the assessor works through the evidence register, confirming that each referenced artefact exists, is accessible, is the correct version, and supports the claim it is cited for. This produces a verified evidence log and Non-Conformity Register entries for missing, expired, or insufficient evidence, typically requiring three to eight days.
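The per-artefact checks in phase two (exists, accessible, correct version, current) can be sketched as a single validation function. The entry schema used here (`path`, `cited_version`, `actual_version`, `expires`) is an assumption about one reasonable register layout, not a mandated format:

```python
import os
from datetime import date

def check_evidence_entry(entry: dict, today: date) -> list:
    """Flag problems with a single evidence-register entry.
    Returns an empty list when the artefact exists, the cited version
    matches the stored version, and the evidence has not expired."""
    findings = []
    if not os.path.exists(entry["path"]):
        findings.append("artefact missing")
    if entry["cited_version"] != entry["actual_version"]:
        findings.append("version mismatch")
    if entry.get("expires") is not None and entry["expires"] < today:
        findings.append("evidence expired")
    return findings
```

Each non-empty result would feed a Non-Conformity Register entry, matching the phase-two output described above.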
Phase three is live system verification: the assessor examines the deployed system to confirm that behaviour matches documentation, covering production configuration, the human oversight interface, post-market monitoring reports, alert logs, monitoring dashboards, and logging infrastructure. This phase produces a configuration verification record and discrepancy findings, typically taking two to four days. Post-Market Monitoring Systems addresses the monitoring infrastructure that this phase examines.
Phase four consists of stakeholder interviews: the assessor interviews the Technical SME on architecture, testing, and deployment; the Business Owner on intended purpose, oversight model, and escalation; and Operators on override capability and exercise criteria. This produces interview records and findings on training or communication gaps, typically requiring one to three days.
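Taken together, the per-phase duration ranges imply an overall assessment window. A quick sketch, assuming the phases run strictly sequentially with no overlap (the fifth, synthesis-and-reporting phase is described at the end of the methodology):

```python
# Per-phase duration ranges in working days, as given in this article.
PHASES = {
    "desktop review": (2, 5),
    "evidence verification": (3, 8),
    "live system verification": (2, 4),
    "stakeholder interviews": (1, 3),
    "synthesis and reporting": (2, 3),
}

def total_duration(phases: dict) -> tuple:
    """Sum per-phase minima and maxima into an overall (min, max) range."""
    return (sum(lo for lo, _ in phases.values()),
            sum(hi for _, hi in phases.values()))
```

Summing the ranges gives a total of 10 to 23 working days for a full internal assessment; overlapping phases in practice would compress the lower bound.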
The QMS is the organisational framework that ties all technical controls together into a governed, auditable process. Article 17 requires providers of high-risk AI systems to establish a QMS that ensures compliance throughout the system's lifecycle. The QMS is not a separate system layered on top of the technical infrastructure; it is the governance structure within which the technical infrastructure operates.
ISO 42001, the Artificial Intelligence Management System standard published in December 2023, provides the most directly relevant framework. It specifies requirements for establishing, implementing, maintaining, and continually improving an AI management system. Its control set aligns with the EU AI Act's requirements, covering risk management, data management, system engineering, verification, validation, deployment, operation, and monitoring. Certification to ISO 42001 does not constitute EU AI Act conformity assessment, but it provides a structured foundation that makes conformity assessment significantly more efficient.
The QMS must address four core areas. Document control requires that every documentation module, every procedure, and every evidence artefact has a defined owner, a version history, a review cycle, and a defined retention period. Change management requires that every change to the system, whether to code, data, model, configuration, or infrastructure, flows through a controlled process with defined approval authority. Non-conformity management requires that when a gap between the system's actual state and its declared compliance state is identified through monitoring, assessment, or incident, the Conformity Assessment Coordinator logs it, assesses it, assigns it to an owner, tracks it to closure, and verifies the fix. Tooling such as Jira or ServiceNow with pre-configured non-conformity workflows supports this process. Continual improvement requires mechanisms for learning from incidents, assessment findings, and operational experience, then feeding those lessons back into the system's design and governance.
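The non-conformity process named above (log, assess, assign, track to closure, verify) is essentially a small state machine, which is what tools like Jira or ServiceNow model with workflows. A minimal sketch; the state names and linear transition order are illustrative assumptions, not anything prescribed by Article 17:

```python
from dataclasses import dataclass

# Illustrative lifecycle: logged -> assessed -> assigned -> in_remediation
# -> fix_verified -> closed. Each state has exactly one valid successor.
TRANSITIONS = {
    "logged": "assessed",
    "assessed": "assigned",
    "assigned": "in_remediation",
    "in_remediation": "fix_verified",
    "fix_verified": "closed",
}

@dataclass
class NonConformityRecord:
    description: str
    owner: str = ""
    state: str = "logged"

    def advance(self, new_state: str) -> None:
        """Move to the next lifecycle state; skipping steps is rejected."""
        if TRANSITIONS.get(self.state) != new_state:
            raise ValueError(f"cannot move from {self.state!r} to {new_state!r}")
        self.state = new_state
```

Refusing to skip states is the point of the sketch: a fix cannot be marked verified before remediation, which is the audit property the controlled process exists to guarantee.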
A QMS is fundamentally a set of documented procedures, not a software platform. ISO 42001 certification can be achieved with paper-based or spreadsheet-based documentation.
A QMS manual describes the organisation's AI management policies, procedures, roles, and responsibilities. A document control register in spreadsheet form tracks every controlled document along with its current version, owner, review date, and retention period. A non-conformity register tracks every identified gap, its severity, root cause, corrective action, owner, deadline, and verification status. Internal audit schedules and records, along with quarterly management review meeting minutes, round out the requirements.
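Even a spreadsheet-based register can be checked mechanically. A minimal sketch of one routine task, finding documents whose review date has lapsed; the column names (`document`, `owner`, `version`, `review_date`) are an assumption about one reasonable layout:

```python
import csv
import io
from datetime import date, datetime

def overdue_documents(register_csv: str, today: date) -> list:
    """List (document, owner) pairs from a CSV document control register
    whose scheduled review date has already passed."""
    rows = csv.DictReader(io.StringIO(register_csv))
    return [(r["document"], r["owner"])
            for r in rows
            if datetime.strptime(r["review_date"], "%Y-%m-%d").date() < today]
```

Running a check like this on a schedule is one low-cost way a manual QMS can still demonstrate that its review cycles are enforced rather than aspirational.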
Workflow automation, dashboard views, and integrated reporting are lost with this approach. Manual QMS management works for organisations with a small number of AI systems, though it scales poorly as the number of systems or complexity of the governance requirements increases.
The fifth and final phase of the methodology is synthesis and reporting: the assessor consolidates findings from all preceding phases, classifies non-conformities by severity, and reaches an overall assessment conclusion. The three possible conclusions are conformity demonstrated, conformity demonstrated subject to remediation, or conformity not demonstrated. This phase typically requires two to three days.