EU AI Act certification requires coordinated effort across multiple organisational functions. This guide maps eleven certification activities to six key roles in a RACI matrix, describes the nine roles that participate in certification, then presents team structures scaled from organisations managing five AI systems to those managing more than thirty.
Certification under the EU AI Act requires coordinated effort across multiple organisational functions, not a single compliance team working in isolation. The RACI matrix below maps eleven core certification activities to six key roles, clarifying who is Responsible, Accountable, Consulted, or Informed at each stage. This structure ensures that no single individual or department carries the full burden, and that specialist input arrives at the right point in the process. From risk classification through to serious incident reporting, every activity has a clear owner and a defined chain of accountability.
| Activity | AI Governance Lead | AI System Assessor | Technical SME | Legal and Regulatory Advisor | DPO Liaison | CAC |
|---|---|---|---|---|---|---|
| Risk classification | A | R | C | C | I | I |
| Risk assessment | A | R | R | C | C | I |
| FRIA | A | C | C | C | R | I |
| Architecture review | I | C | R | I | I | I |
| Data governance | A | C | R | I | R | I |
| Conformity assessment | A | R | C | C | I | R |
| Declaration of Conformity | A | I | I | R | I | R |
| EU database registration | I | I | I | C | I | R |
| PMM operation | A | C | R | I | C | I |
| Serious incident reporting | A | C | C | R | C | I |
| Break-glass authorisation | A | I | R | I | I | I |
R = Responsible, A = Accountable, C = Consulted, I = Informed. CAC = Conformity Assessment Coordinator; FRIA = Fundamental Rights Impact Assessment; PMM = post-market monitoring.
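A matrix like this is easy to validate programmatically, for example to confirm that each activity has at most one Accountable role. The sketch below encodes the table above as a plain mapping; the shorthand role keys (`GovLead`, `Assessor`, and so on) are illustrative abbreviations for this example, not part of any official schema.

```python
# RACI matrix from the table above: activity -> {role: R/A/C/I code}.
RACI = {
    "Risk classification":        {"GovLead": "A", "Assessor": "R", "TechSME": "C", "Legal": "C", "DPO": "I", "CAC": "I"},
    "Risk assessment":            {"GovLead": "A", "Assessor": "R", "TechSME": "R", "Legal": "C", "DPO": "C", "CAC": "I"},
    "FRIA":                       {"GovLead": "A", "Assessor": "C", "TechSME": "C", "Legal": "C", "DPO": "R", "CAC": "I"},
    "Architecture review":        {"GovLead": "I", "Assessor": "C", "TechSME": "R", "Legal": "I", "DPO": "I", "CAC": "I"},
    "Data governance":            {"GovLead": "A", "Assessor": "C", "TechSME": "R", "Legal": "I", "DPO": "R", "CAC": "I"},
    "Conformity assessment":      {"GovLead": "A", "Assessor": "R", "TechSME": "C", "Legal": "C", "DPO": "I", "CAC": "R"},
    "Declaration of Conformity":  {"GovLead": "A", "Assessor": "I", "TechSME": "I", "Legal": "R", "DPO": "I", "CAC": "R"},
    "EU database registration":   {"GovLead": "I", "Assessor": "I", "TechSME": "I", "Legal": "C", "DPO": "I", "CAC": "R"},
    "PMM operation":              {"GovLead": "A", "Assessor": "C", "TechSME": "R", "Legal": "I", "DPO": "C", "CAC": "I"},
    "Serious incident reporting": {"GovLead": "A", "Assessor": "C", "TechSME": "C", "Legal": "R", "DPO": "C", "CAC": "I"},
    "Break-glass authorisation":  {"GovLead": "A", "Assessor": "I", "TechSME": "R", "Legal": "I", "DPO": "I", "CAC": "I"},
}

def accountable(activity: str) -> str:
    """Return the single role holding 'A' for an activity, or '(none)'."""
    owners = [role for role, code in RACI[activity].items() if code == "A"]
    assert len(owners) <= 1, f"{activity}: more than one Accountable role"
    return owners[0] if owners else "(none)"

def responsible(activity: str) -> list[str]:
    """Return all roles marked 'R' for an activity."""
    return [role for role, code in RACI[activity].items() if code == "R"]

print(accountable("Risk assessment"))   # GovLead
print(responsible("Data governance"))   # ['TechSME', 'DPO']
```

A check like this also makes structural gaps visible: in the table above, Architecture review and EU database registration carry no Accountable marking, which a governance team may wish to resolve explicitly.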
Nine distinct roles participate in the certification process, each contributing specialist knowledge to specific modules of the AI System Documentation Package. The AI System Assessor handles discovery, classification, risk assessment, and AISDP compilation for each system. This role examines the system against the Article 3(1) definition, classifies it within risk tiers, and performs gap assessment for brownfield systems. It demands a combination of regulatory and technical understanding, and covers all AISDP modules from a compilation perspective.
The Technical SME serves as subject-matter expert for the system's technical design, data, and operational behaviour. This role provides engineering evidence: architecture documentation, model evaluation results, data governance artefacts, and testing reports. Typically the engineering lead or senior ML engineer fills this position, contributing to AISDP modules 2, 3, 4, 5, 9, and 10 from an evidence standpoint.
The Technical Owner, usually the engineering lead or CTO for the system, ensures that design, implementation, and testing satisfy Articles 9 through 15. This role responds to technical queries during assessment and holds ownership responsibility for AISDP modules 2, 3, 4, 5, 9, and 10.

The Business Owner, typically a product manager or business unit head, ensures that intended purpose, deployment context, and human oversight measures are correctly documented and operational. This role attests that the system operates within its documented intended purpose, covering modules 1, 7, 8, and 11.
The AI Governance Lead carries ultimate accountability across the entire certification process. This role reviews and approves the AISDP, signs the declaration of conformity, and manages relationships with competent authorities. The AI Governance Lead holds authority to compel remediation, halt deployment, and allocate resources, with approval responsibility spanning all AISDP modules. Conformity Assessment for High-Risk AI Systems covers the broader conformity assessment process in which this role operates.
The Conformity Assessment Coordinator manages the end-to-end certification workflow. This includes coordinating the assessment plan, managing the Non-Conformity Register, preparing the Declaration of Conformity for signature, and managing EU database registration. The coordinator's process responsibility maps to AISDP module 12.
The Legal and Regulatory Advisor reviews evidence for legal sufficiency and advises on interpretation of novel or ambiguous requirements. This role also reviews the Declaration of Conformity for accuracy, operating in a cross-cutting capacity across all modules rather than owning specific deliverables.
The DPO Liaison confirms that data governance documentation is consistent with GDPR obligations and verifies that DPIAs are complete and current. This validation function maps specifically to AISDP module 4. Data Governance Under the EU AI Act examines the data governance requirements that this role validates against.
The Internal Audit Assurance Lead provides independent verification that the certification process was followed correctly, that evidence is complete and authentic, and that no material deficiencies were overlooked. This assurance function spans all AISDP modules, serving as the final quality gate before the organisation commits to its Declaration of Conformity.
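The module-level assignments in the role summaries above can be captured in a small mapping and checked for coverage. The sketch below restates only the named owner/attestation roles (compilation, approval, legal review, and assurance span all modules and are omitted); the function name and return shape are invented for this example.

```python
# AISDP module ownership as described in the role summaries above.
# Cross-cutting roles (Assessor, Governance Lead, Legal, Internal Audit)
# span all twelve modules and are deliberately left out.
MODULE_OWNERS = {
    "Technical Owner": {2, 3, 4, 5, 9, 10},
    "Business Owner": {1, 7, 8, 11},
    "Conformity Assessment Coordinator": {12},
}

ALL_MODULES = set(range(1, 13))  # AISDP modules 1-12

def unowned_modules() -> set[int]:
    """Return modules with no named owner among the roles above."""
    covered = set().union(*MODULE_OWNERS.values())
    return ALL_MODULES - covered

print(unowned_modules())  # {6}
```

Run against the summaries above, the check surfaces module 6, which none of the named ownership roles claims; a governance team building its own module map should confirm where that responsibility sits.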
Organisations managing five to ten AI systems can operate with a compact team structure. The AI Governance Lead should be a senior leader such as a CRO, CTO, or Head of AI Governance. One to two AI System Assessors handle classification and assessment work, supported by a Classification Reviewer who is an experienced compliance or risk professional and also serves as the holistic AISDP reviewer.
At this scale, the Conformity Assessment Coordinator may be the same individual as the Assessor. Technical SME support is typically drawn from engineering on a part-time basis rather than as a dedicated function. Legal, DPO, and internal audit functions contribute on a consultancy basis, engaging during specific certification milestones rather than maintaining continuous involvement. Building Internal Compliance Capability provides guidance on developing the skills these roles require.
Organisations with ten to thirty AI systems require a dedicated AI Governance team. This team is led by the AI Governance Lead and includes two to four Assessors, a dedicated Classification Reviewer, a dedicated Conformity Assessment Coordinator, and a dedicated AI compliance engineering role.
At medium scale, legal, DPO, and internal audit functions provide dedicated capacity during certification cycles rather than operating on a consultancy basis. The shift from part-time to dedicated engagement during active certification reflects the increased volume and complexity that comes with managing a larger portfolio of systems.
Large enterprises managing thirty or more AI systems require a full AI Compliance Office led by the AI Governance Lead, who reports directly to the board or executive committee. Multiple Assessors are organised by business domain, supported by a team of Classification Reviewers and multiple Conformity Assessment Coordinators managing parallel certification tracks.
At enterprise scale, the organisation maintains a dedicated AI compliance engineering team alongside embedded legal, DPO, and internal audit functions. The ability to run parallel certification tracks is a distinguishing characteristic: rather than processing systems sequentially, the enterprise structure supports concurrent assessment of multiple systems across different business domains.
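As an illustrative sketch, the three-tier scaling guidance above can be expressed as a simple lookup. The thresholds and role counts come from the text; the function name, return shape, and the treatment of boundary values (10 and 30 systems, where the ranges overlap) are assumptions for this example.

```python
def recommended_structure(num_systems: int) -> dict:
    """Map AI portfolio size to the team structure described above.

    Tiers follow the text: 5-10 systems (small), 10-30 (medium),
    30+ (enterprise). Boundary values are assigned to the larger
    tier here; below 5 systems the small structure is used as a floor.
    """
    if num_systems >= 30:
        return {
            "tier": "enterprise",
            "structure": "AI Compliance Office reporting to the board",
            "assessors": "multiple, organised by business domain",
            "coordinator_separate": True,
            "support_functions": "embedded legal, DPO and internal audit",
        }
    if num_systems >= 10:
        return {
            "tier": "medium",
            "structure": "dedicated AI Governance team",
            "assessors": "2-4",
            "coordinator_separate": True,
            "support_functions": "dedicated capacity during certification cycles",
        }
    return {
        "tier": "small",
        "structure": "compact team led by a senior AI Governance Lead",
        "assessors": "1-2",
        "coordinator_separate": False,  # may be the same person as the Assessor
        "support_functions": "consultancy-basis legal, DPO and internal audit",
    }

print(recommended_structure(12)["tier"])  # medium
```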
- In small organisations managing five to ten AI systems, the Conformity Assessment Coordinator may be the same individual as the Assessor; at medium scale and above, these roles should be separated.
- At medium scale (ten to thirty AI systems), legal, DPO, and internal audit functions shift from consultancy-basis engagement to providing dedicated capacity during certification cycles.
- Enterprise organisations maintain a full AI Compliance Office with embedded legal, DPO, and audit functions, domain-organised Assessors, and the ability to run parallel certification tracks across multiple business domains simultaneously.
- The AI Governance Lead carries ultimate accountability, reviewing and approving the AISDP, signing the Declaration of Conformity, and managing competent authority relationships.
- Small organisations managing five to ten AI systems operate with an AI Governance Lead, one to two Assessors, and a Classification Reviewer, with legal, DPO, and audit functions contributing on a consultancy basis.
- Medium organisations with ten to thirty systems need a dedicated AI Governance team with two to four Assessors, a dedicated Classification Reviewer, a Conformity Assessment Coordinator, and a dedicated compliance engineering role.
- Large enterprises with thirty or more systems establish a full AI Compliance Office with domain-organised Assessors, multiple Coordinators, and embedded legal, DPO, and audit functions enabling parallel certification tracks.