The AI System Documentation Package is a twelve-module technical documentation artefact required under Article 11 and Annex IV of the EU AI Act. Every high-risk AI system must carry a complete AISDP before it can be placed on the market. The package covers system identity, development process, architecture, data governance, testing, risk management, human oversight, transparency, cybersecurity, record-keeping, fundamental rights impact assessment, and post-market monitoring.
A full AISDP is required for high-risk systems classified under Annex III or the Annex I product legislation pathway.
Module 1 establishes system identity covering name, version, provider, intended purpose, and risk classification under Article 11 and Annex IV Section 1. Module 2 documents the development process and methodology under Annex IV Section 2. Module 3 covers system architecture, model selection rationale, and design specifications. Module 4 addresses data governance, dataset documentation, lineage, bias assessment, and GDPR alignment under Article 10. Module 5 records model performance, fairness evaluation, and robustness testing under Article 9 and Annex IV Section 3.
Module 6 documents the risk management system including the risk register, FMEA results, and residual risk assessment under Article 9. Module 7 describes human oversight measures including the oversight interface, operator training, and break-glass procedures under Article 14. Module 8 covers transparency and user information including Instructions for Use and the Declaration of Conformity under Article 13.
Module 9 addresses robustness and cybersecurity including the threat model, security controls, and penetration test results under Article 15. Module 10 documents record-keeping covering the logging architecture, audit trail, and data retention under Article 12. Module 11 contains the Fundamental Rights Impact Assessment results under Article 27. Module 12 covers post-market monitoring and change history including the PMM plan, version history, change log, and registration details under Article 72.
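The twelve-module structure above can be sketched as a simple lookup table, useful for cross-referencing modules against their legal anchors. This is an illustrative sketch: module titles are paraphrased from the text, and the data structure and function names are this sketch's own, not anything prescribed by the Act.

```python
# Illustrative lookup table: the twelve AISDP modules and their primary
# EU AI Act anchors, as described in the text. Titles are paraphrased.
AISDP_MODULES = {
    1: ("System identity", "Article 11; Annex IV Section 1"),
    2: ("Development process and methodology", "Annex IV Section 2"),
    3: ("System architecture and design specifications", "Annex IV"),
    4: ("Data governance", "Article 10"),
    5: ("Performance, fairness, and robustness testing", "Article 9; Annex IV Section 3"),
    6: ("Risk management system", "Article 9"),
    7: ("Human oversight", "Article 14"),
    8: ("Transparency and user information", "Article 13"),
    9: ("Robustness and cybersecurity", "Article 15"),
    10: ("Record-keeping", "Article 12"),
    11: ("Fundamental Rights Impact Assessment", "Article 27"),
    12: ("Post-market monitoring and change history", "Article 72"),
}

def module_anchor(n: int) -> str:
    """Return 'Module N: title (legal anchor)' for quick cross-referencing."""
    title, anchor = AISDP_MODULES[n]
    return f"Module {n}: {title} ({anchor})"
```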
Each module must be traceable to source evidence. Every material claim requires a supporting artefact: a test result, a design document, a signed attestation, or a configuration record. The approach described throughout the guide is designed to produce that evidence as a natural byproduct of the engineering workflow rather than a retrospective documentation exercise.
The modules are cross-referenced and interdependent, reflecting Article 8's integrative mandate that compliance with any individual article cannot be assessed in isolation. Several domains contribute evidence to multiple modules, and certain modules draw on multiple domains. The relationship between the guide's domain sections and the twelve AISDP modules is not one-to-one.
The scope and depth of the AISDP varies by the system's risk tier. High-risk systems require the full twelve-module package described above. Limited-risk systems require a Standard AISDP covering a subset of modules with a focus on transparency measures and the classification rationale. Minimal-risk systems require only a Baseline AISDP documenting the classification decision and its supporting analysis.
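The tiering above amounts to a mapping from risk classification to documentation scope. A minimal sketch of that mapping follows; note that the exact module subsets for the Standard and Baseline packages are assumptions for illustration, since the text specifies only their focus, not a module list.

```python
# Hedged sketch: mapping risk tier to AISDP scope as the text describes.
# The module subsets for "limited" and "minimal" are illustrative
# assumptions, not taken from the Act or the guide.
REQUIRED_MODULES = {
    "high": list(range(1, 13)),  # full twelve-module package
    "limited": [1, 8],           # assumed: identity + transparency focus
    "minimal": [1],              # assumed: classification decision only
}

def aisdp_scope(tier: str) -> list[int]:
    """Return the modules required for a given risk tier."""
    try:
        return REQUIRED_MODULES[tier]
    except KeyError:
        raise ValueError(f"unknown risk tier: {tier!r}")
```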
Article 11(1) provides that SMEs may present documentation in a simplified manner, though the substantive compliance requirements remain unchanged. The Commission is empowered to establish a simplified form for this purpose, though as of early 2026 this form has not yet been published.
The guide examines eighteen interconnected domains across twenty-one sections. Each domain addresses a distinct dimension of AISDP preparation, and together they cover the complete lifecycle of a high-risk AI system from initial risk classification through conformity assessment to post-market monitoring and eventual decommissioning.
The mapping is not one-to-one. Data governance contributes primarily to Module 4 but also feeds Modules 5 and 6. Cybersecurity feeds Module 9 but also draws on Module 3 for the system architecture and Module 10 for logging. Version control underpins Module 10 but supports traceability across all modules. Understanding this cross-cutting structure is essential for organisations planning their AISDP preparation.
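The many-to-many relationship described above can be held as a forward mapping from domains to modules, with an inverse index showing which domains an examiner would consult for a given module. Domain names here mirror the examples in the text; the structure itself is an illustrative sketch, not a prescribed schema.

```python
# Sketch of the many-to-many domain-to-module relationship. The forward
# mapping follows the examples in the text: data governance feeds Modules
# 4, 5, and 6; cybersecurity feeds Module 9 and draws on 3 and 10;
# version control underpins traceability across all twelve modules.
DOMAIN_TO_MODULES = {
    "data_governance": [4, 5, 6],
    "cybersecurity": [3, 9, 10],
    "version_control": list(range(1, 13)),
}

def modules_to_domains(mapping: dict[str, list[int]]) -> dict[int, list[str]]:
    """Invert the mapping: for each module, list the contributing domains."""
    inverse: dict[int, list[str]] = {}
    for domain, modules in mapping.items():
        for m in modules:
            inverse.setdefault(m, []).append(domain)
    return inverse
```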
A competent authority examiner who has never seen the system should be able to understand its design, behaviour, risk profile, and compliance posture from the AISDP alone. That is the self-standing test, and meeting it requires sustained effort across every compliance domain.
A deficient AISDP exposes the organisation to enforcement action, civil liability, and reputational harm. A well-constructed AISDP forces the organisation to understand its AI systems with a depth and precision that operational pressures routinely defer, revealing gaps in testing, governance, and architecture that might otherwise remain hidden.
The ten-year retention obligation under Article 18 applies from the date the system is placed on the market. The complete version history, including all superseded versions and their evidence packs, must be retained for this period. The AISDP is maintained as a living document throughout the system's operational life, with each material change creating a new version. The version history demonstrates the organisation's continuous compliance discipline.
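The retention rule reduces to simple date arithmetic: ten years from the placing-on-market date, applied to every version in the history. A minimal sketch, with function and field names of this sketch's own choosing:

```python
from datetime import date

def retention_end(placed_on_market: date) -> date:
    """Ten years after the placing-on-market date (Article 18)."""
    try:
        return placed_on_market.replace(year=placed_on_market.year + 10)
    except ValueError:
        # 29 February with no leap-day equivalent: roll forward to 1 March.
        return date(placed_on_market.year + 10, 3, 1)

def must_retain(placed_on_market: date, today: date) -> bool:
    """True while the documentation is still within its retention period."""
    return today <= retention_end(placed_on_market)
```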
The AISDP is not a one-off deliverable; it is a living document that must evolve alongside the system. Article 72 requires post-market monitoring findings to feed back into the documentation, and any system change triggers corresponding updates. Quarterly reviews are standard for high-risk systems.
An incomplete AISDP exposes the organisation to fines under Article 99 of up to EUR 15 million or 3% of global annual turnover for high-risk non-compliance, plus potential market withdrawal orders and mandatory corrective action plans.
Article 11(1) permits SMEs to provide documentation in a simplified manner, but the substantive obligations of Articles 8 through 15 apply equally. The simplification relates to format and presentation, not to the scope of requirements.
The evidence pack substantiates every claim in the AISDP through traceable artefacts. An evidence register catalogues each artefact and maps it to the AISDP modules it supports. Without this traceability, the AISDP contains only unverifiable claims.
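The evidence register described above is, in essence, a catalogue of artefacts each mapped to the modules it supports, with a coverage check revealing modules whose claims lack evidence. The following sketch assumes a simple record shape; all class and field names are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class Artefact:
    """One catalogued evidence artefact in the register (illustrative shape)."""
    artefact_id: str
    kind: str                  # e.g. "test result", "design document"
    location: str              # repository path or document reference
    supports_modules: list[int] = field(default_factory=list)

def uncovered_modules(register: list[Artefact], modules=range(1, 13)) -> set[int]:
    """Modules with no supporting artefact: gaps an examiner would flag."""
    covered = {m for a in register for m in a.supports_modules}
    return set(modules) - covered
```

A register built this way makes the traceability requirement checkable: an empty result from `uncovered_modules` means every module has at least one supporting artefact.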
The retention period is ten years from the date the system is placed on the market, including all superseded versions and evidence packs, under Article 18.
Every material claim requires a traceable artefact: test results, design documents, signed attestations, configuration records, or monitoring outputs catalogued in an evidence register.
The relationship between the guide's domains and the AISDP modules is many-to-many: several domains contribute evidence to multiple modules, and certain modules draw on multiple domains. The governance pipeline feeds the widest range of modules.