Internal conformity assessment under the EU AI Act produces six core documents: an assessment plan, checklist, evidence register, non-conformity register, assessment report, and Declaration of Conformity. Each serves a distinct role in demonstrating that the assessment was rigorous and the declaration justified.
The internal conformity assessment generates its own documentation set, separate from the AI System Documentation Package itself. This documentation provides the evidence that the assessment was conducted with rigour and that the resulting declaration of conformity is justified. Six core documents form this assessment record: an assessment plan, an assessment checklist, an evidence register, a non-conformity register, an assessment report, and the Declaration of Conformity under Annex V.
Each document serves a distinct function within the assessment lifecycle. The assessment plan defines scope and methodology before work begins. The checklist provides the granular mapping of requirements to evidence. The evidence register catalogues every artefact examined. The non-conformity register tracks identified gaps through to resolution. The assessment report presents the formal findings and overall conclusion. The Declaration of Conformity is the legally binding output that the provider signs and retains for ten years after the system is placed on the market.
The assessment plan is prepared in advance and approved by the AI Governance Lead before work commences. It defines the scope of the assessment, specifying which system, which version, and which AI System Documentation Package modules are under review. It sets out the assessment methodology the team will follow, the composition of the assessor team and their qualifications, the assessment schedule, and the acceptance criteria against which conformity will be judged.
Preparing the plan in advance ensures that all participants understand what is being assessed, how findings will be categorised, and what constitutes a passing result. Without this upfront agreement, assessors risk applying inconsistent standards and the organisation lacks a baseline against which to measure the assessment's completeness. The AI Governance Lead's approval of the plan before assessment commences establishes organisational commitment to the defined scope and methodology.
The assessment checklist must map every requirement of Articles 8 to 15, Article 17, and Annex IV to specific questions, evidence expectations, and pass or fail criteria. A single line item such as "Article 10 compliance" is insufficient. Each sub-requirement of Article 10 requires its own checklist entry: relevance, representativeness, freedom from errors, completeness, statistical properties, bias detection measures, and special category data processing each need a separate item with a dedicated evidence requirement.
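As an illustration of this granularity, here is a minimal sketch of how the first Article 10 sub-requirements might become individual checklist entries in structured form. The identifiers, question wording, and criteria are illustrative, not prescribed by the Act.

```python
# Illustrative checklist entries for Article 10 sub-requirements.
# IDs, questions, and pass criteria are examples, not prescribed text.
article_10_checklist = [
    {
        "id": "A10-01",
        "requirement": "Training data relevance",
        "question": "Are the training datasets relevant to the intended purpose?",
        "evidence": "Data governance record; dataset datasheets",
        "pass_criteria": "Documented mapping from intended purpose to dataset selection",
    },
    {
        "id": "A10-02",
        "requirement": "Representativeness",
        "question": "Are the datasets representative of the target population?",
        "evidence": "Dataset statistics report",
        "pass_criteria": "Distribution analysis reviewed and signed off",
    },
    # ...one entry per remaining sub-requirement: freedom from errors,
    # completeness, statistical properties, bias detection measures,
    # and special category data processing.
]
```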
This granularity serves two purposes. It forces the assessment team to examine each obligation individually rather than making a blanket judgement across an entire Article. It also produces a structured record that demonstrates to a competent authority exactly which requirements were examined, what evidence was reviewed, and what conclusion was reached for each one. The checklist becomes the primary working document during the assessment itself, guiding assessors systematically through every obligation. Conformity Assessment covers the broader assessment framework within which this checklist operates.
The evidence register is a catalogue of every artefact reviewed during the assessment, recording each item's location, version, date, and the assessment finding it supports. It serves as the bridge between the assessment findings and the underlying proof, enabling any reviewer to trace a conclusion back to its source material.
Each entry in the register records the artefact identifier as a unique reference, the AI System Documentation Package module it supports, the EU AI Act Article it demonstrates compliance with, the artefact's current version and location, the date it was last updated, and the freshness requirement specifying how frequently it must be refreshed. The register should also record a next update due date and the responsible person for each artefact.
The register should be maintained as a structured dataset in a tool such as Airtable, a Notion database, a SharePoint list, or a YAML file in the documentation repository, rather than as free-form text in a document. The structured format enables automated currency tracking and gap reporting, which become essential as the volume of evidence artefacts grows over the system's operational life. Documentation Requirements provides detail on the underlying documentation obligations that generate these artefacts.
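To make the structure concrete, here is a sketch of a single register record as it might appear in structured form, ready to be serialised to YAML, CSV, or a database row. Every value below is hypothetical.

```python
# One illustrative evidence register record; all values are hypothetical.
# A list of such records can be serialised to YAML, CSV, or a database table.
evidence_record = {
    "artefact_id": "EV-042",                # unique reference
    "aisdp_module": "Data Governance",      # AI System Documentation Package module
    "ai_act_article": "Article 10",         # requirement it demonstrates
    "description": "Training data bias analysis report",
    "version": "2.1",
    "location": "docs/evidence/bias-analysis-v2.1.pdf",
    "last_updated": "2025-01-15",
    "freshness_days": 180,                  # refresh interval
    "next_update_due": "2025-07-14",
    "responsible": "Data Science Lead",
}
```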
Every identified gap, deficiency, or inconsistency is recorded in the non-conformity register with a unique identifier. Each entry carries a severity classification of critical, major, or minor, alongside a description of the finding, the affected AI System Documentation Package module and Article, the required remediation action, the responsible person, the remediation deadline, and the verification method that will confirm the issue has been resolved.
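To make those fields concrete, here is a minimal sketch of a non-conformity record as a data structure; the field names and the overdue helper are illustrative conventions, not mandated by the Act.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class Severity(Enum):
    CRITICAL = "critical"
    MAJOR = "major"
    MINOR = "minor"

@dataclass
class NonConformity:
    identifier: str           # unique ID, e.g. "NC-2025-007" (hypothetical scheme)
    severity: Severity
    description: str
    affected_module: str      # AI System Documentation Package module
    affected_article: str     # EU AI Act Article
    remediation: str          # required remediation action
    responsible: str
    deadline: date
    verification_method: str  # how resolution will be confirmed
    resolved: bool = False

    def overdue(self, today: date) -> bool:
        """True if the remediation deadline has passed without resolution."""
        return not self.resolved and today > self.deadline
```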
The assessment report is the formal output of the internal assessment. It summarises the assessment scope, methodology, findings, non-conformities, and the overall conclusion. That conclusion takes one of three forms: conformity demonstrated; conformity demonstrated subject to remediation of identified non-conformities; or conformity not demonstrated. The lead assessor signs the report and the AI Governance Lead reviews it before the organisation proceeds to issue the Declaration of Conformity.
The Declaration of Conformity under Annex V is the legally binding statement that the AI system conforms to the requirements of the EU AI Act. It is signed by the AI Governance Lead or an authorised representative. The declaration contains the information specified in Article 47 and Annex V: provider identity, system identification, a statement that the declaration is issued under the sole responsibility of the provider, the relevant harmonised standards or specifications applied, and the name and function of the signatory. The provider retains this declaration for ten years after the system is placed on the market.
The declaration is the culmination of the entire assessment process. It can only be issued once the assessment report concludes that conformity has been demonstrated, either unconditionally or subject to remediation of identified non-conformities where those remediations have been verified. The ten-year retention obligation applies not only to the declaration itself but to the full chain of supporting documentation: the assessment plan, checklist, evidence register, non-conformity register, and assessment report that together justify the declaration's conclusions.
A readiness review determines whether the system and its documentation are mature enough to undergo formal assessment, preventing wasted resources on a premature exercise. Conducting an assessment before documentation is complete generates a long non-conformity register listing gaps the team already knew about, which is both demoralising and inefficient.
The readiness review is a lightweight, structured check conducted by the Conformity Assessment Coordinator. It verifies that all AI System Documentation Package modules have been drafted and reviewed by their respective owners. The evidence register should contain at least a provisional entry for every material claim. Testing covering performance, fairness, robustness, and cybersecurity should have been executed with results available. The post-market monitoring system should be operational and producing data. Operators should have been trained and the human oversight interface deployed. No known critical or major gaps should remain unaddressed.
The review produces a brief report with a go or no-go recommendation. If the recommendation is no-go, the report lists the specific gaps that must be resolved before assessment can proceed, with estimated timelines for each. This avoids the cycle of premature assessment, mass non-conformity, remediation, and re-assessment. A clear go or no-go decision at this stage saves significant assessor time and ensures that the formal assessment, when it does proceed, produces a manageable and meaningful set of findings rather than a catalogue of known deficiencies.
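A minimal sketch of the go or no-go logic follows; the check names paraphrase the criteria above, and the rule that every check must pass is the simple conjunction the text implies.

```python
# Illustrative readiness checks; names paraphrase the criteria above.
readiness_checks = {
    "all_aisdp_modules_drafted_and_reviewed": True,
    "evidence_register_covers_every_material_claim": True,
    "testing_executed_with_results_available": True,
    "post_market_monitoring_operational": False,
    "operators_trained_and_oversight_deployed": True,
    "no_open_critical_or_major_gaps": True,
}

# Go only if every check passes; otherwise list the gaps for the report.
recommendation = "go" if all(readiness_checks.values()) else "no-go"
gaps = [name for name, passed in readiness_checks.items() if not passed]
print(recommendation, gaps)  # -> no-go ['post_market_monitoring_operational']
```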
Assessment documentation is fundamentally a document production exercise that benefits from templates and consistent structure. An assessment plan template should cover scope, methodology, schedule, and assessor qualifications. An assessment checklist maps Article-by-Article requirements to evidence sources with pass or fail criteria for each item.
An evidence register in spreadsheet form maps each requirement to its evidence artefacts, their versions, and their locations. The spreadsheet columns should include artefact ID, the AI System Documentation Package module, the EU AI Act Article, artefact description, current version, storage location, last updated date, freshness requirement, next update due date, and responsible person. A non-conformity register tracks gaps with severity, remediation actions, deadlines, and verification methods. An assessment report template should include sections for findings, determinations, conditions, and recommendations. Using consistent templates across assessments ensures comparability and reduces the effort of each subsequent assessment cycle.
Document management requires two core capabilities: version control and retention enforcement. Version control ensures that every change to every document is recorded with the identity of the person who made it and a timestamp, and that any historical version can be retrieved. Retention enforcement prevents documents from being accidentally deleted and automatically holds each one for its full retention period.
At minimum, this requires a shared storage location, such as a network drive, SharePoint site, or Google Drive folder, with restricted write access. A filename convention following the pattern of module, document name, version, and date is enforced by the Conformity Assessment Coordinator. Superseded versions are moved to an archive subfolder and never deleted. A document register spreadsheet tracks each document, its current version, location, owner, and retention expiry date.
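As an illustration, a small check of such a filename convention; the exact pattern below is an assumption and should be adapted to whatever convention the Coordinator adopts.

```python
import re

# Hypothetical convention: module_document-name_vMAJOR.MINOR_YYYY-MM-DD.ext
# The pattern is an assumption, not a convention defined by the Act.
FILENAME_PATTERN = re.compile(
    r"^(?P<module>[a-z0-9-]+)_"
    r"(?P<name>[a-z0-9-]+)_"
    r"v(?P<version>\d+\.\d+)_"
    r"(?P<date>\d{4}-\d{2}-\d{2})"
    r"\.(?P<ext>pdf|md|csv|json)$"
)

def check_filename(filename: str) -> bool:
    """Return True if the filename follows the convention."""
    return FILENAME_PATTERN.match(filename) is not None

assert check_filename("data-governance_bias-analysis_v2.1_2025-01-15.pdf")
assert not check_filename("bias analysis final FINAL.docx")
```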
For stronger version control, documentation repositories managed in a version control system provide the most robust change history: every change is a commit with attribution and a complete diff against the previous version. Confluence and SharePoint provide adequate version history for teams less proficient with version control tooling. For long-term retention, documents are archived to cold storage such as S3 Glacier, Azure Archive Storage, or Google Cloud Storage's Archive class, with lifecycle policies that prevent deletion before the retention period expires. Annually, the Conformity Assessment Coordinator tests the archival mechanism by retrieving a document from archive storage and verifying its integrity. Quality Management Documentation covers the broader quality management context for these document controls.
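A minimal sketch of that annual integrity check, assuming a SHA-256 hash of each document is recorded in the document register at archival time; that assumption, the paths, and the recorded hash below are all placeholders.

```python
import hashlib
from pathlib import Path

def verify_archived_document(retrieved: Path, expected_sha256: str) -> bool:
    """Compare a retrieved document's hash against the hash recorded
    in the document register when the document was archived."""
    digest = hashlib.sha256(retrieved.read_bytes()).hexdigest()
    return digest == expected_sha256

# Hypothetical annual test: retrieve one document from archive storage
# using the provider's CLI or SDK, then verify the local copy.
recorded_hash = "0" * 64  # placeholder: the real value comes from the register
archive_copy = Path("retrieved/assessment-report_v1.0_2024-03-01.pdf")  # hypothetical
if archive_copy.exists() and not verify_archived_document(archive_copy, recorded_hash):
    print(f"Integrity check FAILED for {archive_copy}")
```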
Evidence currency tracking detects when evidence artefacts become stale. Each artefact in the evidence register has a freshness requirement: model evaluation reports are refreshed with every model update, post-market monitoring reports monthly, and penetration test reports annually. A scheduled process scans the evidence register, compares each artefact's last-updated date against its freshness requirement, and generates a gap report listing overdue artefacts. The gap report goes to the AI Governance Lead and the responsible team members, and overdue artefacts are treated as non-conformities and tracked in the non-conformity register.
Currency tracking can also operate manually, though this requires discipline. A monthly review of the evidence register compares each artefact's last-updated date against its freshness requirement. Overdue artefacts are flagged and the responsible person notified. The review is recorded in a monthly evidence currency report. The manual approach detects staleness at the next monthly review rather than in real time. A simple scheduled script comparing dates in the spreadsheet could automate this at near-zero cost.
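A minimal sketch of such a script, assuming the register is exported as CSV with the column names used below; both the export format and the column names are assumptions.

```python
import csv
from datetime import date, timedelta

def gap_report(register_path: str, today: date) -> list[dict]:
    """Return register rows whose last-updated date exceeds the
    freshness requirement, expressed here in days."""
    overdue = []
    with open(register_path, newline="") as f:
        for row in csv.DictReader(f):
            last_updated = date.fromisoformat(row["last_updated"])
            freshness = timedelta(days=int(row["freshness_days"]))
            if today - last_updated > freshness:
                overdue.append(row)
    return overdue

# Each overdue row becomes a candidate non-conformity; route the report
# to the AI Governance Lead and the responsible person listed in the row.
for row in gap_report("evidence_register.csv", date.today()):
    print(f"OVERDUE: {row['artefact_id']} owner={row['responsible']}")
```

Run on a schedule via cron or a CI job, this replaces the monthly manual sweep with a near-real-time check.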
The ten-year retention obligation under the EU AI Act deserves explicit planning because of the challenges a decade-long period presents. Over ten years, cloud accounts may be migrated, storage services may be deprecated, file formats may become obsolete, and the personnel who created the documentation will move on.
Retention planning should address storage durability through redundant storage with geographic distribution. Format longevity requires using open, widely supported formats such as Markdown, PDF/A, JSON, and CSV, avoiding proprietary formats that may lose support. Access continuity means ensuring that access credentials are not tied to individual personnel and are refreshed on a regular schedule. Index maintenance ensures that the evidence register itself is maintained and continues to accurately point to the artefact locations as storage structures evolve.
A biennial retention health check should verify all of these elements, confirming that archived documents remain retrievable, that formats remain readable, that access paths remain valid, and that the evidence register accurately reflects the current state of the archive. These archival controls are unglamorous, but their failure mode is severe: an assessor or competent authority requests documentation, and the organisation cannot produce it.
Can a spreadsheet serve as the evidence register? Yes. A spreadsheet with columns for artefact ID, AISDP module, EU AI Act Article, description, version, location, last-updated date, freshness requirement, next update due, and responsible person is adequate.
What happens when the readiness review recommends no-go? The review report identifies specific gaps for resolution with estimated timelines. Assessment is deferred until those gaps are addressed, avoiding a cycle of premature assessment and mass non-conformity.
How should documentation survive the ten-year retention period? Use open formats such as PDF/A and Markdown, store redundantly across geographic locations, ensure access credentials are not tied to individuals, and run a biennial retention health check.
How are identified gaps recorded? Each gap is logged in a register with a unique ID, severity classification, description, affected module, required remediation, deadline, and verification method.
What does the Declaration of Conformity contain? Provider identity, system identification, a sole responsibility statement, the harmonised standards applied, and signatory details, as specified in Article 47 and Annex V.
Why conduct a readiness review before the formal assessment? To verify documentation maturity before committing resources, avoiding a premature assessment that generates a long list of already-known gaps.
How is evidence currency maintained? Each artefact has a freshness requirement; scheduled reviews compare last-updated dates and flag overdue items as non-conformities.