The fifth governance gate is the final human decision before production deployment and is a mandatory human approval step that cannot be satisfied by automated evaluation. Article 14's human oversight requirement applies to the deployment decision itself: the decision to expose affected persons to a modified system is a consequential decision that must be taken by a qualified human with full visibility into the evidence.
The designated approver reviews the complete evidence chain from the current pipeline execution: the model evaluation report, the risk gate record, the fairness gate record, the documentation currency record, the security scan results, and the staging validation results. The approver may also review the change classification determination to confirm that the change has been correctly categorised. The approver's decision is recorded with four fields: the decision as approve, reject, or defer; the approver's identity; the timestamp; and the evidence reviewed.
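The four-field decision record can be sketched as a small data structure. This is a minimal illustration, not a prescribed schema; the class name, role identifiers, and evidence reference strings are all hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum

class Decision(Enum):
    APPROVE = "approve"
    REJECT = "reject"
    DEFER = "defer"

@dataclass(frozen=True)
class ApprovalRecord:
    """The four fields recorded for the approver's decision."""
    decision: Decision
    approver: str                       # approver's identity, e.g. a corporate user ID
    timestamp: datetime
    evidence_reviewed: tuple[str, ...]  # references to the gate records reviewed

# Illustrative record; the evidence references mirror the six artefacts above.
record = ApprovalRecord(
    decision=Decision.APPROVE,
    approver="jane.doe",
    timestamp=datetime.now(timezone.utc),
    evidence_reviewed=(
        "model-eval-report/run-1042",
        "risk-gate/run-1042",
        "fairness-gate/run-1042",
        "doc-currency/run-1042",
        "security-scan/run-1042",
        "staging-validation/run-1042",
    ),
)
```

Freezing the dataclass reflects the intent that an authorisation record, once written, is immutable evidence.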
The approval authority depends on the change's classification, creating a tiered governance structure that matches the rigour of review to the significance of the change. Routine changes, those that do not affect the model architecture, the intended purpose, or the fairness profile, may be approved by the Technical SME alone. This allows standard engineering changes to proceed without creating bottlenecks at the governance level.
Significant changes that alter fairness metrics, modify the feature set, or change the model architecture require approval from the AI Governance Lead. The escalation ensures that changes with potential compliance impact receive review from the person accountable for the system's overall compliance posture.
Changes classified as substantial modifications under Article 3(23) require approval from the AI Governance Lead and the Legal and Regulatory Advisor jointly. This dual approval recognises that a substantial modification may trigger re-assessment under the conformity assessment framework, potentially requiring a new Declaration of Conformity. The Legal and Regulatory Advisor's involvement ensures that the regulatory implications of the substantial modification are assessed before deployment proceeds. If re-assessment is triggered, the system returns to the pre-deployment validation phase, and a new Declaration of Conformity must be signed before production deployment can resume.
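The tiered structure above amounts to a lookup from change classification to the set of required approvers. A minimal sketch of such an approval authority matrix, with illustrative role and classification names, might look like this:

```python
from enum import Enum

class Classification(Enum):
    ROUTINE = "routine"
    SIGNIFICANT = "significant"
    SUBSTANTIAL = "substantial modification"  # Article 3(23)

# Hypothetical matrix: classification -> roles that must all approve.
APPROVAL_MATRIX: dict[Classification, frozenset[str]] = {
    Classification.ROUTINE: frozenset({"Technical SME"}),
    Classification.SIGNIFICANT: frozenset({"AI Governance Lead"}),
    Classification.SUBSTANTIAL: frozenset(
        {"AI Governance Lead", "Legal and Regulatory Advisor"}
    ),
}

def is_authorised(classification: Classification, approvals: set[str]) -> bool:
    """Deployment is authorised only when every required role has approved."""
    return APPROVAL_MATRIX[classification] <= approvals

# A substantial modification with only one of the two required approvals:
is_authorised(Classification.SUBSTANTIAL, {"AI Governance Lead"})  # -> False
```

Using set inclusion rather than a single-approver check makes the joint-approval requirement for substantial modifications explicit.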
The gate produces a Deployment Authorisation Record containing the approver's identity, the decision, the timestamp, the evidence references reviewed, and any conditions attached to the approval. Conditions might include restrictions on the deployment's scope, requirements for enhanced monitoring during an initial period, or a deadline for completing a follow-up action such as updating the FRIA. The record is deposited in the governance artefact registry and referenced by AISDP Module 12.
The approver bears personal accountability for the deployment decision. This accountability is not merely procedural: the Conformity Assessment Coordinator reviews the authorisation record during the internal assessment, verifying that the approver had the authority for the classification level, that the evidence package was complete, and that the decision was documented. The AI Governance Lead is responsible for defining and maintaining the approval authority matrix that maps change classifications to required approvers.
The deployment authorisation gate must be implemented as a blocking pipeline step that pauses execution and presents the evidence package to the designated approver through a review interface. The approver cannot bypass the gate; the pipeline resumes only upon receipt of an explicit approval event. GitHub Actions, GitLab CI/CD, and Azure DevOps all support manual approval gates natively.
The Technical SME configures the pipeline to require approval from the designated reviewer before the deployment job executes, using the platform's environment protection rules. The approval event is captured in the pipeline's execution log, which serves as the authorisation record. For GitHub Actions, this is implemented through the Environments feature: the production environment is configured to require review from designated approvers, and the deployment job specifies this environment, causing the workflow to pause until approval is granted.
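A minimal sketch of the GitHub Actions approach, assuming a `production` environment has already been configured with required reviewers under the repository's environment protection rules; the job names and deployment script are illustrative:

```yaml
# Deployment job gated by a GitHub Environment with required reviewers.
jobs:
  deploy:
    runs-on: ubuntu-latest
    environment: production   # workflow pauses here until a designated reviewer approves
    steps:
      - uses: actions/checkout@v4
      - name: Deploy model service
        run: ./scripts/deploy.sh   # hypothetical deployment script
```

The approval event, including the reviewer's identity and timestamp, is recorded in the workflow run, which serves as the authorisation record.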
For organisations using ArgoCD or Flux for GitOps deployment, the approval gate is implemented as a pull request approval requirement on the deployment repository. The AI Governance Lead or Technical SME submits a pull request containing the updated deployment manifests; the designated approver reviews the evidence and approves the pull request. The merge event triggers deployment, and the pull request's review history serves as the authorisation record with full traceability.
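In the GitOps variant, the designated-approver requirement can be expressed through the deployment repository's review configuration. A hypothetical `CODEOWNERS` entry, with an illustrative path and team name, combined with branch protection requiring code-owner review:

```
# Changes to production manifests require review from the designated
# approvers before the pull request can merge and trigger deployment.
/manifests/production/  @org/ai-governance-approvers
```

The pull request's review history then provides the approver identity, decision, and timestamp with full traceability.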
The evidence assembly step retrieves all governance gate records from the pipeline execution and presents them as a summary for the approver, including the gate name, result, and timestamp for each gate. A verification step confirms all gates passed before presenting the deployment for approval, preventing the approver from being asked to authorise a deployment that has already failed a governance gate.
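The assembly and verification steps can be sketched as two small functions. The gate record shape (`gate`, `result`, `timestamp` keys) is an assumption for illustration:

```python
# Hypothetical gate record shape: name, result ("pass"/"fail"), ISO timestamp.
GateRecord = dict[str, str]

def assemble_evidence(gate_records: list[GateRecord]) -> str:
    """Render the per-gate summary presented to the approver."""
    lines = [f"{r['gate']}: {r['result']} at {r['timestamp']}" for r in gate_records]
    return "\n".join(lines)

def verify_all_gates_passed(gate_records: list[GateRecord]) -> None:
    """Refuse to request approval for a run that has already failed a gate."""
    failed = [r["gate"] for r in gate_records if r["result"] != "pass"]
    if failed:
        raise RuntimeError(f"Cannot request approval; failed gates: {failed}")
```

Running the verification before presenting the summary ensures the approver only ever sees deployments that are eligible for authorisation.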
Can approval be automated for routine changes? No. Even routine changes require a human decision. The Technical SME must review the evidence and explicitly approve. The requirement is for human oversight of the deployment decision, not merely automated threshold checking.
What happens if the designated approver is unavailable? Delegate arrangements should be documented in the approval authority matrix. The pipeline remains paused until an authorised approver acts. Approval latency is monitored as a governance health metric.
Can approval be granted with conditions attached? Yes. Conditions might include time limits (deploy within 48 hours), enhanced monitoring requirements, or scope restrictions. Conditions are tracked and their satisfaction verified before or during deployment.
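Condition tracking can be sketched with two helpers: one for a time-limit condition such as "deploy within 48 hours", and one listing attached conditions not yet verified. Function names and the 48-hour default are illustrative assumptions:

```python
from datetime import datetime, timedelta

def approval_valid(approved_at: datetime, now: datetime,
                   deploy_within: timedelta = timedelta(hours=48)) -> bool:
    """A time-limit condition: the approval lapses once the window expires."""
    return now <= approved_at + deploy_within

def blocking_conditions(conditions: dict[str, bool]) -> list[str]:
    """Names of attached conditions not yet verified as satisfied."""
    return [name for name, satisfied in conditions.items() if not satisfied]
```

A deployment step would check both before proceeding, so a lapsed approval or an unverified condition blocks the rollout.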
What happens when a change triggers conformity re-assessment? Deployment pauses until the re-assessment is complete and a new Declaration of Conformity is signed. The Legal and Regulatory Advisor determines the re-assessment scope.