Without portfolio-level governance, AI compliance effort scales linearly with the number of systems. Shared monitoring infrastructure, portfolio dashboards, standardised processes, and cross-system learning reduce per-system overhead. Board-level reporting integrates AI oversight with audit, risk, and compliance committee structures. Programme management coordinates parallel AISDP tracks to prevent resource bottlenecks.
An organisation's first high-risk AI system receives intensive attention from governance, engineering, and compliance functions. The second and third systems receive substantial effort. By the time the portfolio reaches ten or twenty systems, the oversight framework must scale or it will fail. The governance model designed for a single system will not survive a portfolio, because each system has its own AISDP, its own risk register, its own monitoring infrastructure, and its own evidence repository. Without portfolio-level governance, compliance effort scales linearly or worse with the number of systems, quickly exceeding the organisation's governance capacity.
Shared infrastructure is the foundation for portfolio scaling. Monitoring infrastructure, evidence repositories, document management systems, and CI/CD pipelines are designed by the Technical Owner as shared services supporting multiple AI systems, so the marginal cost of adding a new system to the monitoring infrastructure stays low. A multi-tenant monitoring configuration (Prometheus and Grafana with per-system tags, or Datadog with system-identifier labels) allows a single monitoring team to oversee all systems from a unified dashboard. Per-system metric labels enable both aggregate views, such as how many systems have open non-conformities, and per-system drill-down, such as a specific system's current fairness drift status.
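The per-system-label pattern can be sketched in pure Python. The class, metric names, and system identifiers below are illustrative assumptions, not a real Prometheus client API; the point is that one shared metric family keyed by system identifier supports both the aggregate and the drill-down view.

```python
# Illustrative sketch of per-system metric labelling (not a real
# monitoring client): one shared metric family, keyed by system ID.
from collections import defaultdict


class LabelledGauge:
    """A metric family where each AI system is a label value."""

    def __init__(self, name):
        self.name = name
        self.values = defaultdict(float)  # system_id -> current value

    def set(self, system_id, value):
        self.values[system_id] = value


open_nonconformities = LabelledGauge("ai_open_nonconformities")
open_nonconformities.set("credit-scoring", 2)   # hypothetical systems
open_nonconformities.set("doc-classifier", 0)

# Aggregate view: how many systems currently have open non-conformities?
systems_with_findings = sum(
    1 for v in open_nonconformities.values.values() if v > 0
)
# Drill-down view: one specific system's current count.
credit_findings = open_nonconformities.values["credit-scoring"]
```

Onboarding a new system is then a single additional label value, rather than a new monitoring stack.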
Shared infrastructure also enables cross-system analysis: detecting patterns such as a common vulnerability across multiple systems using the same GPAI model that would not be visible from individual system monitoring alone. Portfolio compliance dashboards aggregate the compliance posture across all systems into a single view for senior management. The dashboard shows for each system the current conformity status, the number and severity of open non-conformities, the PMM metric status, the evidence currency status, and the date of the last formal assessment. Credo AI and Holistic AI provide this as a built-in multi-system view.
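The portfolio roll-up described above can be sketched as a simple aggregation; the field names and status values here are assumptions for illustration, not the schema of any particular platform.

```python
# Hedged sketch of a portfolio compliance roll-up. Field names,
# status values, and system names are illustrative assumptions.
from datetime import date

portfolio = [
    {"system": "credit-scoring", "conformity": "conforming",
     "open_ncs": 2, "worst_severity": "minor",
     "pmm_ok": True, "evidence_current": True,
     "last_assessment": date(2024, 11, 3)},
    {"system": "doc-classifier", "conformity": "pending",
     "open_ncs": 0, "worst_severity": None,
     "pmm_ok": True, "evidence_current": False,
     "last_assessment": date(2024, 6, 17)},
]

# Single aggregate view for senior management.
summary = {
    "systems": len(portfolio),
    "systems_with_open_ncs": sum(1 for s in portfolio if s["open_ncs"] > 0),
    "stale_evidence": [s["system"] for s in portfolio
                       if not s["evidence_current"]],
    "oldest_assessment": min(s["last_assessment"] for s in portfolio),
}
```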
Not all high-risk systems require the same intensity of oversight. A credit scoring system that affects millions of consumers each year warrants more intensive oversight than an internal document classification system supporting administrative processes. The organisation defines oversight tiers based on the system's risk profile, deployment scale, and affected population sensitivity.
Higher-tier systems receive more frequent reviews, dedicated oversight personnel, and more granular monitoring. Lower-tier systems receive scheduled reviews, shared oversight personnel, and standard monitoring configurations. The AI Governance Lead documents the tier assignment and reviews it annually to account for changes in the system's deployment context or risk profile.
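A tier assignment of this kind can be made explicit as a scoring rule. The thresholds, weights, and example systems below are assumptions chosen for illustration, not a prescribed methodology.

```python
# Illustrative oversight-tiering rule combining risk profile,
# deployment scale, and population sensitivity. All thresholds
# and weights are assumptions.
def oversight_tier(risk_score, affected_population, sensitive_population):
    """Return 1 (most intensive oversight) through 3 (standard)."""
    score = risk_score
    if affected_population > 1_000_000:
        score += 2
    elif affected_population > 10_000:
        score += 1
    if sensitive_population:
        score += 1
    if score >= 5:
        return 1
    if score >= 3:
        return 2
    return 3


# A mass-market credit scorer lands in tier 1; an internal
# document classifier lands in tier 3.
assert oversight_tier(3, 5_000_000, True) == 1
assert oversight_tier(1, 500, False) == 3
```

Encoding the rule this way makes the annual tier review a matter of re-running the rule against updated inputs rather than re-arguing the assignment from scratch.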
Centralised governance with distributed execution ensures consistent standards while keeping operational knowledge local. The AI Governance Lead provides central coordination: maintaining the portfolio-level risk register, ensuring consistent standards across systems, and reporting to executive leadership. Day-to-day oversight execution including monitoring, escalation handling, and operator training is distributed to the teams closest to each system.
Standardised processes are the primary mechanism for reducing per-system governance overhead as the portfolio grows. A common AISDP template, a common evidence taxonomy, a common non-conformity workflow, and a common assessment checklist mean the governance team applies the same process to every system, learning from experience across the portfolio. A finding in one system, such as a monitoring gap that was exploited, can be applied as a preventive check across all other systems.
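Propagating a finding from one system as a preventive check across the rest of the portfolio can be sketched as a shared check registry; the check name, system fields, and registry shape are illustrative assumptions.

```python
# Sketch: a finding in one system (a monitoring gap) is registered
# as a preventive check and run against every system in the
# portfolio. Names and fields are illustrative.
checks = {
    # Registered after a latency-monitoring gap was found in one system.
    "monitoring-gap-latency": lambda s: s["latency_monitored"],
}

systems = [
    {"name": "credit-scoring", "latency_monitored": True},
    {"name": "doc-classifier", "latency_monitored": False},
]


def run_portfolio_checks(systems, checks):
    """Return (system, check) pairs where a preventive check fails."""
    return [
        (s["name"], check_name)
        for s in systems
        for check_name, check in checks.items()
        if not check(s)
    ]


findings = run_portfolio_checks(systems, checks)
```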
Portfolio-level reporting gives executive leadership a consolidated view: how many systems are deployed, their aggregate compliance status, approaching review deadlines, highest residual risks, and resource constraints affecting oversight capability. The AI Governance Lead produces a quarterly portfolio report enabling leadership to allocate resources, set priorities, and make strategic decisions about the AI programme.
For one to three AI systems, portfolio management can be manual, using a portfolio status spreadsheet with one row per system, updated monthly by the AI Governance Lead. Beyond three to five systems, the manual approach can only be sustained with disproportionate governance effort, and a compliance management platform becomes justified by the standardisation and automation it provides.
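The one-row-per-system status sheet can be generated with nothing beyond the standard library; the column names and example rows below are illustrative assumptions.

```python
# Minimal sketch of the one-row-per-system portfolio status sheet.
# Column names and example systems are illustrative assumptions.
import csv
import io

rows = [
    {"system": "credit-scoring", "tier": 1, "conformity": "conforming",
     "open_ncs": 2, "next_review": "2025-02-01"},
    {"system": "doc-classifier", "tier": 3, "conformity": "pending",
     "open_ncs": 0, "next_review": "2025-04-15"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
writer.writeheader()
writer.writerows(rows)
sheet = buf.getvalue()  # CSV text, one row per system
```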
Systems sharing GPAI models, data sources, or infrastructure can share compliance documentation: a GPAI model risk assessment prepared for one system can be adapted for another.
Governance effort across the portfolio is prioritised along four axes: risk tier (highest-risk systems first), deployment timeline (imminent deadlines first), deployment scale (larger affected populations first), and compliance readiness (least-documented systems start earlier).
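The four-axis prioritisation can be expressed as a composite sort key. The field names, example systems, and scoring direction below are assumptions for illustration only.

```python
# Illustrative four-axis prioritisation: lower sort keys mean
# earlier governance attention. Fields and values are assumptions.
from datetime import date

systems = [
    {"name": "doc-classifier", "risk_tier": 3,
     "deadline": date(2026, 1, 1), "scale": 2_000, "readiness": 0.8},
    {"name": "credit-scoring", "risk_tier": 1,
     "deadline": date(2025, 3, 1), "scale": 4_000_000, "readiness": 0.6},
]


def priority_key(s):
    # Highest risk tier first, imminent deadlines first, larger
    # affected populations first, least-documented systems first.
    return (s["risk_tier"], s["deadline"], -s["scale"], s["readiness"])


ordered = sorted(systems, key=priority_key)
```

In this example the credit scorer sorts ahead of the document classifier on every axis; in practice the axes will conflict and the tuple order encodes which axis dominates.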
The review cadence operates at three levels: a monthly portfolio status review, a quarterly resource review, and an annual strategic review, all sitting above the individual system governance cadences.
In summary: prioritise by risk tier, deployment timeline, scale, and readiness; stagger governance gates; identify cross-system synergies; and establish monthly, quarterly, and annual review cadences.