The EU AI Act introduces a precise vocabulary in Article 3 that carries specific legal meaning throughout the regulation. Compliance obligations, penalty thresholds, and documentation requirements all depend on correct application of these terms. This page defines the key actors, system classifications, market lifecycle events, and technical terms used throughout the Practitioners Implementation Guide.
The EU AI Act uses terms with specific legal meanings defined in Article 3. Understanding these definitions is essential because the regulatory obligations attach to the defined roles and concepts. The provider is a natural or legal person that develops an AI system or has one developed and places it on the market or puts it into service under its own name or trademark. The deployer is a natural or legal person that uses an AI system under its authority, except for personal non-professional use. Many organisations are both provider and deployer of their own AI systems, triggering dual obligations including the FRIA requirement under Article 27.
A high-risk AI system is one falling within the categories listed in Annex III, or constituting a safety component of a product covered by Annex I harmonisation legislation, and not qualifying for the Article 6(3) exception. The intended purpose is the use for which the provider intends the system, including specific context and conditions, as specified in the Instructions for Use, promotional materials, technical documentation, and Declaration of Conformity.
A substantial modification is a change after market placement not foreseen in the initial conformity assessment that affects compliance or modifies the intended purpose. Placing on the market is the first making available on the Union market, starting the ten-year documentation retention clock under Article 18. Putting into service is the supply for first use directly to the deployer or for own use in the Union. Conformity assessment is the process of verifying whether the Act's requirements have been fulfilled, typically internal under Annex VI for most high-risk systems.
The guide uses several technical terms with specific compliance significance. Compensating controls are technical and organisational measures that achieve the regulatory objective through alternative means when the primary recommended approach is unavailable. Each compensating controls section explains the challenge, the recommended tooling, and a procedural alternative for organisations that cannot implement the tooling.
An AISDP module is one of the twelve numbered sections of the AI System Documentation Package. Modules are the structural units of the AISDP itself. GPAI means general-purpose AI, defined in Article 3(63) as a model displaying significant generality and capable of performing a wide range of distinct tasks. Foundation models from providers such as OpenAI, Anthropic, Google, and Mistral are GPAI models. LLM refers specifically to large language models trained on text corpora for language tasks. RAG (retrieval-augmented generation) combines a GPAI model with a knowledge base, retrieving relevant documents at inference time.
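The retrieve-then-generate pattern behind RAG can be sketched with a toy keyword-overlap retriever standing in for a production vector-similarity search. All names, the example documents, and the scoring heuristic below are illustrative assumptions, not part of the guide:

```python
def retrieve(question: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the question.

    A toy stand-in for a real vector store: production RAG systems use
    embedding similarity, not lexical overlap.
    """
    terms = set(question.lower().split())
    ranked = sorted(documents,
                    key=lambda d: len(terms & set(d.lower().split())),
                    reverse=True)
    return ranked[:top_k]

knowledge_base = [
    "Documentation must be retained for ten years after placing on the market.",
    "Break-glass procedures allow operators to halt processing immediately.",
    "The composite version records the versions of all deployed artefacts.",
]
question = "How long must documentation be retained after placing on the market?"
context = retrieve(question, knowledge_base, top_k=1)
# the retrieved context is then prepended to the prompt sent to the GPAI model
```

The compliance-relevant point is that the knowledge base is a versioned artefact in its own right: a change to its contents changes the system's behaviour without any change to the model.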
The composite version is a structured identifier capturing the specific version of every artefact type deployed at a given point in time, covering code, data, model, configuration, prompts, and knowledge base. The composite version is the unit of compliance: the AISDP describes a specific composite version and the Declaration of Conformity attests to it.
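One way to make the composite version concrete is as an immutable record with one field per artefact type. The field names, version formats, and separator below are hypothetical illustrations, not a format prescribed by the guide:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CompositeVersion:
    """One version entry per artefact type named in the guide.

    All field values shown in the example are invented for illustration.
    """
    code: str
    data: str
    model: str
    config: str
    prompts: str
    knowledge_base: str

    def identifier(self) -> str:
        # a single structured identifier the AISDP and Declaration of
        # Conformity can reference unambiguously
        return "+".join([self.code, self.data, self.model,
                         self.config, self.prompts, self.knowledge_base])

cv = CompositeVersion(code="1.4.2", data="2024-11", model="m3",
                      config="c7", prompts="p12", knowledge_base="kb-2025-01")
```

Because the dataclass is frozen, a change to any single artefact forces the creation of a new composite version, which is exactly the property the compliance process relies on.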
PSI (Population Stability Index) measures the difference between two distributions, used throughout for detecting data drift and output distribution shift. PSI below 0.10 is typically stable, 0.10 to 0.20 warrants investigation, and above 0.20 indicates significant shift. SHAP (SHapley Additive exPlanations) provides per-feature contribution scores for individual predictions based on game-theoretic Shapley values, used for explainability and feature importance monitoring.
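For readers who want to see the metric in operation, PSI over two binned distributions is a short computation. The function name, example bins, and epsilon guard below are illustrative assumptions:

```python
import math

def psi(expected: list[float], actual: list[float], eps: float = 1e-6) -> float:
    """Population Stability Index between two binned distributions.

    expected, actual: per-bin proportions, each summing to ~1.
    eps guards against log(0) when a bin is empty.
    """
    total = 0.0
    for e, a in zip(expected, actual):
        e = max(e, eps)
        a = max(a, eps)
        total += (a - e) * math.log(a / e)
    return total

baseline = [0.25, 0.25, 0.25, 0.25]   # reference distribution at validation time
drifted  = [0.10, 0.20, 0.30, 0.40]   # distribution observed in production
score = psi(baseline, drifted)
# guide thresholds: < 0.10 stable, 0.10-0.20 investigate, > 0.20 significant shift
```

An identical distribution scores exactly zero; the drifted example above lands above the 0.20 threshold, so it would be flagged as significant shift.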
FMEA (Failure Mode and Effects Analysis) is a structured risk identification methodology examining each component for potential failure modes, their effects, and detectability. SRR (Selection Rate Ratio) is the ratio of the selection rate for a protected subgroup to the reference group, with values below 0.80 indicating potential adverse impact under the four-fifths rule. AUC-ROC measures a binary classification model's ability to distinguish between classes.
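The SRR and the four-fifths rule reduce to a one-line ratio, sketched here with invented example counts:

```python
def selection_rate_ratio(selected_sub: int, total_sub: int,
                         selected_ref: int, total_ref: int) -> float:
    """Ratio of the protected subgroup's selection rate to the reference group's."""
    return (selected_sub / total_sub) / (selected_ref / total_ref)

# illustrative numbers: 30 of 100 selected in the subgroup vs 50 of 100 in the reference group
srr = selection_rate_ratio(30, 100, 50, 100)   # 0.30 / 0.50 = 0.60
potential_adverse_impact = srr < 0.80          # four-fifths rule threshold
```

With these numbers the ratio is 0.60, below the 0.80 threshold, so the result would be flagged for fairness review.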
A sentinel dataset is a fixed, curated evaluation dataset maintained for continuous behavioural monitoring, exercising the fairness, accuracy, and safety dimensions the AISDP documents. Break-glass is an emergency procedure enabling authorised personnel to halt AI system processing immediately, required by Article 14(4)(e). Grounding verification checks whether an LLM's outputs are supported by retrieved context in a RAG system.
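A grounding verification check can be illustrated with a crude lexical proxy. The scoring below is a deliberately naive assumption for the sketch: production systems typically use entailment models or LLM-based judges rather than token overlap:

```python
def grounding_score(answer: str, context: str) -> float:
    """Fraction of answer tokens that also appear in the retrieved context.

    A crude lexical proxy for grounding, shown only to make the concept
    concrete; not a production-grade check.
    """
    answer_tokens = set(answer.lower().split())
    context_tokens = set(context.lower().split())
    if not answer_tokens:
        return 0.0
    return len(answer_tokens & context_tokens) / len(answer_tokens)

context = "retention period is ten years after placing on the market"
supported = grounding_score("ten years after placing on the market", context)
unsupported = grounding_score("five years before market entry", context)
```

A fully supported answer scores 1.0 against this context, while the contradicting answer scores well below it; a deployment would set a threshold and route low-scoring outputs for review.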
Throughout the guide, Article references without further qualification refer to the EU AI Act. References to other legislation are given in full on first use.
Can an organisation be both provider and deployer? Yes. Many organisations develop AI systems for their own use, making them both provider and deployer. This triggers dual obligations: the full provider AISDP plus deployer-specific requirements including the Fundamental Rights Impact Assessment.
What is the difference between placing on the market and putting into service? Placing on the market is making the system first available on the EU market. Putting into service is supplying it for first use by a deployer. A system can be placed on the market before any deployer uses it. The distinction matters because different compliance obligations and clocks attach to each event.
What is a composite version? A composite version is a structured identifier capturing the exact version of every artefact type deployed at a given point: code, data, model, configuration, prompts, and knowledge base. The AISDP describes a specific composite version, and the Declaration of Conformity attests to it, making it the fundamental unit of compliance.
What are compensating controls? They are technical and organisational measures achieving regulatory objectives through alternative means when the primary approach is unavailable, impractical, or disproportionate.