Navigating the EU AI Act
A visual guide to assessing your AI systems and achieving compliance with the world's first comprehensive AI regulation.
Understanding the Stakes
7%
Of worldwide annual turnover (or EUR 35 million, whichever is higher) is the maximum fine for the most serious violations, highlighting the critical need for a thorough gap assessment.
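To make that exposure concrete, here is a minimal sketch using the headline figures above for the most serious infringements; actual fine brackets vary by violation type, and this is illustration, not legal advice.

```python
def max_fine_eur(annual_global_turnover_eur: float) -> float:
    """Illustrative upper bound for the most serious infringements:
    the greater of EUR 35 million or 7% of worldwide annual turnover.
    Other violation types carry lower caps; not legal advice."""
    return max(35_000_000.0, 0.07 * annual_global_turnover_eur)

# A company with EUR 2 billion in turnover faces exposure of up to EUR 140 million.
print(f"EUR {max_fine_eur(2_000_000_000):,.0f}")  # EUR 140,000,000
```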
What is a Gap Assessment?
It's a structured review that compares your current AI governance practices against the AI Act's requirements. This process identifies shortfalls ("gaps") and creates a roadmap for remediation, ensuring your organization is prepared and compliant.
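As a minimal sketch of what one gap assessment record might look like in practice (field names and article references are illustrative, not prescribed by the Act):

```python
from dataclasses import dataclass, field

@dataclass
class GapFinding:
    """One row of a gap assessment: a requirement, where the organisation
    stands today, and what it would take to close the distance."""
    requirement: str                 # e.g. "Technical documentation (Art. 11)"
    current_practice: str            # what is in place today
    target_state: str                # what the AI Act expects
    gap_score: int                   # 0 = fully compliant ... 5 = nothing in place
    remediation: list[str] = field(default_factory=list)  # planned actions

# Hypothetical example entry
finding = GapFinding(
    requirement="Technical documentation (Art. 11)",
    current_practice="Ad-hoc model cards for some systems",
    target_state="Complete technical documentation for each high-risk system",
    gap_score=4,
    remediation=["Adopt a documentation template", "Backfill existing systems"],
)
```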
Your 4-Step Compliance Journey
Classify Risk
Assess Compliance
Identify Gaps
Create Action Plan
Is Your AI System High-Risk?
The Act's most stringent rules apply to high-risk systems. Follow this decision flow to determine if your system falls into this category.
Criterion 1
Is the AI system a product, or a safety component of a product, covered by EU harmonisation legislation listed in Annex I and required to undergo a third-party conformity assessment?
Criterion 2
Is the AI system listed in one of the specific high-risk use cases defined in Annex III?
If you answer "YES" to either criterion, your system is likely high-risk and must meet strict compliance obligations.
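The decision flow reduces to a single boolean check. The sketch below encodes it, with the caveat that real classification also has to weigh the Act's carve-outs (such as Article 6(3)) and warrants legal review; the function and parameter names are illustrative.

```python
def is_likely_high_risk(
    safety_component_under_annex_i: bool,  # Criterion 1: product or safety component under Annex I legislation
    listed_in_annex_iii: bool,             # Criterion 2: use case listed in Annex III
) -> bool:
    """'Yes' to either criterion => treat the system as likely high-risk.
    Simplified: ignores the Article 6(3) carve-outs and other nuances."""
    return safety_component_under_annex_i or listed_in_annex_iii

# A CV-screening tool: not a regulated product component, but an Annex III use case.
print(is_likely_high_risk(False, True))  # True -> plan for high-risk obligations
```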
The Core Pillars of High-Risk AI Compliance
A hypothetical assessment showing a company's compliance status across the key requirements. Each pillar requires robust processes and documentation.
Risk Management
A continuous process to identify, analyze, and mitigate risks throughout the AI lifecycle.
Data Governance
Ensuring training, validation, and testing data are relevant, sufficiently representative, and, to the best extent possible, free of errors and bias.
Technical Documentation
Comprehensive documentation proving compliance, covering the system's purpose, design, and performance.
Human Oversight
Systems must be designed to allow for effective human supervision, intervention, and control.
Transparency & Information
Clear instructions for users on the system's capabilities, limitations, and how to interpret outputs.
Accuracy & Robustness
A high level of performance, resilience against errors, and protection from cybersecurity threats.
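To keep the hypothetical assessment machine-readable, the six pillars and their scores can be captured in a simple mapping; every number below is a placeholder, not a benchmark.

```python
# Hypothetical compliance scores (0-100) for the six pillars; higher is better.
PILLAR_SCORES = {
    "Risk Management": 70,
    "Data Governance": 55,
    "Technical Documentation": 40,
    "Human Oversight": 80,
    "Transparency & Information": 60,
    "Accuracy & Robustness": 65,
}
TARGET_SCORE = 100  # full-compliance baseline used for the gap chart below
```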
Prioritizing Your Action Plan
Based on the assessment, this chart visualizes the compliance gap size for each pillar. Focus remediation efforts on the areas with the largest gaps first.
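Building on the illustrative PILLAR_SCORES above, a minimal prioritisation pass simply ranks the pillars by gap size:

```python
# Rank pillars by gap (target minus current score) so the biggest gaps come first.
gaps = {pillar: TARGET_SCORE - score for pillar, score in PILLAR_SCORES.items()}
action_plan = sorted(gaps.items(), key=lambda item: item[1], reverse=True)

for rank, (pillar, gap) in enumerate(action_plan, start=1):
    print(f"{rank}. {pillar}: gap of {gap} points")
# 1. Technical Documentation: gap of 60 points
# 2. Data Governance: gap of 45 points
# ...
```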
Spotlight on Annex III: High-Risk Use Cases
Biometric Identification
Critical Infrastructure
Education & Training
Employment & Workers
Access to Services
Law Enforcement
Migration & Border Control
Justice & Democratic Processes
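For an internal system inventory, these eight areas can be tagged with a simple enum (the labels below are shorthand, not the Act's legal wording); any system that maps to one of them meets Criterion 2 from the decision flow above.

```python
from enum import Enum

class AnnexIIIArea(Enum):
    BIOMETRICS = "Biometric identification"
    CRITICAL_INFRASTRUCTURE = "Critical infrastructure"
    EDUCATION = "Education and vocational training"
    EMPLOYMENT = "Employment and worker management"
    ESSENTIAL_SERVICES = "Access to essential services"
    LAW_ENFORCEMENT = "Law enforcement"
    MIGRATION = "Migration, asylum and border control"
    JUSTICE = "Justice and democratic processes"

# Hypothetical inventory: entries with an Annex III tag are high-risk candidates.
inventory = {
    "resume-screening model": AnnexIIIArea.EMPLOYMENT,
    "internal IT helpdesk chatbot": None,
}
high_risk_candidates = [name for name, area in inventory.items() if area is not None]
print(high_risk_candidates)  # ['resume-screening model']
```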
Roadmap to Full Application
~Q3 2024
AI Act enters into force (1 August 2024).
~Q1 2025 (+6 months)
Bans on prohibited AI practices apply.
~Q3 2025 (+12 months)
Obligations for general-purpose AI models apply.
~Q3 2026 (+24 months)
Most remaining rules, including those for Annex III high-risk systems, become applicable; high-risk systems embedded in Annex I products follow at +36 months.
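A small helper can keep these milestones in one place and report how much runway remains. The dates follow the phased timeline described above; treat them as planning inputs, not legal advice.

```python
from datetime import date

ENTRY_INTO_FORCE = date(2024, 8, 1)

MILESTONES = {
    "Prohibited AI practices banned (+6 months)": date(2025, 2, 2),
    "General-purpose AI model obligations (+12 months)": date(2025, 8, 2),
    "Most rules incl. Annex III high-risk (+24 months)": date(2026, 8, 2),
    "High-risk systems in Annex I products (+36 months)": date(2027, 8, 2),
}

def days_remaining(deadline: date, today: date | None = None) -> int:
    """Days until a deadline; negative means it has already passed."""
    return (deadline - (today or date.today())).days

for name, deadline in sorted(MILESTONES.items(), key=lambda item: item[1]):
    print(f"{deadline.isoformat()}  {name}  ({days_remaining(deadline)} days left)")
```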