
Regulation (EU) 2024/1689 - Artificial Intelligence Act

The EU AI Act is the world's first comprehensive AI regulation. This guide covers risk classification, conformity assessments, transparency requirements, and the phased compliance timeline through 2027.

Issuing Body: European Parliament and Council of the European Union
First Published: 2024-07-12
Latest Version: 2024
Typical Cost: $50,000–$500,000
Typical Timeline: 6–24 months
Audit Required: Yes
Audit Frequency: Conformity assessments required before high-risk AI systems enter the market. Post-market monitoring required on an ongoing basis. Phased enforcement through 2027.
Geography: European Union, global

EU AI Act: Comprehensive Compliance Guide

The EU AI Act (Regulation 2024/1689) is the world's first comprehensive legal framework for artificial intelligence. Adopted in 2024 with phased enforcement through 2027, it establishes a risk-based regulatory approach that classifies AI systems into four risk tiers with corresponding obligations. The Act applies to providers and deployers of AI systems in the EU market, regardless of where they are established, with penalties of up to €35 million or 7% of global annual turnover, whichever is higher.

What the EU AI Act Covers

The Act prohibits AI practices that pose unacceptable risk, including social scoring by governments, real-time biometric identification in public spaces (with limited exceptions), manipulation of vulnerable groups, and emotion recognition in workplaces and education. High-risk AI systems — including those used in critical infrastructure, education, employment, law enforcement, and migration — must meet stringent requirements before market placement.

High-risk system requirements include risk management systems, data governance for training data, technical documentation, record-keeping and logging, transparency and information to deployers, human oversight provisions, accuracy and robustness requirements, and cybersecurity measures. General-purpose AI models (including foundation models) face transparency and copyright-related obligations, with systemic risk models facing additional requirements.
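The high-risk obligations listed above lend themselves to a simple inventory-tracking structure. The sketch below is purely illustrative: the `AISystemRecord` class, its field names, and the tier labels are assumptions for this example, not terms defined by the Regulation.

```python
# Illustrative sketch only: tracking which EU AI Act high-risk obligations
# remain open for a given AI system. Class and field names are assumptions.
from dataclasses import dataclass, field

# The four risk tiers of the Act's risk-based approach (informal labels).
RISK_TIERS = ("unacceptable", "high", "limited", "minimal")

# The eight high-risk requirement areas named in the text.
HIGH_RISK_OBLIGATIONS = [
    "risk management system",
    "data governance for training data",
    "technical documentation",
    "record-keeping and logging",
    "transparency and information to deployers",
    "human oversight",
    "accuracy and robustness",
    "cybersecurity",
]

@dataclass
class AISystemRecord:
    name: str
    risk_tier: str
    completed: set = field(default_factory=set)

    def outstanding_obligations(self) -> list:
        """List obligations still open for a high-risk system."""
        if self.risk_tier == "unacceptable":
            raise ValueError(f"{self.name}: prohibited practice, cannot be placed on the EU market")
        if self.risk_tier != "high":
            return []  # limited/minimal tiers carry lighter duties, not tracked here
        return [o for o in HIGH_RISK_OBLIGATIONS if o not in self.completed]

record = AISystemRecord("cv-screening", "high", completed={"technical documentation"})
print(len(record.outstanding_obligations()))  # 7 of the 8 areas still open
```

A real compliance inventory would also capture deployer vs. provider role and conformity-assessment status, but the checklist shape stays the same.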

Who Needs EU AI Act Compliance

The Act applies to providers (developers) of AI systems placed on the EU market, deployers (users) of AI systems within the EU, and importers and distributors of AI systems. It has extraterritorial reach — non-EU companies whose AI system outputs are used within the EU must comply. This means virtually every company developing or deploying AI for European customers needs to assess their obligations.

Implementation Timeline

Prohibited AI practices: enforceable from February 2025. GPAI model obligations: from August 2025. High-risk system requirements: from August 2026. Full enforcement: August 2027. Organizations should begin compliance programs now, starting with AI system inventory and risk classification.
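The phased timeline above can be checked programmatically when planning a compliance program. A minimal sketch, using the month-level phase dates from the text (the first of each month is assumed here purely for illustration; consult the Regulation for the exact statutory dates):

```python
# Illustrative sketch: which EU AI Act phases are already enforceable on a
# given date. First-of-month dates are an assumption for this example.
from datetime import date

MILESTONES = [
    (date(2025, 2, 1), "prohibited AI practices"),
    (date(2025, 8, 1), "GPAI model obligations"),
    (date(2026, 8, 1), "high-risk system requirements"),
    (date(2027, 8, 1), "full enforcement"),
]

def enforceable(on: date) -> list:
    """Return the milestone names whose phase has started by the given date."""
    return [name for start, name in MILESTONES if start <= on]

print(enforceable(date(2026, 1, 1)))
# ['prohibited AI practices', 'GPAI model obligations']
```

By early 2026 only the prohibitions and GPAI obligations apply, which is why an AI system inventory and risk classification are the sensible first steps.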

Cost Considerations

Compliance costs range from minimal for low-risk AI systems to roughly $200,000–$500,000 for high-risk systems requiring conformity assessments, technical documentation, and monitoring infrastructure. GPAI model providers face additional costs for transparency compliance and systemic risk evaluation. The European Commission is developing harmonized standards and tools to reduce the compliance burden, particularly for SMEs.


