AI Compliance & Governance

Systematically meet regulatory requirements for AI. From EU AI Act classification to a comprehensive AI governance framework.

EU AI Act

The EU AI Act is the world's first comprehensive AI regulation and applies to all organisations that use or offer AI systems in the EU. It classifies AI systems by risk level – prohibited practices, high-risk AI, and AI with limited or minimal risk. We help you classify your AI systems and meet the applicable requirements for each category.

ISO/IEC 42001 – AI Management System

ISO/IEC 42001 is the international standard for AI management systems. Similar to ISO 27001 for information security, it provides a systematic framework for responsible AI deployment. We guide you through building and certifying an AI management system, including gap analysis, implementation, and audit preparation.

AI Governance Framework

Effective AI governance goes beyond mere rule compliance. We help you develop a holistic governance framework that clearly defines AI responsibilities, decision-making processes, oversight mechanisms, and escalation paths – aligned with the EU AI Act and other applicable regulatory requirements.

EU AI Act: Timeline and Specific Obligations

The EU AI Act has been in force since August 2024, with requirements applying in stages:

  • Prohibited AI practices: since February 2025
  • Obligations for General-Purpose AI models: since August 2025
  • High-risk AI systems under Annex III (including employment, education, biometric classification): from August 2026
  • High-risk AI systems under Annex I (AI as a safety component in regulated products): from August 2027

Organisations acting now gain a significant compliance lead over those waiting until the deadlines.

High-risk AI under the EU AI Act requires: a risk management system (Art. 9), data governance (Art. 10), technical documentation (Art. 11), logging and traceability (Art. 12), transparency and user information (Art. 13), human oversight (Art. 14), and accuracy, robustness, and cybersecurity (Art. 15).
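As an illustration, these per-article obligations can be tracked as a simple per-system checklist. The sketch below is purely hypothetical (the class, field names, and evidence paths are our own invention, not an official schema or legal tool) and only shows one way a compliance team might record which obligations still lack evidence:

```python
# Hypothetical sketch: tracking EU AI Act high-risk obligations (Art. 9-15)
# per AI system. Illustrative only -- not legal advice or an official schema.
from dataclasses import dataclass, field

ARTICLE_OBLIGATIONS = {
    "Art. 9": "Risk management system",
    "Art. 10": "Data governance",
    "Art. 11": "Technical documentation",
    "Art. 12": "Logging and traceability",
    "Art. 13": "Transparency and user information",
    "Art. 14": "Human oversight",
    "Art. 15": "Accuracy, robustness and cybersecurity",
}

@dataclass
class HighRiskSystem:
    name: str
    # Maps an article to a pointer at the compliance evidence (e.g. a document path).
    evidence: dict = field(default_factory=dict)

    def open_obligations(self):
        """Articles for which no compliance evidence is recorded yet."""
        return [art for art in ARTICLE_OBLIGATIONS if art not in self.evidence]

# Example: a candidate-screening system with only its technical file in place.
screening = HighRiskSystem("candidate-screening")
screening.evidence["Art. 11"] = "docs/technical-file-v1.pdf"
print(screening.open_obligations())
```

In practice such a checklist would live in a GRC tool rather than code; the point is simply that the Art. 9-15 requirements are discrete, auditable items that can be tracked per system.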

Typical Compliance Scenarios

Recruitment firm with AI-assisted candidate screening

A recruitment firm uses AI for automated pre-screening of applicants. This system falls under high-risk AI per EU AI Act Annex III. We support documentation, human oversight concepts, conformity assessment preparation, and readiness for supervisory authority enquiries.

Manufacturing company with AI in safety-critical monitoring

An industrial company deploys AI for anomaly detection in safety-relevant production processes. We develop the required quality management system under EU AI Act Art. 17 and the complete technical documentation for regulatory authorities.

SaaS provider targeting ISO/IEC 42001 certification

A software company wants to demonstrate ISO/IEC 42001 certification to its enterprise customers as evidence of responsible AI deployment. We support the build-out of the AI management system from gap analysis through to certification audit.

Compliance Services

  • EU AI Act Risk Assessment
  • AI System Classification
  • Conformity Assessment Preparation
  • ISO/IEC 42001 Implementation
  • AI Governance Framework
  • Documentation & Transparency Obligations
  • AI Impact Assessments
  • Training for AI Teams & Management

Assess Your AI Compliance

Find out whether your AI deployment meets regulatory requirements.

Request Consultation

Frequently Asked Questions about AI Compliance

Does the EU AI Act apply to our organisation?

The EU AI Act applies to all organisations that use or offer AI systems in the EU – regardless of where the organisation is headquartered. Both providers (who develop or market AI) and deployers (who use AI within their organisation) are covered. The extent of obligations depends on the risk classification of the AI systems in use.

What is the difference between the EU AI Act and the GDPR for AI?

The GDPR regulates the handling of personal data – including data used to train AI or processed by AI systems. The EU AI Act regulates the AI systems themselves: their development, deployment, and use. Both regulations can apply simultaneously and must be aligned with each other.

What is ISO/IEC 42001 and do we need it?

ISO/IEC 42001 is the international standard for AI management systems – the AI equivalent of ISO 27001. It provides a structured framework for responsible AI deployment. Certification can become a competitive differentiator in B2B contexts but is not legally mandated.

When must high-risk AI be compliant?

For high-risk AI systems under Annex III (including employment, critical infrastructure, education, biometric classification), full EU AI Act compliance applies from August 2026. Systems newly placed on the market after this date must meet the requirements. Existing systems have a transition period.

Get in Touch

Master EU AI Act and AI Governance

We help you treat AI compliance not as a burden, but as a competitive advantage – and implement it effectively.