
Mastering EU AI Act Compliance in a 6-Step Framework

September 27, 2024

It’s critical that legal teams start preparing for EU AI Act compliance, as Julia Apostle and Sarah Schaedler highlight in an article for Orrick, Herrington & Sutcliffe. Organizations have until Aug. 2, 2026, to comply with key provisions of the European Union’s Artificial Intelligence Act.

The authors outline six key steps to get your organization ahead of the curve:

  1. Designate an AI governance and compliance team: Organizations should form a cross-disciplinary team with legal, technical, and operational expertise that is responsible for ensuring the company meets its AI governance and compliance obligations. The group should include representatives from the legal, IT, and product departments and be supported by AI advocates across the organization.
  2. Establish an AI governance framework: Organizations should create internal policies and procedures that align their use of AI with their legal obligations, ethical priorities, and risk profiles. A foundational AI governance program should be flexible enough to adapt to future laws and technologies while also supporting compliance with data protection and cybersecurity standards.
  3. Foster AI literacy: By Feb. 2, 2025, organizations must ensure that employees and others working with AI on their behalf have adequate AI knowledge and competence. This can be achieved through standardized, organization-wide training supplemented by role-based training for those directly involved in developing and using AI.
  4. Take inventory of AI systems: A detailed inventory of the AI technologies the organization develops, uses, or sells is essential. This includes documenting each system’s origin, functionality, inputs and outputs, and the jurisdictions in which it operates.
  5. Assess AI Act applicability: Once the inventory is complete, organizations should determine whether their AI technologies fall within the Act’s scope, including whether they qualify as prohibited or high-risk AI systems.
  6. Develop long-lead compliance measures: Given that the Act’s obligations will roll out over time, organizations should start developing risk management frameworks, updating vendor processes, and preparing for documentation and transparency requirements. This will ensure readiness before the enforcement deadlines.

In addition to these general steps toward EU AI Act compliance, Apostle and Schaedler offer recommendations specific to high-risk AI, general-purpose AI models, and AI systems that interact directly with individual users.

To learn more about EU AI Act compliance, see past coverage in Today’s General Counsel.
