Governance · February 24, 2026 · 10 min read

AI Governance: what roles and responsibilities should your organisation establish?

AI Compliance Officer, AI Governance Committee, extended DPO: the AI Act does not prescribe a single organisational structure, but it does impose clear responsibilities. How should you structure them?

Why AI governance is an obligation, not an option

Article 4 of the AI Act requires providers and deployers of AI systems to take measures to ensure their staff have a sufficient level of AI literacy. Article 26 requires deployers of high-risk systems to assign human oversight to persons with the necessary competence and authority. These obligations imply a formalised organisational structure.

Key AI governance roles

AI Compliance Officer — Responsible for AI Act compliance at organisational level. Point of contact for national supervisory authorities. Manages the AI systems register and annual audits.

AI Governance Committee — Decision-making body bringing together CEO, CIO, DPO, CISO, Legal and business units. Validates decisions on adopting new AI systems and governance policies.

AI System Owner — Business owner of a specific AI system. Ensures day-to-day human oversight and escalates incidents to the AI Compliance Officer.

Policies to document

  • AI system adoption and assessment policy
  • AI risk management policy
  • Human oversight policy
  • AI incident management policy
  • AI training and competency policy

The AI systems register

Article 49 of the AI Act requires registration of high-risk systems in the EU database. Before that step, the Valyence™ AI Systems Register serves as your organisation's internal reference: a complete inventory with the regulatory classification, compliance status and designated owner of each system.
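To make the idea of an internal register concrete, here is a minimal sketch of what one entry might look like in code. The field names, risk categories and helper function are illustrative assumptions, not the schema of the Valyence™ product or of the EU database.

```python
from dataclasses import dataclass
from enum import Enum


class RiskClass(Enum):
    """Illustrative AI Act risk tiers (simplified for this sketch)."""
    PROHIBITED = "prohibited"
    HIGH_RISK = "high-risk"
    LIMITED_RISK = "limited-risk"
    MINIMAL_RISK = "minimal-risk"


@dataclass
class AISystemEntry:
    """One line of the internal AI systems register (hypothetical fields)."""
    name: str
    risk_class: RiskClass
    compliant: bool          # current compliance status
    owner: str               # the designated AI System Owner


def high_risk_entries(register: list[AISystemEntry]) -> list[AISystemEntry]:
    """Filter the entries that would also need EU database registration (Art. 49)."""
    return [e for e in register if e.risk_class is RiskClass.HIGH_RISK]


register = [
    AISystemEntry("CV screening tool", RiskClass.HIGH_RISK, False, "HR Director"),
    AISystemEntry("Marketing chatbot", RiskClass.LIMITED_RISK, True, "Head of Marketing"),
]
```

Even a sketch this small supports the governance loop described above: the AI Compliance Officer filters for high-risk entries to drive registrations and audits, and each entry names the AI System Owner to whom incidents escalate.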

Need expert guidance?

Assess your regulatory exposure

A Valyence™ AI Act Strategic Audit in 2 to 4 weeks.

Request an audit