On 2 February, member states of the European Union "approved the AI Act with an overwhelmingly favourable vote", putting the rules "on track to become law", stated Dragoș Tudorache, MEP. This unanimous vote in favour of the Artificial Intelligence Act ("AI Act") signals a significant step forward in AI regulation, and the Act is now expected to be formally adopted in March or April of this year.

This raises the question: how should my company prepare for what is to come?

The AI Act imposes rigorous requirements on companies operating within the European Union that design or utilise AI technologies. With substantial penalties for non-compliance, the regulation aims to ensure both responsible and ethical AI practices across various sectors.

Violations of the Act could attract fines ranging from €15 million or 3% of annual global turnover (whichever is higher) up to €35 million or 7% of annual global turnover for the most serious infringements, namely those involving prohibited AI practices.

In light of the impending enforcement of the AI Act, companies conducting business within the EU must prioritise compliance with the forthcoming legislation. Achieving compliance requires undertaking comprehensive gap analyses to identify which existing governance structures, policies, processes, and risk categories need enhancement to align with the Act's provisions. However, committing to and executing the steps needed to address such internal gaps presents a significant challenge, requiring alignment across the organisation's functions and departments.

To navigate the requirements of the AI Act effectively, the board of directors, C-suite executives, and managers must each adopt proactive measures to ensure compliance.



The board bears the ultimate responsibility for safeguarding the organisation against ethical, reputational, and regulatory risks associated with AI. Critical decisions facing the board include determining whether to pursue an AI Act-specific compliance program or a more comprehensive AI ethical risk management strategy.

The C-suite must lead the charge in designing, implementing, and scaling the compliance program, beginning with a thorough gap analysis to identify existing resources and potential alignment issues across various organisational functions. It is imperative to avoid relying solely on technology solutions and to prioritise continuous monitoring and the establishment of tailored metrics to track program effectiveness.

Managers, particularly those overseeing AI-related workflows, play a crucial role in operationalising compliance requirements within existing business processes. Customising workflows to accommodate evolving risk levels throughout the AI lifecycle, and prioritising role-specific learning and development initiatives, are therefore vital steps in ensuring adherence to regulatory standards.

As the EU prepares to roll out the AI Act in a few months’ time, companies must prioritise regulatory compliance alongside innovation.

Do you require assistance complying with the upcoming AI Act? GTG is here to help! Do not hesitate to contact Dr Ian Gauci for further clarification or assistance.

Disclaimer: This article is not intended to impart legal advice and readers are asked to seek verification of statements made before acting on them.