Market Pulse

Europe’s AI Act is quietly rewiring the modern enterprise.

12 December 2025 | AIMG
In Brussels, regulation is rarely content to sit politely at the edge of the market. The European Union’s Artificial Intelligence Act, formally Regulation (EU) 2024/1689, does not merely tell companies what not to do. It tells them how to run the machine: how to document it, monitor it, test it, and explain it, especially when it might bruise citizens’ rights.

Executives are tempted to treat this as a technology problem: give it to the data scientists, buy a compliance tool, and move on. That is a category mistake. The Act’s design is organisational. It assumes AI systems will be developed and used across borders and supply chains, and it captures not only European firms but also certain non‑European providers and deployers when the output is used in the Union. That alone is enough to make “EU exposure” a boardroom phrase in New York and San Francisco.

The Act’s bite is sharpest where the risk is highest. High‑risk AI systems must be wrapped in an internal discipline familiar from product safety and financial control: a lifecycle risk management system; data governance practices; technical documentation; record‑keeping and logs; transparency sufficient for proper use; human oversight; and an appropriate level of accuracy, robustness and cybersecurity. Providers must also be able to demonstrate compliance, often via conformity assessments, before systems are placed on the market or put into service.

This is where the Act begins to rearrange the corporate wiring. A recent paper on the AI Act’s “silent impact” argues that dense technical obligations do not land neatly on “the company” in the abstract. They land on people, often the ones who do not give keynote speeches. Board secretaries (the corporate secretariat), compliance officers and in‑house counsel become the translators between statutory ambition and operational reality.

Consider the corporate secretariat, usually viewed as the custodian of process. Under the AI Act, the process is the point. The board must be able to show it understands where AI is used, what it does, what could go wrong, and what controls exist. The paper sketches a role for board secretaries that looks less like administrative support and more like governance engineering: embedding AI into enterprise risk management, coordinating conformity and documentation pathways, designing incident escalation, and aligning AI disclosures with broader reporting.

In a world where litigation and regulation feast on “unknown unknowns”, the minutes matter more than ever.

The compliance function, meanwhile, is asked to do what it does best: turn principles into checklists, and checklists into evidence. But the Act moves compliance upstream. Risk management is not something performed after deployment; it is “established, implemented, documented and maintained” throughout the lifecycle. Post‑market monitoring – collecting and analysing performance data to ensure continuing compliance – becomes a structured requirement, not a nice‑to‑have. The paper describes compliance officers as facing heightened duties to build robust frameworks, run impact assessments and handle regulatory reporting.

That is another way of saying: more paperwork, yes, but also more authority – if the organisation allows it.

In‑house counsel, for their part, inherit the messy borderlands: liability allocation, contractual safeguards, disclosure risk and the perennial mismatch between what regulators want disclosed and what firms want kept secret.

When AI is bought from vendors – as it often is – Legal’s work shifts from negotiating price to negotiating proof: warranties of compliance, audit rights, cooperation duties and incident reporting clauses.

The Act turns “papering the file” into something closer to product governance.

Technology teams will not escape, of course. They are the ones who must build systems that can be explained, logged, overseen and secured. Yet the biggest change for engineers may be cultural: they are being asked to develop AI like safety‑critical software, even when the product looks more like a website feature than a medical device.

Then there is the data problem, which is really a management problem. Article 10’s insistence on relevant, representative datasets and on bias detection and mitigation will collide with the reality of historical data: incomplete records, skewed outcomes, and the fact that “representativeness” is often in the eye of the regulator. This pulls data governance and privacy teams into the centre, and it forces tighter coordination with counsel and compliance.

Why will companies take this seriously? Because Europe has made non‑compliance expensive. For certain infringements – such as prohibited AI practices – the Act contemplates administrative fines up to EUR 35m or 7% of global turnover (whichever is higher). That is large enough to be noticed by boards, insurers and auditors, which is the EU’s preferred method of making rules real.
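The arithmetic behind that cap is simple enough to sketch in a few lines. The example below is purely illustrative (not legal advice, and the turnover figures are invented); it only applies the Act’s headline formula of EUR 35m or 7% of global annual turnover, whichever is higher:

```python
def max_fine_eur(global_turnover_eur: float) -> float:
    """Headline fine cap for prohibited-practice infringements under
    the AI Act: EUR 35m or 7% of global annual turnover, whichever
    is higher. Illustrative only; actual fines are set case by case."""
    return max(35_000_000.0, 0.07 * global_turnover_eur)

# A firm with EUR 2bn turnover: 7% (EUR 140m) exceeds the flat cap.
print(max_fine_eur(2_000_000_000))  # 140000000.0
# A firm with EUR 100m turnover: the flat EUR 35m floor applies.
print(max_fine_eur(100_000_000))    # 35000000.0
```

The floor matters: for smaller firms the fixed EUR 35m can dwarf the percentage figure, which is precisely what makes the exposure board‑level rather than budget‑line.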

The politics around the Act remain fluid. The European Commission has recently proposed delaying parts of the stricter “high‑risk” regime to December 2027, amid pressure to cut red tape – though this still needs to run the legislative gauntlet. In parallel, guidance efforts for advanced models continue. The practical implication for firms is paradoxical: the timeline may wobble, but the direction is set. The EU has built a template others will borrow, and it is doing so with a regulator’s instinct for process.

The most interesting consequence, then, is not that companies will hire more lawyers (they will). It is that AI will stop being treated as a clever project and start being treated as a governed system – something inventory‑listed, risk‑rated, monitored, audited and minuted. The AI Act makes this organisational shift unavoidable. For firms that embrace it, compliance may become a competitive advantage: fewer unpleasant surprises, more defensible products, and a clearer story for regulators and customers. For those that do not, Europe will offer an education – priced by percentage point.

Source: AIMG Research