AI regulation moves closer to the finance function

AI is transforming corporate finance, but new regulation is closing in. The EU AI Act and UK regulator guidance mean CFOs must prepare for stricter governance, transparency, and accountability around the use of AI in forecasting, compliance, and reporting.

For years, CFOs have overseen compliance with accounting, tax, and corporate governance rules. Now, artificial intelligence is joining the list of regulatory priorities. With the EU AI Act formally adopted in 2024 and UK regulators signalling similar oversight, finance leaders will soon face new obligations around how AI is used in decision-making, risk management, and reporting.

The EU AI Act sets the tone

The EU AI Act, described as the world’s first comprehensive AI law, introduces a risk-based framework for AI systems. Applications deemed “high-risk” — including those used in credit scoring, fraud detection, and employee management — will face strict transparency, governance, and accountability requirements. For companies with operations or customers in the EU, compliance will not be optional.

While much of the focus has been on the technology sector, the rules cut across industries, including financial services and corporate finance functions. CFOs will need to understand where AI is embedded in forecasting models, payment systems, or supply chain monitoring, and ensure those tools meet regulatory standards.

UK regulators take their own path

The UK has opted against a single AI law, instead tasking existing regulators with providing guidance within their own remits. The Financial Conduct Authority (FCA) and Prudential Regulation Authority (PRA) have both flagged AI as a supervisory priority, with particular attention to algorithmic decision-making, model governance, and explainability.

For CFOs, this means increased scrutiny of how AI influences financial reporting, credit assessments, and compliance processes. The Treasury has also confirmed that sector regulators will be expected to enforce standards as AI adoption accelerates.

Why it matters for CFOs

AI is already shaping corporate finance functions, from predictive analytics in forecasting to automated expense management and fraud prevention. The attraction is clear: lower costs, faster insights, and stronger controls. Yet regulators are concerned about bias, opacity, and overreliance on black-box systems.

For finance leaders, the emerging regulatory environment introduces both risks and responsibilities. Key areas include:

  • Governance: CFOs will need frameworks to demonstrate oversight of AI models used in finance and operations.
  • Transparency: Systems influencing financial decisions may need to be explainable to auditors and regulators.
  • Vendor management: Outsourced tools and platforms using AI will fall under compliance obligations, requiring more stringent due diligence.

Preparing for what comes next

The timeline for enforcement varies: the EU AI Act will apply in stages between 2025 and 2026, while the UK's regulator-led approach will evolve gradually. But CFOs should not wait. Early preparation means mapping where AI is already in use, establishing clear lines of accountability, and embedding risk assessments into finance operations.

The regulatory perimeter may be shifting, but the core challenge for CFOs remains familiar: balancing innovation with compliance. AI promises efficiency and insight, but it is clear that regulators want CFOs firmly in control of how it is deployed.
