Your Auditor and AI

Your Auditor Is Using AI. Here's What the FRC's New Guidance Actually Means for Your Business

On 30 March 2026, the Financial Reporting Council published guidance on how audit firms should use generative and other emerging AI tools in their work. If you are a business that goes through an audit, this is worth understanding, even if the technical detail is not your concern.

The short version: AI is coming to the audit room, the regulator has set out clear expectations for how it should be used, and the responsibility for getting it right stays firmly with your audit firm and the partner leading your engagement.

What AI Is Actually Being Used for in Audits

Audit firms have been quietly building AI tools into their workflows for some time now. The FRC's new guidance brings that into the open and sets a framework for doing it responsibly.

Examples of potential applications include summarising board minutes and reviewing contracts for revenue recognition testing. These are tasks that would traditionally take a junior auditor several hours. AI can work through them far faster, freeing up the audit team to focus on judgment-intensive work rather than mechanical processing.

Other applications include transaction analysis, supporting aspects of analysis relevant to areas such as going concern, and documentation review. The common thread is that AI handles volume and pattern recognition, while human professionals are expected to interpret the outputs, exercise judgment, and take responsibility for conclusions.

The Accountability Principle Stays the Same

The most important message in the FRC's guidance is also the simplest: technology changes, but accountability does not.

No matter how sophisticated the AI tools an audit firm uses, the Responsible Individual (the audit partner signing off your accounts) remains fully accountable for audit quality. This applies even where an AI-generated output is flawed, whether through fabricated content (often referred to as ‘hallucinations’), data distortions, or a straightforward misapplication of a standard.

The FRC is explicit: audit firms cannot shift blame to a tool. If an AI system produces a flawed output and the engagement team accepts it without adequate human review, that is a quality failure by the firm, not a technology failure.

For business owners and finance leaders, this should be reassuring. The professional judgment and accountability you expect from your audit firm have not been delegated to a machine.

What to Expect From Your Audit Firm

The FRC's framework for responsible AI use in audit rests on four elements: system design and controls, validation and appropriate oversight of the tools being used, staff training and governance, and human review at key points. Well-run audit firms should be able to explain how each of these applies to their work.

If your firm is using AI tools, you should expect the engagement team to remain in control of the process. AI-assisted audit work should be reviewed by qualified professionals before any conclusions are drawn. The audit partner leading your engagement should be able to explain, at a high level, how technology is being used on your audit and what oversight is in place.

At Black Maple, our audit and assurance work is partner-led by design. Any use of technology is subject to partner-level review and tailored to your specific risk profile and the judgment calls that matter for your business. Technology plays a role in how we work efficiently, but it does not replace the professional oversight that sits at the heart of a quality audit.

Questions Worth Asking

If you are about to start an audit cycle or are reviewing your current arrangements, these are reasonable questions to raise with your audit firm:

  • Are you using AI tools on our engagement, and if so, what for?

  • How do you review AI-generated outputs before they inform audit conclusions?

  • Who is responsible for the quality of AI-assisted work on our audit?

  • How do you handle situations where an AI output turns out to be wrong?

A firm that can answer these clearly and confidently is one that has thought carefully about how it adopts new technology. Hesitation or vagueness on these points is itself informative.

The Bigger Picture

The FRC's guidance reflects a broader shift in professional services. AI tools offer real efficiency gains and can improve the consistency and depth of audit procedures when used well. But the profession is still at an early stage of working out how to govern this properly.

The firms that will do this best are those that treat AI as a complement to professional judgment, not a replacement for it. The FRC has set a clear expectation on that front. The question now is how consistently it is met across the profession.

If you would like to talk through what this means for your business and what good audit practice looks like in an AI-assisted world, get in touch with the Black Maple team.

