Training accounting staff to use AI effectively means building three capabilities: understanding what AI tools can and cannot do reliably, knowing how to review AI outputs to a professional standard, and developing the prompting skills to get consistent, useful results. Without all three, AI tools tend to be underused, misused, or both.
This guide sets out a practical training approach for accounting practices of any size, from sole practitioners through to multi-partner firms.
Why AI training is a professional requirement
The joint PCRT guidance published by the UK accounting bodies in January 2026 includes professional competence as one of the five principles governing AI use. The principle is straightforward: you should only use AI tools you understand sufficiently to review and evaluate the outputs.
This has a direct implication for training. If a member of staff uses an AI tool to produce client work without understanding what the tool does, what its failure modes are, and what they should check, they are not meeting the professional competence standard. The firm that deployed the tool without providing adequate training has created that risk.
Training accounting staff on AI is not optional. It is a professional obligation embedded in the ethical framework.
Assessing your starting point
Before designing training, assess where your team currently stands. A short survey or one-to-one conversations can establish:
- Which AI tools team members are already using (officially or unofficially)
- What they use them for and how confident they feel
- What they do not understand about how AI works
- Where they have had problems or near-misses with AI outputs
This assessment will reveal the variation across your team. Typically you will find some staff who are already proficient and using AI productively, some who are curious but uncertain how to start, and some who are sceptical or resistant. Training design should address all three groups differently.
It will also reveal shadow AI use — staff using personal AI tools on client data without formal authorisation. This needs to be addressed through training and policy, not just prohibition.
Core training content
Every member of staff who uses or may use AI tools in their work should receive training on the following:
How AI works (briefly)
Staff do not need a technical education in machine learning. They do need a working mental model: AI language models generate responses based on patterns in training data, not on real-time knowledge or verified facts. They produce confident text regardless of accuracy. They do not understand context the way a human reader does.
A ten-minute explanation of this — using an example of AI confidently stating a wrong tax rate or inventing a piece of legislation — is usually enough to calibrate staff expectations. The goal is not to make them sceptical of all AI output, but to make them appropriately critical rather than credulous.
Data privacy rules
Every staff member must understand which categories of client data may be used with which AI tools, and why. The rules should be simple and memorable: business-grade tools with signed DPAs are approved for specific uses; consumer or free tools are not to be used with client data; any uncertainty should be escalated before the data is shared, not after.
Explain why the rules exist in terms staff can understand: using the wrong tool could expose client financial data, create a GDPR breach, and result in regulatory investigation of the firm.
Review requirements by work category
Staff need to know what review is required before AI output is used, and what specifically they are reviewing for. Generic guidance to "check the output" is not sufficient. Effective review training specifies:
- What categories of error are most common in this type of AI output
- What sources to check specific claims against
- What escalation is required before the output is used or issued
Provide worked examples: show a piece of AI-drafted correspondence with errors embedded, and run through what should be caught in review and why.
Prompting skills
The quality of AI output depends significantly on the quality of the prompt. Staff who know how to write clear, specific prompts get better results with fewer corrections. Basic prompting skills can be taught in an hour and should cover:
- Being specific about the purpose, audience, and required format
- Providing all relevant context in the prompt rather than expecting AI to infer it
- Specifying any constraints (length, tone, inclusions, exclusions)
- How to iterate: refining the prompt when the first response is not right, rather than accepting a poor result
- When not to use AI: tasks that require real-time information, highly specific client calculations, or complex professional judgement
Training format and delivery
For most accounting practices, a two- to three-hour initial training session per cohort, delivered in small groups, is the most effective format. Sessions are more useful when they are hands-on — staff practising with the actual tools they will use on representative tasks — rather than slide-heavy presentations.
Structure the session in three parts: understanding AI limitations (30 minutes), data privacy and your firm's policies (30 minutes), and hands-on practice with the specific tools the firm uses (60 to 90 minutes). End with a short Q&A covering the edge cases staff are uncertain about.
Follow up the initial session with:
- Written guidance for reference: a one-page summary of approved tools, review requirements, and escalation contacts
- A feedback channel for questions that arise during day-to-day use
- Brief updates (15 to 30 minutes) when policies change, new tools are introduced, or the regulatory environment shifts
For new starters, AI training should be part of induction rather than a separate event scheduled weeks later.
Training senior staff and partners
Partners and senior staff often have the most influence on how AI is adopted in a practice, but are sometimes the least engaged with training. They may already have formed habits around AI tools — some good, some not — and may be resistant to a training format that feels junior.
Address this by framing partner-level training differently: not as instruction in how to use tools, but as a governance briefing on the firm's obligations and the risk framework. Cover: what the PCRT guidance requires, what the firm's data privacy obligations are, what good oversight looks like, and how to set appropriate expectations for team members doing AI-assisted work.
Partners who understand the professional risk framework are better placed to set the tone and enforce standards than those who simply defer to "the young ones are good at technology."
For a broader view of AI governance and culture in accounting practices, see our AI tools and technology for UK accountants hub.
Measuring training effectiveness
Training effectiveness can be measured in three ways:
Knowledge assessment: a short quiz at the end of training covering the key policies and review requirements. Not a formal examination — a tool to identify gaps that need reinforcement.
Error tracking: monitor the rate of AI-related errors reaching clients or requiring correction in the months after training, and compare it to the pre-training baseline. A falling error rate indicates the training is working.
Confidence surveys: a brief survey three months after training asking staff how confident they feel using AI tools correctly and whether they know what to do when uncertain. Low confidence scores indicate either that training needs reinforcing or that processes are not clear enough.
Building a learning culture around AI
AI capabilities are developing quickly. Training delivered today will need updating within twelve to eighteen months as tools evolve, professional guidance is refined, and the firm's own AI use expands.
Build a culture where staff are expected to stay current with AI developments relevant to their work, share useful discoveries with the team, and flag concerns or near-misses without fear of criticism. The most valuable AI learning in a practice often comes from team members who experiment, discover what works, and share it — not from formal training programmes alone.
Designate a person (or small group) to track AI developments in accountancy and bring relevant updates to the team quarterly. Professional body publications, ICAEW's technology resources, and sector-specific AI coverage all contain material that is directly relevant to UK accounting practices.
Key takeaways
- AI training is a professional obligation under PCRT (January 2026) — using AI tools without adequate competence to review outputs is an ethical breach.
- Train every staff member on the same four core areas: how AI works, data privacy rules, review requirements by work category, and prompting skills.
- Hands-on practice with the firm's actual tools is more effective than slide presentations — allocate at least 60 minutes of the initial session to practical exercises.
- Partner-level training should be framed as a governance briefing rather than tool instruction — focus on obligations, risk, and oversight standards.
- Build ongoing learning mechanisms — quarterly updates, a feedback channel, and a named person tracking AI developments — so training stays current as tools and guidance evolve.
Frequently asked questions
How long does it take to train accounting staff to use AI tools competently?
Most staff can reach a working level of competence in two to three hours of initial training, provided the training is hands-on and focused on the specific tools and workflows they will use. Full proficiency — consistently producing good prompts, conducting effective reviews, and knowing when not to use AI — typically develops over the first one to three months of regular use. Plan for a brief refresher session at three months to address questions that have arisen from real-world use.
Should junior and senior staff receive the same AI training?
The core content should be the same — everyone needs to understand AI limitations, data privacy rules, and review requirements. The delivery should be adapted to the audience. Junior staff benefit from more structured guidance on review criteria and escalation. Senior staff benefit from deeper treatment of professional governance obligations and their role in setting standards for the team. Partners need a governance-focused briefing on risk and oversight.
What happens if a staff member uses an unauthorised AI tool on client data?
Address it promptly and consistently. First, assess the extent of any data protection issue and notify the affected clients and the ICO if required. Second, retrain the staff member on data privacy rules and the approved tools policy. Third, review whether your training was sufficiently clear and whether the policy was communicated effectively. Repeated breaches require more formal action. The response should be proportionate — a first-time mistake by someone who did not understand the rules is different from a deliberate disregard of clear policy.
Do the professional bodies provide AI training resources for accountants?
Yes. ICAEW, ACCA, and CIOT all publish guidance and resources on AI in practice. ICAEW has a dedicated technology in practice section covering AI tools, ethics, and GDPR. ACCA has produced practical AI guides for practitioners. The joint PCRT guidance (January 2026) is a foundational resource that all staff should be familiar with. These resources can supplement internal training but should not replace hands-on practice with your firm's specific tools and processes.
How should I handle staff resistance to AI adoption?
Resistance is usually rooted in one of three concerns: fear of job loss, uncertainty about how to use the tools correctly, or scepticism about whether AI actually works. Address each directly. On job loss: explain how the firm plans to redeploy freed capacity — typically toward advisory work or additional clients, not toward redundancy. On uncertainty: invest in genuine training rather than expecting staff to figure it out themselves. On scepticism: demonstrate the tools working well on real tasks with the resisting staff member present, and acknowledge that not every AI tool is suited to every workflow.