AI Compliance Checklist for CPA Firms: The 5 Obligations in 2026
Published May 12, 2026 · By The Crossing Report · 18 min read
The 5-Step AI Compliance Checklist for CPA Firms
CPA and accounting firms face more overlapping AI compliance obligations than virtually any other professional services firm in 2026. AICPA professional standards, IRS preparer rules, state AI laws, GLBA data security requirements, and professional liability exposure all converge — simultaneously. Here is the 5-step checklist:
- AICPA Standards — Confirm competence and supervision obligations are met for every AI tool used in client engagements
- IRS Preparer Rules — Verify paid preparer signature and due diligence requirements are satisfied for all AI-assisted returns
- State AI Laws — Assess Colorado ADMT, Washington HB 2225, and Oregon SB 1546 applicability for your client base
- Client Data Security — Review AI vendor data flows against GLBA obligations; update vendor agreements
- Professional Liability — Check E&O policy language for AI carve-outs; update engagement letters with AI disclosure clause
The sections below cover each obligation in full. For the broader state-law compliance picture across all professional services firms, see our AI regulation compliance hub for professional services firms.
Why CPA Firms Face More AI Compliance Layers Than Other Professions
An 8-attorney law firm using AI faces one primary compliance layer: the ABA and state bar ethics rules. A 15-person consulting firm faces state AI employment laws and professional liability considerations. A CPA firm using AI faces all of the following, simultaneously:
- AICPA professional standards — including the Statements on Standards for Tax Services and the Code of Professional Conduct, both of which apply to AI-assisted engagements
- IRS rules — paid preparer regulations and Circular 230 obligations that govern AI use in tax preparation
- State AI laws — Colorado ADMT, Oregon SB 1546, Washington HB 2225, and others that apply based on your client geography
- GLBA data security — CPA firms are financial institutions under the Gramm-Leach-Bliley Act; client financial data processed by AI vendors requires documented security governance
- Professional liability — E&O coverage that may include AI carve-outs your carrier has not told you about
No other professional services firm category faces that combination. Law firms don't have GLBA. Consulting firms don't have AICPA standards. Staffing firms don't have IRS preparer rules. Accounting firm owners adopting AI in 2026 are navigating a compliance environment more complex than almost anyone giving them generic AI advice understands.
This guide walks through each obligation specifically. It is not legal advice — it is an operations brief for firm owners who need to know what to do, not just what exists.
Obligation 1 — AICPA Professional Standards
AICPA Statement on Tax Standards and AI-Generated Returns
The Statements on Standards for Tax Services (SSTSs) are the AICPA's authoritative guidance on CPA conduct in tax engagements. They have not been amended to address AI, and they don't need to be — the existing standards already govern AI use clearly.
SSTS No. 1 requires that the CPA "not recommend a position that the member knows has no realistic possibility of being sustained." SSTS No. 6 covers the signing CPA's responsibilities for tax returns prepared by others. Neither standard creates an exception for AI-generated work. The signing CPA is responsible for the accuracy of every return they sign — including returns where the initial draft was produced by AI.
In practical terms: if you use an AI tool to prepare a first-pass return and a staff accountant reviews and signs it, the signing accountant is fully responsible for its accuracy. The AI's role in the preparation is legally irrelevant to the standard of care analysis. This is not a theoretical liability scenario — it is the existing rule applied to new tools.
The immediate implication: your firm needs a documented review protocol specifically for AI-assisted returns. Who reviews AI output? What does the review cover? How is the review documented? These questions need answers before your first AI-assisted return is signed.
AICPA AI Ethics Framework (2025–2026 Guidance)
The AICPA has issued guidance on AI in the profession through its ethics and professional standards framework. The core obligations come down to two principles that every CPA firm owner needs to understand before deploying AI in client work.
Competence. The AICPA Code of Professional Conduct requires CPAs to maintain competence in the tools and methods they use. For AI tools, this means understanding what the tool does, how it generates outputs, and where its limitations lie — well enough to supervise its work. You do not need to be a machine learning engineer. You do need to understand whether the AI tool you're using for tax research is trained on current tax law, how it handles ambiguous fact patterns, and how to recognize when its output is wrong.
The competence obligation has a practical implication for firm owners: deploying AI in client engagements before your staff understands how to use it correctly is an AICPA violation waiting to happen. Training is not optional.
Independence. For attest clients, independence rules create additional complexity around AI vendor relationships. AI vendors that process your attest clients' financial data may trigger independence considerations. Specifically: if an AI vendor's tool accesses your attest client's financial records as part of your engagement workflow, the independence analysis for that client needs to include that vendor relationship. The standard didn't change — the tool did. Work through this with your engagement partners before deploying AI in audit or review engagements.
Obligation 2 — IRS Rules on AI in Tax Preparation
The Paid Preparer Signature Rule Still Applies
IRS regulations on paid preparers were written before AI existed. They do not need to be rewritten to cover AI — the existing rules already do.
The paid preparer of record is the individual who prepares or substantially assists in the preparation of a return. That person must sign the return, provide their PTIN, and accept responsibility for its accuracy. There is no provision in IRS regulations for delegating preparer responsibility to a software system, automated tool, or AI.
Form 8867 (Paid Preparer's Due Diligence Checklist) covers the due diligence requirements for the Earned Income Tax Credit, Child Tax Credit, American Opportunity Tax Credit, and Head of Household filing status, and it requires the preparer to verify the accuracy of eligibility determinations. The due diligence obligation is on the preparer — not the tool. If AI flags a client as eligible for EITC and the preparer signs without independent verification, the preparer bears the due diligence failure if that determination is wrong.
This is the compliance posture the IRS has maintained through multiple rounds of automated tax preparation technology. AI is not different. The human preparer is responsible.
The Gap in IRS Guidance
As of Q1 2026, the IRS has not issued a revenue procedure, notice, or guidance document specifically addressing AI-generated tax returns. This is a meaningful gap — and not a permissive one.
The absence of AI-specific IRS guidance does not mean existing rules don't apply. It means existing rules apply without any AI-specific interpretive gloss. Circular 230 Section 10.22 imposes a due diligence standard on all practitioners: you must make reasonable inquiries if information furnished to you appears to be incorrect, inconsistent, or incomplete. That standard applies equally to AI-generated information appearing in a client's return.
Revenue Procedure 2021-40, which governs electronic filing requirements, does not address AI preparation specifically. But it requires that e-filed returns accurately represent the authorized return — an authorization the human preparer must provide. Automated filing of an AI-prepared return without preparer review is not compliant with the e-file authorization requirements regardless of what the AI produced.
The practical gap: CPA firms don't have IRS guidance to point to for AI use the way law firms can point to ABA Formal Opinion 512. The IRS is likely to fill this gap before the end of the 2026 tax season, but until it does, firms should operate under the assumption that the existing paid preparer framework governs fully. For the latest analysis of this gap and what it means for CPA firms specifically, see our analysis of IRS guidance on AI tax preparation.
Obligation 3 — State AI Laws That Hit Accounting Firms
Colorado ADMT — June 30, 2026 Deadline
Colorado's Algorithmic Decision Management Technology framework (SB 25-318) is the only US state AI law with an imminent deadline that directly affects professional services firms: it takes effect June 30, 2026.
The ADMT covers "high-risk AI systems" that make "consequential decisions" — decisions with a significant effect on individuals in domains including financial services, employment, credit, and healthcare. For accounting firms, the relevant question is whether the AI tools you use make consequential decisions about clients in those domains.
Tax preparation tools that generate automated return outputs, AI-assisted financial planning tools that generate investment or savings recommendations, and audit tools that automatically flag client transactions may qualify as high-risk AI systems under this definition. The analysis is tool-specific.
The small business exemption covers accounting firms with fewer than 50 full-time employees that use commercially available, unmodified AI tools. If your firm qualifies, the exemption protects you from most ADMT obligations — but the exemption is not automatic. You need to:
- Confirm your FTE count is below 50
- Identify which AI tools you use in consequential decision contexts
- Confirm those tools are commercially developed and not customized by your firm
- Document this qualification in writing — a one-page memo is sufficient
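The exemption test above is a simple decision procedure, so it can be sketched in code. This is an illustrative helper only — the function name, field names, and data structure are assumptions for the sketch, not anything defined by the Colorado statute — but it captures the three qualification questions: FTE count below 50, and no customized tools used in consequential-decision contexts.

```python
# Hypothetical sketch of the Colorado ADMT small-business exemption check
# described above. Field names are illustrative; this encodes the checklist
# logic, not legal advice.

def admt_exemption_qualifies(fte_count: int, tools: list[dict]) -> tuple[bool, list[str]]:
    """Return (qualifies, list of disqualifying reasons)."""
    problems = []
    # Checklist item 1: FTE count must be below 50.
    if fte_count >= 50:
        problems.append(f"FTE count {fte_count} is not below 50")
    # Checklist items 2-3: any tool used in a consequential-decision context
    # must be commercially available and unmodified by the firm.
    for tool in tools:
        if tool.get("used_in_consequential_decisions") and tool.get("customized"):
            problems.append(
                f"{tool['name']} is customized and used in consequential decisions"
            )
    return (not problems, problems)

ok, reasons = admt_exemption_qualifies(
    fte_count=12,
    tools=[
        {"name": "TaxDraftAI", "used_in_consequential_decisions": True, "customized": False},
        {"name": "SchedulerBot", "used_in_consequential_decisions": False, "customized": False},
    ],
)
print(ok)  # prints True for this example firm
```

The output of a run like this, plus the tool list itself, is essentially the one-page qualification memo the checklist calls for.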
For firms over 50 FTEs, or firms that have configured or customized AI tools for their practice, the ADMT requires impact assessments for each covered AI system, disclosure to affected individuals, and annual review. The full Colorado compliance walkthrough is available in our Colorado ADMT compliance guide for professional services firms.
State Chatbot Disclosure Laws
Three state laws are directly relevant to accounting firm AI communications in 2026.
Oregon SB 1546 (effective January 1, 2026) requires disclosure when anyone interacts with AI rather than a human in a commercial context. For accounting firms: AI-powered intake chatbots on your website, automated scheduling tools that communicate with clients, AI-generated follow-up emails that could be mistaken for human responses — all require disclosure. The law carries a private right of action, meaning clients don't need to wait for Oregon's attorney general to enforce it. They can sue directly. A two-sentence disclosure in the relevant touchpoint is the fix.
Washington HB 2225 (effective June 22, 2026) imposes similar AI transparency requirements. If you have Washington-based clients or a Washington presence, your client-facing AI communications need to be compliant by June 22.
Georgia SB 540 (pending Governor's signature as of early May 2026) would create a June 2026 effective date for chatbot disclosure in Georgia. If your firm has Georgia clients, monitor this law's final status.
The full state-by-state tracker for these laws and their coverage dates is in our state AI chatbot laws compliance guide for professional services.
Obligation 4 — Client Data Security Under AI
Gramm-Leach-Bliley and AI Vendor Data Flows
CPA firms are financial institutions under the Gramm-Leach-Bliley Act. This classification creates obligations that most accounting firm owners know apply to their data practices — but don't always apply to AI vendor relationships specifically.
GLBA's Safeguards Rule requires covered financial institutions to implement and maintain a comprehensive information security program. That program must cover service provider oversight: when a third-party vendor accesses your client financial data, you are responsible for ensuring they protect it appropriately. AI vendors are third-party service providers under the Safeguards Rule.
If your team is entering client Social Security numbers, income data, financial statements, or tax records into an AI tool, that AI vendor is processing covered financial data. Your GLBA compliance requires:
- A written vendor agreement that specifies how the vendor will protect that data
- A reasonable security assessment of the vendor's practices
- Contractual provisions requiring the vendor to notify you of security incidents
This does not mean you cannot use AI. It means you need vendor agreements in place before your staff enters client data into AI tools. Most major AI platforms (OpenAI, Anthropic, Microsoft) offer enterprise agreements that include data processing and security commitments. Consumer-tier accounts — the $20/month ChatGPT plan — typically do not.
The practical step: conduct an AI tool inventory (see Obligation 5 for the template), identify which tools process client financial data, and confirm you have executed data processing agreements with those vendors.
SOC 2 Considerations When Using AI Tools
If your firm holds a SOC 2 certification or provides services where SOC 2 is relevant to your clients, adding AI tools to your practice affects your SOC 2 scope.
Cloud AI tools that process client data are in scope for your SOC 2 engagement if they interact with client systems or data covered by the trust services criteria. A tax preparation AI tool that connects to a client's accounting software is within scope. A general-purpose AI assistant used only for internal drafting is likely outside scope — but document that decision.
Your SOC 2 auditor needs to know which AI tools you've added to your practice and how you've assessed their security posture. Disclosing this proactively — rather than having the auditor discover new AI tool use during fieldwork — is the right approach.
For broader context on data security governance when using AI, including law firm parallels that apply equally to accounting firms, see our AI data security guide for professional services firms.
Obligation 5 — Professional Liability and Malpractice Risk
Does AI Malpractice Insurance Cover CPA Firms?
The standard Errors & Omissions policy for accounting firms was written before AI-assisted work product was a common scenario. That means the AI coverage question — does your E&O policy cover a claim arising from an AI-generated error in a client deliverable? — is not answered in the policy language most firms are renewing with.
Professional liability carriers for accountants are handling this in two ways. Some are adding AI endorsements that explicitly confirm coverage for AI-assisted work, subject to the firm maintaining documented review protocols. Others are adding AI carve-outs or exclusions — provisions that specifically exclude coverage for claims arising from AI-generated work product that was not reviewed by a licensed professional. The problem: most firms don't know which type they have until they need to file a claim.
The action item is simple: pull your current E&O policy and read the exclusions section. Call your broker and ask directly: "Does my policy cover professional liability claims arising from AI-assisted work product, and are there conditions or exclusions I should know about?" Do this before renewal, not after a claim.
Firms that maintain documented AI governance — a written AI usage policy, review requirements, staff training records — are consistently in a better position at renewal than firms that cannot demonstrate any governance. Carriers are asking these questions at renewal now.
Engagement Letter Language for AI-Assisted Work
Adding an AI disclosure clause to your engagement letters is both best practice and, in some states, required. The AICPA's guidance on client communications recommends transparency about AI use in engagements. Oregon SB 1546 creates legal disclosure requirements for AI-assisted client communications. And from a professional liability perspective, an engagement letter that discloses AI use and confirms professional supervision is a documented defense.
The minimum AI engagement letter clause for CPA firms has two components:
Disclosure statement: "The firm may use artificial intelligence-assisted tools in the delivery of services, including in tax preparation, financial analysis, research, and document drafting."
Supervision statement: "All outputs generated by AI-assisted tools are reviewed and supervised by licensed CPAs before delivery to clients. The professional judgment and responsibility of the firm's licensed staff governs all work product."
Customize the first statement to name the specific AI categories you use (tax preparation AI, general-purpose AI assistants, audit software AI). The supervision statement should be verbatim or substantially similar — it is the language that satisfies both the AICPA's competence and supervision expectations and the professional liability defense.
For the full engagement letter framework including firm-type-specific customization, see our AI engagement letter compliance guide.
The 5-Step AI Compliance Audit for CPA Firms
This is the operating checklist. Work through it once, document your results, and schedule a quarterly review.
Step 1: Inventory all AI tools in use. List every AI tool your firm uses — tax preparation AI, general-purpose AI assistants, audit software with AI features, document review tools, scheduling and intake tools. For each tool, record: the vendor, the use case, whether it processes client financial data, and when it was added to firm use. This is your AI tool inventory. A spreadsheet with five columns takes 15 minutes to create and satisfies the documentation requirements for Colorado ADMT exemption qualification, GLBA vendor oversight, and E&O policy disclosure.
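The five-column inventory described above can live in any spreadsheet, but a minimal sketch makes the structure concrete. The column names below follow the text; the vendor and tool entries are illustrative placeholders, not recommendations. The same table doubles as the input for the GLBA check in Obligation 4: any row flagged as processing client financial data needs an executed data processing agreement.

```python
# Minimal sketch of the five-column AI tool inventory described above,
# written out as CSV so it can be filed with firm workpapers.
# Tool and vendor names are illustrative placeholders.
import csv
import io

COLUMNS = ["vendor", "tool", "use_case", "processes_client_financial_data", "date_added"]

rows = [
    {"vendor": "ExampleAI", "tool": "TaxDraftAI",
     "use_case": "first-pass 1040 drafts",
     "processes_client_financial_data": "yes", "date_added": "2026-01-15"},
    {"vendor": "ExampleCo", "tool": "SchedulerBot",
     "use_case": "client intake scheduling",
     "processes_client_financial_data": "no", "date_added": "2025-11-03"},
]

# Write the inventory as CSV.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=COLUMNS)
writer.writeheader()
writer.writerows(rows)

# Tools flagged "yes" are the ones that need GLBA data processing
# agreements (Obligation 4) before staff enter client data.
needs_dpa = [r["tool"] for r in rows if r["processes_client_financial_data"] == "yes"]
print(needs_dpa)  # prints ['TaxDraftAI']
```

Whether you keep this as a script or a shared spreadsheet, the point is the same: one row per tool, reviewed quarterly, with the data-flow column driving the vendor agreement check.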
Step 2: Check AICPA guidance on competence for each tool category. For every AI tool that touches client work, confirm that the staff using it can explain how it works well enough to supervise its output. This doesn't require engineering knowledge — it requires that your accountants understand the tool's limitations and can recognize when its output is wrong. If they can't, training is required before the tool is used in client engagements.
Step 3: Verify IRS preparer responsibility requirements are met for AI-assisted returns. Confirm that every AI-assisted return is reviewed by the signing preparer before filing, that Form 8867 and comparable due diligence requirements are verified by a human, and that your firm has a documented review protocol for AI-prepared returns. The protocol doesn't need to be elaborate — a one-page checklist that the signing CPA completes and files with the return workpapers satisfies the documentation requirement.
Step 4: Assess state AI law applicability for your client base. Run your client geography against the state AI laws currently in effect. Colorado ADMT applies to consequential decisions affecting Colorado residents — if you have Colorado clients, document your ADMT analysis. Oregon SB 1546 applies to AI-assisted client communications — if you use AI in client intake or communications, add disclosure language. Washington HB 2225 applies June 22, 2026. Check Georgia SB 540 status. If your client base is concentrated in a few states, this analysis takes under an hour.
Step 5: Review client data flows with AI vendors — update engagement letters. Match your AI tool inventory against your GLBA vendor oversight requirements. For every AI tool processing client financial data, confirm you have an executed data processing agreement with the vendor. Update your standard engagement letter to include the AI disclosure clause. Schedule a review of your E&O policy with your broker before your next renewal date.
FAQ — AI Compliance for CPA and Accounting Firms
Do AICPA professional standards apply to AI-generated tax returns?
Yes. The SSTSs require the signing CPA to review and take responsibility for accuracy regardless of whether AI generated the initial draft. The AICPA has not prohibited AI use, but competence and supervision obligations apply — and the signing preparer is responsible for every return they sign. AI does not transfer that liability.
Does the IRS have specific rules about AI in tax preparation?
As of Q1 2026, the IRS has not issued direct guidance specifically addressing AI tax prep. However, existing paid preparer rules apply in full: the preparer of record is legally responsible for accuracy, Form 8867 due diligence requirements apply regardless of how the return was prepared, and the preparer signature cannot be delegated to an AI system. The IRS has increased scrutiny of AI-assisted returns. See our analysis of the IRS guidance gap for CPA firms for the full picture.
Does Colorado ADMT apply to my accounting firm?
Colorado's ADMT applies to "consequential decisions" affecting Colorado residents — client financial advisory recommendations generated by AI may qualify. The June 30, 2026 deadline applies. For firms under 50 FTEs using commercial, unmodified AI, the small business exemption likely applies — but must be documented. See our Colorado ADMT compliance guide for the documentation steps.
Do state chatbot disclosure laws affect CPA firms?
Yes, if your firm uses client-facing AI for intake, scheduling, or FAQ functions. Washington HB 2225 (June 22, 2026) and Oregon SB 1546 (January 2026) apply to accounting firms. Oregon's law carries a private right of action. The full tracker is in our state AI chatbot laws guide.
What engagement letter language should I add for AI use?
Two elements: a disclosure statement (naming AI use in your service delivery) and a supervision statement (confirming licensed CPA review of all AI outputs before client delivery). This satisfies AICPA guidance, IRS Circular 230 obligations, and state disclosure requirements. See our AI engagement letter compliance guide for exact clause language.
Is there an AI malpractice risk for CPA firms?
Yes. AI errors in client deliverables create professional liability exposure under the existing standard of care — the professional is responsible for the work product regardless of how it was prepared. Standard E&O policies may include AI carve-outs. Review your policy language with your broker before your next renewal, and confirm your coverage explicitly extends to AI-assisted work product.
The Crossing Report covers AI adoption for professional services firm owners. This guide reflects the regulatory landscape as of May 2026 and will be updated as AICPA guidance, IRS rules, and state AI laws evolve.
Related Reading
- The IRS Has No Rules for AI Tax Prep — And That's Your Firm's Problem, Not Theirs
- The $45-Per-Return Benchmark: What Juno and TaxGPT Mean for Your Tax Prep Pricing
- 88% of Accountants Think AI Is the Most Important Technology in History. Only 8% Are Ready for It.
- PantherAccounting Plus: Can PracticePanther Finally Replace QuickBooks for Law Firm Accounting?
- The Seven-Step AI Workflow That Cuts IRS Notice Resolution in Half
- Your Staff Is Already Using AI With Client Data. Here's What to Do About It.
- The IRS Already Knows What Your Clients 'Should' Be Spending — And Flags the Outliers
- The Day Your Professional License Became an AI Question
- The IRS Is Now Using AI to Select Audits — What Every CPA Firm Owner Needs to Know This Tax Season
- QuickBooks Is Retiring Its Free Firm Tool — The Migration Checklist for Accounting Firms
- The Official Internal Controls Framework for AI Is Here — What COSO's New Guidance Means for CPA Firm Owners
- The Revenue Per Employee Number Every Accounting Firm Owner Should Know
- The 7 Accounting Tasks Most Likely to Be Automated By Year-End — and the 3 That Won't
- The EU AI Deadline Most US Firms Are Ignoring — And Why It Matters Even If You Have 3 European Clients
- PwC Expects End-to-End AI Audit Automation by Year-End — 3 Moves for Small CPA Firms