The AICPA Just Told Valuation CPAs How to Use AI — Here's the 4-Question Checklist
If your accounting firm does business valuations, estate and gift valuations, economic damages analyses, or any forensic work — and you've started using AI tools on those engagements — the AICPA has now established the standard you'll be measured against if something goes wrong.
The AICPA Forensic & Valuation Services (FVS) Executive Committee released its AI guidelines in October 2025. Most small CPA firms missed them. The AICPA's own publication is members-only. The trade press covered it briefly. No one wrote a practical guide for the 10-person regional accounting firm that does valuations alongside its tax practice.
That's what this is.
What the AICPA FVS AI Guidelines Actually Say
The full title: Guidelines for Responsible Use of Artificial Intelligence in Forensic and Valuation Services Engagements. Published by the FVS Executive Committee AI Workgroup, October 2025.
First, the technical framing: these are non-authoritative guidelines, not official professional standards. They don't carry the force of a Statement on Standards for Valuation Services. But that distinction matters less than it sounds.
Here's why: in a malpractice proceeding or professional dispute involving an AI-assisted valuation engagement, the question will be whether your firm used AI responsibly. The AICPA FVS guidelines are what "responsible" now means, at least as far as the AICPA is concerned. When the opposing counsel's expert witness is asked "what should the practitioner have done differently?", the answer will be benchmarked against these guidelines.
Non-authoritative guidance has precedent in professional liability. It's the same way the AICPA's ethics interpretations shape professional conduct without being promulgated as enforceable standards. The courts and arbitrators look at what the profession says responsible behavior looks like. The FVS AI guidelines are now that reference point for valuation and forensic work.
NACVA has published companion guidance — "Follow the Standards When Using AI and Similar Technologies" — that reaches the same conclusions from the valuation practitioner's side. If your FVS practice draws on both AICPA and NACVA credentials, both documents apply.
The guidelines address four areas: engagement terms alignment, confidentiality agreement compliance, non-delegable professional judgment, and output review protocols. These map directly to the four questions you need to answer.
The 4 Questions Every Valuation CPA Must Answer Before Using AI on an Engagement
Run this before you open a ChatGPT window, a Claude interface, or any AI-assisted research tool on a client engagement. All four need a "yes" before you proceed.
Question 1: Does your engagement letter address AI use?
If you drafted your standard engagement letter before 2024, it almost certainly doesn't. That means you're using AI on terms that were negotiated before AI existed as an option — the client agreed to a scope of work that neither of you contemplated would include AI tools.
This is a drafting gap, not a disclosure choice. The AICPA FVS guidelines require that AI use aligns with engagement terms. If your engagement terms don't mention AI, there's nothing to align with.
What the updated engagement letter needs to say:
- That AI tools may be used in the course of the engagement
- That the categories of tasks where AI is used (research and data gathering, document analysis, preparation of draft sections) are described — not necessarily specific tool names, which can change
- That all AI-assisted analysis is reviewed and verified by a credentialed professional before it is incorporated into final work product
- That the client's information will be handled in AI tools consistent with the firm's confidentiality obligations
Don't use untested language. Have your professional liability counsel review the specific clauses before you put them in your standard template. This is a one-time cost that protects every engagement from now on.
Question 2: Does your confidentiality agreement permit uploading client data to AI tools?
This is the one that catches firms off guard. Many client confidentiality agreements — especially those in business acquisition valuations and litigation support — contain data handling language that predates cloud software. They prohibit "sharing" or "transmitting" client information to third parties. AI tools are third parties under most of those definitions.
Before you upload a client's financial statements, deal documents, or deposition transcripts to an AI tool, you need to confirm:
- Your firm's engagement confidentiality agreement doesn't prohibit it
- The client's own data sharing policies don't prohibit it (common in regulated industries)
- The AI tool's data handling terms are consistent with what you've promised your client
The confidentiality review is particularly important for enterprise or regulated clients. A 15-person accounting firm doing estate valuations for high-net-worth families may have simpler agreements. A firm doing economic damages analyses for commercial litigation has more to check.
If you can't confirm your current confidentiality agreements permit AI tool use — stop. Don't upload client data until you've resolved it. A breach of a confidentiality agreement because an AI tool's data practices conflicted with client terms is a worse outcome than using AI more slowly.
Question 3: Are you maintaining non-delegable professional judgment?
The AICPA is explicit on this one. The guidelines state that the practitioner's professional judgment is non-delegable — meaning you cannot outsource the professional call to the AI model, even a very capable one.
What this means in practice:
AI can draft the narrative for a section of a business valuation report. You cannot submit that draft without applying independent professional judgment to its conclusions. That means reading it, evaluating whether the analysis is sound, verifying factual claims against the underlying data, and reaching your own professional conclusions — not just approving the AI's conclusions because they look reasonable.
AI can identify comparable transactions in a database search. You cannot accept the comparables selection without evaluating whether the AI's selection logic aligns with professional valuation standards for guideline company selection.
AI can summarize financial data from documents. You cannot accept that summary without verifying it against the source documents.
The failure mode the AICPA guidelines warn against: "AI-assisted" becoming "AI-decided." The liability exposure is not AI involvement — it's the practitioner failing to apply independent professional judgment to AI output. In a malpractice proceeding, "I relied on the AI's analysis" is not a defense. "I reviewed the AI's analysis, identified these issues, corrected these errors, and reached my own professional conclusion" is.
Document the review. If you can't show the human judgment step, you can't prove it happened.
Question 4: Do you have an output review protocol?
This is the process question. The engagement letter, the confidentiality review, the professional judgment requirement — all three depend on an operational protocol that documents how they work in practice.
An output review protocol for AI use in FVS work answers:
- What is the review step? For each category of AI use (research, document analysis, draft preparation), who reviews the output, what do they check, and what does "approved" mean?
- What is the documentation trail? How does the firm record that a licensed professional reviewed and assessed AI output before it was incorporated into client work product?
- What happens when AI output is wrong? A hallucinated comparable transaction, an incorrect financial figure, a misattributed court decision in a litigation support matter — what is the protocol when you catch an AI error?
- Which AI tools are approved? Not every AI tool is appropriate for client work. The firm should have a defined list of approved tools and a clear process for evaluating new tools before they're used on client data.
This protocol doesn't need to be elaborate. It needs to be documented and consistently followed. For a 10-person accounting firm with an FVS practice, it can be a two-page internal policy and a standard field in your file notes confirming AI-assisted sections were reviewed.
The AICPA guidelines don't prescribe a specific format. They establish the requirement. You design the protocol that works for your firm.
The Practical Liability Implication
Here are the stakes in plain terms.
Suppose you did a business acquisition valuation for a client. You used AI to help analyze the target company's financial statements, identify comparable transactions, and draft sections of the report. The deal closed. Two years later, the client sues — they allege the valuation was wrong and they overpaid.
In that dispute, the question is not whether AI was used. It's whether you used it responsibly. The opposing expert will look at:
- Whether your engagement letter addressed AI use (it didn't, if you haven't updated it)
- Whether your confidentiality agreement permitted uploading the target's financials to an AI system (unclear, if you haven't checked)
- Whether there is any documentation that a licensed professional reviewed and independently assessed the AI-assisted analysis (there isn't, if you don't have a protocol)
The AICPA FVS guidelines become the standard against which each of those questions is evaluated. Three "no" answers form a documented pattern of non-compliance with the profession's own AI responsibility guidelines. That's the foundation for a liability finding.
This is true for estate and gift valuations. Economic damages analyses. Forensic investigations. Any FVS engagement where AI was used without the four-question checklist having been run first.
The guidelines are less than a year old. The practitioners who read them now and update their practices accordingly are ahead of the wave. The practitioners who encounter them for the first time in a deposition are not.
The 2-Step Action for Firms With FVS Practice Areas
Don't try to do everything at once. Two steps, in order.
Step 1: Update your standard engagement letter with an AI use clause.
This is the highest-leverage single action. Your engagement letter is the document you control. It sets the terms. An AI clause in your standard engagement letter means every new engagement starts from a defensible position.
Draft the clause. Have your professional liability counsel review it. Implement it as your standard template. Then run a quick process check: is the AI clause included in every new engagement letter that goes out? Assign someone to verify it.
For existing long-term engagements — clients you work with on an ongoing basis — consider whether the terms of the relationship should be updated. A brief addendum or amendment to address AI use may be appropriate. Ask your attorney.
Step 2: Audit your current confidentiality agreements against your AI tool stack.
Pull the active confidentiality agreements for your current FVS clients — or your firm's standard confidentiality template if that's what governs. Compare the data handling language against the privacy policies and terms of service for the AI tools your firm uses.
Specifically look for: language that restricts third-party data sharing, language that addresses cloud or SaaS software (most confidentiality agreements now address this), and any client-specific data handling requirements that might conflict with AI tool terms.
If you find a conflict — a confidentiality agreement that restricts data sharing to an extent that would prohibit AI tool use — flag it before you use AI on that engagement. Not after. The time to resolve it is when you can have a conversation with the client about it, not after you've already uploaded their documents.
This audit takes a couple of hours. Do it before the next FVS engagement.
The One Thing to Do This Week
Update your engagement letter template. That's it.
Open your current standard engagement letter. Add four sentences addressing AI use — what may be used, in what categories of work, subject to what review, consistent with existing confidentiality obligations. Send it to your professional liability attorney with a specific request: "review this AI clause before I make it standard."
That one action starts the clock on being compliant with the AICPA FVS guidelines. Everything else — the confidentiality audit, the output review protocol, the approved tool list — flows from there.
The AICPA told you what the standard is. The firms that act on it now are building a defensible position. The firms that don't are building exposure.
Related Reading
- 88% of Accountants Think AI Is the Most Important Technology in History. Only 8% Are Ready for It. — The AICPA/CIMA readiness gap and how to close it
- AI Disclosure in Engagement Letters: What Every Professional Services Firm Needs to Know — The full framework for updating your engagement letter for AI
- The One-Page AI Use Policy for CPA Firms — A practical starting point for your firm's AI governance
The AICPA FVS guidelines are available to AICPA FVS section members at aicpa-cima.com. NACVA's companion guidance is available to NACVA members. If you're not a member of either organization, the BVResources BVWire summary of the AICPA guidelines is publicly available.
Frequently Asked Questions
What are the AICPA FVS AI guidelines?
The AICPA Forensic & Valuation Services Executive Committee released 'Guidelines for Responsible Use of Artificial Intelligence in Forensic and Valuation Services Engagements' in October 2025. The guidelines are non-authoritative — they are not official standards — but they establish the AICPA's position on responsible AI use in FVS work and are the framework against which a firm's AI use will be evaluated in any dispute or malpractice proceeding. They address engagement terms alignment, confidentiality agreements, non-delegable professional judgment, and output review protocols.
Do I need to disclose AI use in a CPA valuation engagement?
The AICPA FVS guidelines don't mandate universal disclosure, but they require CPAs to verify that AI use aligns with existing engagement terms and confidentiality agreements. If your engagement letter doesn't address AI, you're operating on terms that were never negotiated to include it. The practical recommendation: update your engagement letter to include an AI disclosure and scope clause before using AI tools on client engagements. NACVA's companion guidance takes a similar position.
What is the professional liability risk of using AI in forensic accounting?
In a dispute or malpractice proceeding involving an AI-assisted forensic or valuation engagement, the AICPA FVS guidelines establish the baseline standard of professional conduct. Using AI tools without verifying their alignment with engagement terms, without appropriate confidentiality review, or without documented human professional judgment applied to AI output creates the conditions for a liability finding. The risk is not that AI was used — it's that it was used in a way that the professional didn't control or document.
What should I add to my engagement letter for AI use in valuations?
At minimum, your updated engagement letter should: (1) disclose that AI tools may be used in the engagement, (2) describe the category of tools (e.g., research, document analysis, draft preparation) without necessarily naming specific tools, (3) confirm that AI output will be reviewed and verified by a credentialed professional, and (4) address how client data is handled in AI tools relative to confidentiality obligations. An attorney familiar with your professional liability exposure should review the specific language before you use it.
Does NACVA have AI guidelines for valuation professionals?
Yes. NACVA released 'Follow the Standards When Using AI and Similar Technologies' as companion guidance to the AICPA FVS document. Both documents take the same position: professional standards don't change when AI is involved. The human professional remains responsible for all conclusions, AI output requires independent review, and clients' confidential data must be handled consistently with existing confidentiality obligations. If you use both AICPA and NACVA standards in your FVS practice, you should read both documents.