Before You Use AI on a Client Matter, Check This in Your Malpractice Policy
Published March 14, 2026 · By The Crossing Report
Here's the scenario no one is talking about in the AI newsletters: your firm uses an AI tool on a client matter. The AI produces an error — a hallucinated citation, a miscalculated figure, a contract clause that doesn't mean what you think it means. That error reaches the client. The client files a malpractice claim.
Your coverage may not apply.
The ABA Journal raised this in 2025: professional liability policies increasingly contain AI exclusions, often embedded in standard policy language without explicit notification to policyholders at renewal. JDSupra confirmed in March 2026 that the trend is accelerating — insurers are adding AI exclusion clauses as standard language in professional services E&O renewals.
Most firm owners who are actively using AI tools on client work don't know this gap exists. They will find out when they need to file a claim.
Summary
Many professional liability and malpractice insurance policies now contain explicit AI exclusions — clauses that remove coverage for claims arising from AI-assisted work or AI-generated outputs. The exclusions are appearing in standard E&O renewals, often without explicit notification. Law and accounting firms using AI tools on client matters face potential coverage gaps that existing policies won't address. The three-question conversation with your broker — ideally before your next renewal — is the most time-sensitive action in this piece.
Where the Exposure Is Highest
Three categories of AI error carry the highest professional liability exposure for law and accounting firms:
Citation hallucinations. Federal and state appellate courts have both now sanctioned attorneys for AI-generated citations that don't exist. The 4th Circuit issued a public reprimand for unverified AI filings in March 2026. A California state court issued its first appellate-level AI sanctions opinion the same month. The malpractice exposure is direct: a brief citing a non-existent case, filed on behalf of a client, by an attorney who didn't verify it.
Tax calculation errors. AI-assisted tax preparation tools can produce incorrect figures, particularly in edge cases involving unusual deductions, estate situations, or multi-state filings. If a return your firm signs contains an AI-generated error that leads to client penalties, the professional liability exposure is clear. The question is whether "I used an AI tool that produced the wrong number" is a defense your current policy covers — or whether it's precisely the scenario the AI exclusion clause was written to exclude.
Contractual drafting errors. AI drafting assistants can produce clause language that contains substantive legal errors: incorrect representations, missing carveouts, unenforceable provisions. If a client suffers financial harm because of AI-generated contract language your firm delivered, the malpractice analysis starts with your current policy language.
The Market Is Recognizing the Gap
Munich Re now offers an explicit AI coverage rider for law firms — one of the first major insurers to create a product that fills the AI exclusion gap in standard professional liability policies.
That product exists because Munich Re identified a market need. The need is the gap.
When a major global reinsurer develops a new product to address a specific exposure, it's not a coincidence — it's a market signal that the exposure is real, that standard policies don't adequately cover it, and that clients are starting to ask about it.
Your current broker may not have brought this to you. That's not unusual — brokers typically review your coverage at renewal, not between renewals when your underlying risk profile changes. Your AI tool adoption changed your risk profile. The conversation needs to happen proactively.
Three Questions to Ask Your Broker Before Your Next Renewal
You don't need to understand insurance law to have this conversation. You need three questions:
1. Does my current policy explicitly exclude AI-assisted work or AI-generated outputs?
Ask your broker to pull the exclusion section of your policy and read through it with this question specifically in mind. AI exclusion language typically appears under technology errors, automation exclusions, or product exclusions — not under a headline that says "AI exclusion." It may take a few minutes to identify. This is the question to start with.
2. If an AI tool I use produces an error that reaches a client, is the resulting malpractice claim covered?
This is a hypothetical scenario question. Your broker should be able to answer it by reference to your specific policy language. If the answer is "I'm not sure," that's your answer.
3. Is there a rider or endorsement that extends coverage to AI-assisted professional work?
If your policy contains exclusions, this is the follow-up. Munich Re's product for law firms is one option. Your broker should be able to identify what's available in the professional liability market for your specific practice type and size.
Don't wait until renewal to have this conversation. If you're using AI tools on client work today — and most firms are — the gap is already present. The conversation should happen before it matters.
The Internal Practice Argument
Fixing the coverage gap is one move. Building the internal practice that protects you regardless of coverage is the other.
For both law firms and accounting firms, a minimum viable AI oversight policy has three components:
Verification requirement. Every AI-generated citation, calculation, or factual claim verified against a primary source before it reaches a client or a court. This is a practice requirement, not just a coverage requirement. Courts have been explicit: attorney oversight of AI output is a professional responsibility obligation, not just best practice.
Review requirement. No AI-assisted deliverable reaches a client without attorney or CPA sign-off. This is the human review step that transforms "AI did this" into "professional delivered this." It's also the most credible argument against a malpractice claim if something does go wrong — you can show the review occurred.
Documentation. A record of which AI tools were used on which matters, for what tasks, and who reviewed the output. For law firms, this is also increasingly relevant for court disclosure requirements — federal courts and several state bars now require disclosure of AI use in filed documents. For accounting firms, it's the audit trail that demonstrates professional oversight of AI-assisted work.
These three practices don't eliminate the coverage gap. But they substantially change the posture of any claim that does arise — from "the AI made a mistake and I didn't catch it" to "I exercised professional oversight, documented it, and reviewed every deliverable before delivery."
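The documentation component doesn't require special software. As an illustration of how little structure is needed, here is a minimal sketch of an AI-use log kept as a CSV file; all field names, the tool name, and the example matter number are hypothetical, not a prescribed format:

```python
import csv
from dataclasses import dataclass, asdict, fields
from datetime import date
from pathlib import Path

@dataclass
class AIUseRecord:
    """One row in the firm's AI-use log (fields are illustrative)."""
    matter_id: str    # internal matter or engagement number
    tool: str         # which AI tool was used
    task: str         # what the tool was asked to do
    reviewer: str     # attorney or CPA who reviewed the output
    review_date: str  # ISO date the review was completed
    verified: bool    # citations/figures checked against primary sources?

def log_ai_use(record: AIUseRecord, log_path: Path) -> None:
    """Append one record to the CSV log, writing a header if the file is new."""
    is_new = not log_path.exists()
    with log_path.open("a", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=[fld.name for fld in fields(AIUseRecord)]
        )
        if is_new:
            writer.writeheader()
        writer.writerow(asdict(record))

# Example entry (all values hypothetical)
log_ai_use(
    AIUseRecord(
        matter_id="2026-0142",
        tool="drafting-assistant",
        task="first draft of indemnification clause",
        reviewer="J. Alvarez",
        review_date=date(2026, 3, 10).isoformat(),
        verified=True,
    ),
    Path("ai_use_log.csv"),
)
```

A spreadsheet with the same six columns serves the same purpose; what matters for a later claim is that each entry names the matter, the tool, the task, and the professional who reviewed the output before delivery.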
The coverage question and the internal practice question are both worth answering. The coverage conversation with your broker takes a phone call. The internal practice takes an afternoon to document. Both should happen this month.
Related: AI Liability Is Now an Insurance Question — Here's What Your Carrier Is About to Start Asking | Your Client Used AI to Prepare for Your Meeting — And Now It's Not Privileged | Your Firm Is in the 83%. Here's the Governance Framework to Get Into the 25%.
Sources: ABA Journal, "Does Your Professional Liability Insurance Cover AI Mistakes? Don't Be So Sure" (2025); JDSupra, "AI Update: The Growing Trend of AI-Related Insurance Policy Exclusions" (March 2026); Wiley Law, "2026 State AI Bills That Could Expand Liability and Insurance Risk" (2026).
Frequently Asked Questions
Do malpractice insurance policies cover AI-related errors?
Many do not. The ABA Journal flagged in 2025 that professional liability policies increasingly contain broad AI exclusions, often embedded in standard policy language without explicit notification to policyholders at renewal. JDSupra reported in March 2026 that the trend is accelerating: insurers are adding AI exclusion language as a standard clause in professional services E&O renewals. The specific exposure: if an AI tool produces a hallucinated case citation, incorrect tax calculation, or flawed contract clause — and that error reaches a client and leads to a malpractice claim — the resulting claim may fall outside your current coverage.
What AI errors are most likely to trigger a malpractice claim?
Three categories of AI error carry the highest professional liability exposure in law and accounting firms: (1) Citation hallucinations: AI-generated case citations that don't exist, which attorneys cite in court filings without verifying. Federal and state appellate courts sanctioned attorneys for this in 2025 and 2026. (2) Tax calculation errors: AI-assisted preparation producing incorrect figures on returns that the firm signs. If the client faces penalties, the malpractice exposure is direct. (3) Contractual drafting errors: AI-generated clause language containing substantive legal errors that affect client rights or obligations. Any of these can produce a client harm that triggers a malpractice claim. The question is whether your current policy covers it, and many don't.
What AI insurance coverage should professional services firms ask their broker about?
Ask your broker three specific questions at your next renewal: (1) Does my current policy explicitly exclude AI-assisted work or AI-generated outputs? (2) If an AI tool I use produces an error that reaches a client, is the resulting malpractice claim covered? (3) Is there a rider or endorsement that extends coverage to AI-assisted professional work? Munich Re now offers an explicit AI coverage rider for law firms — one of the first insurers to have done so. That product's existence is a market signal: carriers are recognizing a gap that standard policies don't address. Your broker should be able to identify equivalent options in the professional liability market for your practice type.
What should a law firm's AI liability policy include?
For a law firm, a minimum viable AI liability policy has three components: (1) A citation verification requirement — every AI-generated case reference verified against a primary legal database (Westlaw, LexisNexis) before filing. Courts at the federal appellate and state appellate level are now sanctioning attorneys who don't do this. (2) A human review requirement for all client-facing AI-assisted work — no AI-generated deliverable reaches a client or a court without attorney review and sign-off. This is both the ethical requirement under ABA Formal Opinion 512 and the best argument against a malpractice claim if something goes wrong. (3) An internal logging record — documenting which AI tools were used on which matters, for what tasks, and who reviewed the output. If a malpractice claim is ever filed, this documentation is what shows the firm exercised appropriate professional oversight.
Does this apply to accounting firms as well as law firms?
Yes. Professional liability insurance for accounting firms carries the same AI exclusion risk. If your firm uses AI-assisted tax preparation tools, client advisory AI, or AI for audit support — and an error in that AI output leads to a client filing penalty, financial loss, or audit failure — your current professional liability policy may not cover the resulting claim. The same three questions to ask your broker apply. The internal documentation requirement also applies: for an accounting firm, that means logging which AI tools were used on which client engagements, and documenting the CPA's review of AI-assisted work before client delivery.