Your Firm's AI Conversations Aren't Private — A Federal Court Just Clarified Why
Published March 16, 2026 · By The Crossing Report
A tax preparer at a small CPA firm runs a client's Schedule K-1 through personal Claude.ai to check a pass-through deduction calculation. It saves 20 minutes. She closes the window. Nobody logs it, nobody thinks about it again. Then the client gets selected for an IRS examination. Opposing counsel files a discovery request. The question arrives: what tools did your firm use while preparing this return?
Most firm owners have not thought through the answer to that question. The Heppner ruling — issued February 17, 2026 by a federal judge in the Southern District of New York — gives them a reason to start.
What the Ruling Actually Said
United States v. Heppner (SDNY, 25-cr-00503-JSR) involved a defendant who used Anthropic's consumer Claude.ai to create documents he intended for discussions with his attorneys. When prosecutors sought those documents, Heppner argued attorney-client privilege. Judge Jed Rakoff rejected that argument.
The court's reasoning turned on two points:
First, confidentiality. Attorney-client privilege requires that the communication be confidential. Using a consumer AI platform means voluntarily sharing information with a third party — Anthropic, in this case. Consumer terms of service allow the platform to process your inputs. That act of sharing with a third party breaks the confidentiality requirement. No confidentiality, no privilege.
Second, attorney direction. Work product protection requires that materials be prepared by or at the direction of an attorney, in anticipation of litigation. Heppner prepared the documents himself, without attorney oversight, using a public platform. The court found no work product protection applied.
The ruling's legal reasoning has implications that run well beyond the specific defendant. If sharing information with a consumer AI platform breaks confidentiality, the same logic applies to anyone who uses a consumer AI platform to process client information — including attorneys and accountants doing the client work themselves.
Why Morgan Lewis Extended This to Your Accounting Firm
In March 2026, Morgan Lewis published an analysis titled What Heppner Means for Tax Departments. The extension was straightforward: the confidentiality logic does not stop at the attorney's desk.
A CPA preparing a client's return, a consultant building a regulatory response, a tax attorney drafting an amended filing — if any of those professionals process client-related information through a personal AI account, they have voluntarily shared that information with a third party. The confidentiality that protects their work process from discovery is compromised the same way Heppner's was.
For accounting firms specifically, the exposure surfaces in two contexts:
Tax examination and appeal. When a return is examined by the IRS, the firm's work files and communications can be subject to discovery. Any AI-generated materials created through personal accounts could be producible.
Litigation support and advisory work. Accounting firms that provide litigation support, expert witness services, or advisory work with regulatory investigation exposure face the same work product questions that law firms do. The Morgan Lewis analysis treats these as equivalent.
The practical summary: if your firm does anything for a client that could end up in a dispute, your tool stack matters.
The Enterprise vs. Personal AI Distinction
This is the decision point every firm owner needs to understand.
Personal/consumer AI accounts are not appropriate for client work. This includes:
- ChatGPT Free and ChatGPT Plus (personal accounts)
- Claude.ai personal accounts (free and Pro tiers at claude.ai)
- Google Gemini consumer tier
- Any AI tool your staff found and started using without a firm subscription and data processing agreement
Enterprise AI accounts are the correct tool for client work. These include:
- Microsoft Copilot (add-on to any Microsoft 365 commercial subscription — $21/user/month)
- Claude for Enterprise (Anthropic's enterprise tier with a DPA)
- ChatGPT Enterprise or ChatGPT Team
- Purpose-built legal and accounting tools: Clio Draft, Harvey, Spellbook, Thomson Reuters CoCounsel, Black Ore Tax Autopilot, Accrual
The critical difference is the data processing agreement (DPA). Enterprise tiers have contractual obligations governing how your data is handled and confirming it is not used for training the model. Consumer tiers do not. That contractual layer is what separates "third party who saw your client's information" from "vendor bound by confidentiality obligations."
If you already pay for Microsoft 365 Business, as most small professional services firms do, Microsoft Copilot is available as a $21/user/month add-on. The enterprise data protection is the reason to pay for Copilot rather than using free ChatGPT.
Three Questions to Ask About Every AI Tool in Your Firm
Before your next client engagement, work through this audit for every AI tool your team uses:
1. Who owns the subscription? If staff members are using personal accounts — ChatGPT Plus on their personal credit card, Claude.ai on their personal email — those tools are not covered by any firm-level data processing agreement. The firm has no protection.
2. Does the account have an enterprise or business data processing agreement? Log into the tool's account settings. Look for references to "enterprise," "business plan," "data processing agreement," or "BAA" (business associate agreement for HIPAA contexts). If you can't find it, assume it doesn't exist.
3. Does your engagement letter address AI use? Your engagement letter establishes the legal framework for the client relationship. Post-Heppner, it should address three things: what AI tools your firm uses and how they are governed; that client-created AI documents related to the matter should be discussed with you before being created; and that AI tools your firm deploys under professional supervision are governed by your firm's data practices. One paragraph added to your existing letter gets this done.
The Action You Can Take This Week
If your firm is on Microsoft 365, the path is clear: add Microsoft Copilot to your subscription. You get a data-processing-agreement-protected AI environment inside the tools your team already uses — Word, Outlook, Excel, Teams — for $21/user/month. That is the lowest-friction path to an enterprise-grade AI environment at a firm with 5 to 20 employees.
If you're on Google Workspace, the equivalent is Gemini for Google Workspace, which is included in Business Standard plans and above.
Neither requires an IT project. Neither requires new software training. Both provide the enterprise data protection that personal accounts do not.
The deeper action — auditing what your staff is currently using for client work — takes a conversation, not a software purchase. Ask your team: what AI tools are you using, and are you using personal accounts or firm accounts? The answer will tell you where your exposure is.
A federal court has now established the framework. The question is not whether this applies to professional services firms. Morgan Lewis has answered that question. The question is whether your firm has the right tools in place before it matters.
The Heppner ruling established one end of the risk spectrum. The March 13 Crossing Report special edition covers the other end: what happens when your clients use AI to prepare for meetings with you. Both point in the same direction: the firms that govern AI use carefully are the ones that avoid the exposure.
Related Reading
- AI & Attorney-Client Privilege: The Heppner Ruling — What the ruling means for work product protection, AI tool selection, and client disclosure
The Crossing Report covers AI adoption for professional services firm owners every Monday. Subscribe at crossing.one.
Frequently Asked Questions
Does using ChatGPT for client work break attorney-client privilege?
It depends on which ChatGPT. Personal ChatGPT accounts (ChatGPT Free, ChatGPT Plus) are consumer platforms — your inputs may be used to train OpenAI's models and are not subject to a data processing agreement. Courts applying the Heppner reasoning would treat this as voluntarily sharing information with a third party, which breaks the confidentiality required for attorney-client privilege and work product protection. ChatGPT Enterprise and ChatGPT Team accounts are different — they include data processing agreements and OpenAI does not use those inputs for training. Enterprise accounts are the safe option for client work.
What is US v. Heppner and why does it matter for law and accounting firms?
United States v. Heppner (SDNY, February 17, 2026) is the first federal ruling to establish that documents created using a public AI platform are not protected by attorney-client privilege. The case involved a defendant who used Anthropic's consumer Claude.ai to create documents for discussions with his attorneys. The court ruled that using a consumer AI platform voluntarily shares information with a third party, which destroys the confidentiality that privilege requires. Morgan Lewis's March 2026 analysis extended this reasoning to accounting and tax firms: client financial information processed through personal AI accounts faces the same exposure, particularly for matters with litigation or regulatory investigation exposure.
What AI tools are safe to use for client work at a law or accounting firm?
The determining factor is whether your AI tool has a data processing agreement (DPA) that governs how your data is handled and confirms your data is not used for model training. Enterprise-tier tools that meet this standard include: Microsoft Copilot (M365 commercial subscriptions), Claude for Enterprise (Anthropic's enterprise tier), ChatGPT Enterprise, and purpose-built legal and accounting AI tools like Clio Draft, Harvey, Spellbook, Thomson Reuters CoCounsel, and Black Ore Tax Autopilot. Free personal accounts — ChatGPT Free, ChatGPT Plus, Claude.ai personal accounts, Google Gemini consumer tier — do not have enterprise data protections and should not be used for client-related work.
Can a small accounting firm afford enterprise AI tools?
Yes. Microsoft Copilot is available as an add-on to Microsoft 365 commercial plans at $21/user/month — and most small accounting and law firms already pay for Microsoft 365. Adding Copilot gives you a data-processing-agreement-protected AI environment for the tools your team already uses: Word, Outlook, Excel, and Teams. For firms on Google Workspace, the equivalent is Google Workspace Business with Gemini for Workspace — which includes enterprise data protection. The cost of not upgrading is a potential confidentiality breach; the cost of upgrading is $21/month per user.
Does the Heppner ruling affect accounting firms, not just law firms?
Yes. Morgan Lewis's March 2026 analysis ('What Heppner Means for Tax Departments') explicitly extended the reasoning to accounting and tax firms. The mechanism is the same: if a tax preparer or CPA uses a personal AI account to process client financial data, and that client later becomes involved in an IRS examination, regulatory investigation, or litigation, those AI conversations could be discoverable. The confidentiality protections that normally shield a firm's work process from opposing parties require that the information not be voluntarily shared with a third party — and a personal AI account is a third party.