3 in 5 Professional Services Firms Have Abandoned an AI Project — Here's What the Ones That Didn't Do Differently

March 31, 2026 · 6 min read · By The Crossing Report

You tried it. Someone on your team used it for a few weeks. Then it quietly stopped being part of how your firm works. If this sounds familiar, you're in the majority.

A General Assembly survey of professional services firms found that 61% have abandoned at least one AI initiative in the past year due to a lack of internal skills. Not a lack of budget. Not a bad product choice. A skills gap inside the firm.

35% abandoned multiple projects.


This isn't a failure story about a few early adopters who moved too fast. This is the majority experience of professional services firm owners in 2026. Most firms that have tried to implement AI have failed at it at least once. The ones telling you AI is transforming their practice are the outliers, not the norm.

But there's a pattern in the data worth looking at closely: the firms that didn't abandon their AI projects did something structurally different from the ones that did. It's not about which tools they bought or how much they spent. It's about how they ran the rollout.

Why AI Projects Die in Professional Services Firms

The General Assembly survey identifies three leading causes, and they don't involve the AI tools themselves.

The skills gap is the real blocker. When a firm installs an AI tool and points staff toward it, the tool competes with existing workflows for time and attention — and it always loses. Staff who haven't been trained on AI tools don't trust the outputs. They check everything manually, which means the AI doesn't save time, it adds steps. After a few weeks of this, the tool gets classified as "extra work" and stops being used.

There's no single owner. The firms that abandon AI projects typically treat adoption as an IT initiative: buy the tool, install it, send an email, and expect usage to follow. The firms that stick with AI assign one person — not a committee, one person — who becomes the internal expert on a specific workflow. That person builds fluency, fixes the prompting, trains others, and owns the result. Without ownership, AI tools drift.

There's no success metric. "We're going to try AI" is not a measurable goal. "We're going to reduce first-draft time for engagement letters from 45 minutes to 15 minutes within 60 days" is. Firms that can't answer "how will we know if this worked?" at the start of an AI initiative almost always abandon it when usage falls off — because there's no feedback loop to tell them whether the dip in usage means the experiment failed or just needs more time.

The Wedge Workflow: How the Firms That Stick Do It

The firms with the lowest AI abandonment rates share a pattern the data describes, even if none of them name it. I'll call it the wedge workflow.

One workflow. One person. 60 days. One metric.

Pick a single, specific workflow that happens regularly in your firm. Not "all client communication." Not "research." Something concrete: first-draft engagement letters. Call summary notes. Invoice dispute responses. The narrower, the better.

Assign one person to own it. This isn't their full-time job — it's an addition to their existing work. But they're the one who learns the tool, builds the prompts, iterates on the outputs, and becomes the internal expert. Not a vendor rep. Not an IT person. Someone who already does this work.

Run it for 60 days without expanding. The temptation, once it's working, is to immediately expand to five other workflows. Resist it. 60 days gives you time to refine the approach, build staff confidence, and gather real data on what's working.

Measure one thing. Define your success metric at the start. Time per task. Error rate. Volume handled per staff member. Pick one thing that will tell you whether this is working, and track it.

At day 60, you'll have either a proof of concept worth expanding or a clear signal to try a different workflow. Neither outcome is failure. The failure is running AI as a vague experiment with no endpoint.
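The day-60 decision comes down to simple arithmetic: did the measured result hit the target you set on day one? As an illustration only, here is a minimal sketch of that evaluation, using the engagement-letter numbers from earlier. The function name, sample times, and thresholds are hypothetical, not from the survey.

```python
def evaluate_pilot(baseline_minutes: float, pilot_minutes: list[float],
                   target_minutes: float) -> str:
    """Decide whether a 60-day wedge-workflow pilot is worth expanding.

    baseline_minutes: average time per task before the pilot
    pilot_minutes:    per-task times logged during the pilot
    target_minutes:   the success threshold defined on day one
    """
    avg = sum(pilot_minutes) / len(pilot_minutes)
    reduction = (baseline_minutes - avg) / baseline_minutes
    if avg <= target_minutes:
        return f"expand: avg {avg:.0f} min ({reduction:.0%} reduction) meets target"
    return f"try another workflow: avg {avg:.0f} min misses the {target_minutes:.0f}-min target"

# Hypothetical example: engagement letters, baseline 45 min, target 15 min
print(evaluate_pilot(45, [20, 14, 16, 12, 13], 15))
# → expand: avg 15 min (67% reduction) meets target
```

The point of the sketch is the structure, not the code: a numeric baseline, a numeric target, and a binary decision at day 60. Either outcome is actionable.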

The Three Workflows With the Highest Stick Rate

Across professional services firms — accounting, law, consulting, staffing — certain workflows consistently show up as high-stick AI adoptions. These are the best starting points for a wedge workflow approach.

Standard document first drafts. Engagement letters, standard contract clauses, proposal templates, client intake questionnaires. These are high-volume, formulaic documents where staff already knows what good looks like. AI output is immediately reviewable. Time savings are visible in the first week. The review step is built into the workflow, so AI errors get caught without changing existing QA procedures.

Meeting and call summaries. Client meeting notes, internal debrief summaries, call action item lists. Staff records the meeting (with client consent), AI produces the summary and action items, staff reviews and sends. Time per meeting drops significantly. The output is easy to verify. And the habit of recording and reviewing is easy to build.

Client communication drafts. First drafts of routine client emails — status updates, document request follow-ups, appointment reminders, billing inquiries. Staff reviews and edits before sending. The AI handles the typing; the professional handles the judgment. For a firm sending 50+ client emails per week, this compounds quickly.
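The compounding claim is easy to put numbers on. A back-of-the-envelope sketch, where the per-email time saving is an assumed figure for illustration and not survey data:

```python
# Back-of-the-envelope estimate of annual time saved on client email drafts.
# The 6-minutes-saved-per-email figure is an assumption, not survey data.
emails_per_week = 50
minutes_saved_per_email = 6   # assumed: draft time drops from ~8 min to ~2 min
working_weeks_per_year = 48

hours_saved_per_year = (emails_per_week * minutes_saved_per_email
                        * working_weeks_per_year) / 60
print(f"{hours_saved_per_year:.0f} hours/year")
# → 240 hours/year under these assumptions
```

Even at half the assumed per-email saving, that is weeks of staff time per year from one narrow workflow.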

These three workflow categories appear repeatedly in the adoption data because they share a structural property: the human review step is unavoidable, which means staff can't skip the quality check, which means they can't get burned by an error they didn't catch. That safety net makes the tool easier to trust.

What This Means If You've Already Failed Once

If you've tried an AI tool and it didn't stick, you're not behind. You're in the majority.

The firms that are successfully using AI right now didn't buy a tool and have it work out of the box. They failed at a broad rollout, then narrowed to a single workflow. They assigned one person to own it instead of asking everyone to figure it out. They measured something specific instead of waiting to see if people felt better about AI.

You're not missing a technology insight. You're missing a rollout structure.

The skills gap the General Assembly survey identified isn't a gap in AI knowledge. It's a gap in approach: firms treat AI as a software purchase when it is an organizational change. The tools are good enough. The workflows are ready. The firm needs a different implementation model.

Pick one workflow. Assign one person. Run it for 60 days. Measure one thing.

That's the whole framework. Most of the firms that abandoned their AI projects never got this specific.


The Crossing Report covers AI adoption for owners of professional services firms. Subscribe to get the weekly intelligence briefing — concrete, specific, and built for firms with 5 to 50 people.

Frequently Asked Questions

Why do AI projects fail in professional services firms?

According to a General Assembly survey of professional services firms, the top reason is an internal skills gap — not cost, not tooling, and not vendor support. 61% of professional services firms abandoned at least one AI initiative in the past year because staff didn't have the knowledge to use the tool effectively. When a firm buys an AI tool and expects staff to figure it out, the tool gets used for two weeks and then quietly abandoned. The firms that succeed treat AI adoption as a training initiative, not a software purchase.

How many professional services firms have given up on AI?

A General Assembly survey found that 61% of professional services firms have abandoned at least one AI initiative in the past year due to a lack of internal skills. More concerning: 35% abandoned multiple projects. This isn't a fringe experience — it's the majority outcome. Most small and mid-sized professional services firms that have tried to implement AI have failed at it at least once.

What is the wedge workflow approach to AI adoption?

The wedge workflow approach is how firms that stick with AI actually make it work: pick one specific workflow, assign one person to become the internal expert on it, run it consistently for 60 days, and measure a concrete result before expanding. The key is constraint — not 'let's try AI for everything' but 'let's own this one thing first.' Firms that spread AI adoption across the whole team at once rarely see it stick. Firms that go deep on one workflow first consistently do.

What's the difference between firms that make AI work and firms that abandon it?

The research points to three differences: (1) Scope — successful firms started with one specific workflow, not firm-wide adoption. (2) Ownership — they assigned one person to become the internal expert, not a vendor or IT department. (3) Measurement — they defined what success looked like before they started (time saved, error rate, output volume) rather than running it as a vague experiment. Firms that abandoned AI projects typically had the opposite: broad rollouts, no single owner, and no success metric except a vague sense of whether people were using it.

Which AI workflows have the lowest abandonment rate in professional services?

The workflows with the highest adoption persistence in professional services firms are the ones where the output is immediately verifiable and the time savings are obvious within a week. First-draft generation for standard documents (engagement letters, standard contract clauses, proposal templates) is consistently cited as a high-stick workflow — the staff member can review the output against something they already know, which makes errors easy to catch and the time savings immediate. Meeting summaries and client communication drafts are similarly high-stick because the review step is built in. The workflows that tend to fail are the ones where the output is hard to verify — analysis, research synthesis, or anything where the staff member doesn't already know what 'right' looks like.
