Two Federal Lawsuits Say Your AI Hiring Tool Makes You a Co-Defendant

April 27, 2026 · 5 min read · By The Crossing Report

If your staffing firm uses AI to screen candidates — and 61% of staffing firms now do — you need to understand what two federal lawsuits decided this spring. The legal theory they establish changes your liability profile regardless of which tool you use.

The defense you assumed you had — "the algorithm decided, not us" — failed in both cases.

The Eightfold AI Class Action

In January 2026, a class action was filed in federal court naming Eightfold AI as a defendant alongside the employers and staffing agencies that used its platform.

The allegations: Eightfold scraped personal data on over one billion workers, scored candidates on a zero-to-five proprietary scale without their consent, and allowed clients to discard low-ranked candidates before any human ever reviewed the application. The suit is brought under the Fair Credit Reporting Act (FCRA) and California's Investigative Consumer Reporting Agencies Act (ICRAA).

The detail that matters for every staffing firm owner: the employers and staffing agencies using Eightfold are named as co-defendants. Not just Eightfold. You.

The theory is straightforward: if you deployed a tool that processed candidate data and made screening decisions, and those decisions violated FCRA or anti-discrimination law, you are in the chain of liability regardless of whether you wrote the code. Eightfold made the scoring model. You used it to decide who got seen by a human and who didn't. (Source: Jones Walker LLP, April 2026)

The Workday Case: A Judge Already Agreed

Mobley v. Workday is further along. A federal judge refused to dismiss the case and granted preliminary collective certification for a class of job applicants whose applications were processed through Workday's AI screening tools.

The plaintiff's theory: Workday's AI screening system trained on historical hiring data that embedded age and disability discrimination. Workers filtered out by the AI at the top of the funnel never got to a human reviewer. The employer's conduct — using an AI tool that produced discriminatory outcomes — constitutes age discrimination under the ADEA.

The court is treating the employer's decision to deploy Workday as an employer decision, not a vendor decision. "We used the tool our HR platform offered" is not a defense when the tool produces discriminatory results at scale. (Source: HR Dive, March 2026; Fisher Phillips)

A final ruling is expected in 2026.

What Joint Liability Means for a Small Staffing Firm

Both cases establish the same principle: when you deploy an AI tool in your hiring pipeline, you assume co-responsibility for its outputs.

This is not a BigLaw or Fortune 500 problem. The Eightfold lawsuit specifically names staffing agencies — not just employers. Staffing firms are in the employment-decision chain by definition. You screen, rank, and recommend candidates for placement. If an AI tool does any part of that work, you have exposure.

The practical question is not whether AI discrimination liability exists — it does. The question is whether your firm has taken the steps that would allow you to defend against a claim.

Most small staffing firms have not. They adopted an AI-assisted ATS or sourcing tool, assumed the vendor's terms covered liability, and moved on. The Eightfold case says that assumption is wrong.

The Five-Step Audit

Run this before a complaint is filed, not after.

Step 1: List every AI-assisted tool in your candidate pipeline. This includes ATS features you may not have consciously enabled — resume scoring, candidate ranking, AI-generated match scores, automated disqualification triggers, chatbot screening, AI-assisted scheduling. Most platforms now bundle AI features that firms turned on without thinking about liability implications. Know what's running.

Step 2: Check your vendor agreements for indemnification. Does your ATS or sourcing platform vendor accept liability if their AI tool produces a discriminatory outcome? Read the terms. Most do not indemnify you. Some explicitly disclaim liability for AI-generated outputs. If you're sharing a courtroom with your vendor, the terms of your service agreement are the first document produced in discovery.

Step 3: Build candidate disclosure language. Illinois HB 3773 (effective January 1, 2026) already requires written disclosure to applicants when AI influences any employment decision. Connecticut SB 5 will require it by October 1, 2026 if signed. But disclosure language also protects you in litigation — it demonstrates that candidates were informed. Add a short disclosure to your intake documentation covering three points: which AI tools may be used, that AI supports but does not replace human judgment, and how candidates can ask questions.

Step 4: Document human review touchpoints. The Eightfold lawsuit specifically targets automated discarding — candidates eliminated by AI before a human reviewed the application. Your best defense is a documented record that a human reviewed candidates before any discard decision was made. If your ATS allows AI to filter candidates to a "hidden" queue that recruiters never see, that is your highest-exposure workflow. Either add a human review step, or understand that the current flow creates Eightfold-like liability.

Step 5: Consult employment counsel on your specific stack. This is a $500–$2,000 conversation with an employment attorney, not a $200,000 class action. Tell them which tools you use, walk them through your candidate pipeline, and ask two questions: (1) Do we have any FCRA obligations related to our AI vendor's data processing? (2) What's our exposure if this tool produces a discriminatory outcome? Get the answer documented. That documentation is also useful if a complaint ever arrives.

The Regulatory Stack Context

These lawsuits don't exist in isolation. Illinois HB 3773 has been in effect since January 1, 2026. Texas TRAIGA requires vendor documentation for AI employment tools. Connecticut SB 5 passed the Senate 32-4 in April 2026 and is one House vote from becoming law, with an October 1, 2026 compliance deadline.

The regulatory and litigation environments are moving in the same direction: staffing firms that use AI in their candidate pipeline have disclosure, documentation, and oversight obligations — and now co-defendant exposure when those obligations aren't met.

The difference between a firm that gets named in a complaint and a firm that doesn't is rarely the AI tool they used. It's whether they built the disclosure language, the vendor audit, and the human review step before the complaint was filed.

The Eightfold class action is in early stages. Workday's ruling is advancing. You have time to run the five steps above. You do not have unlimited time.


Related: Connecticut SB 5: AI Employment Law Compliance for Staffing Firms | AI Hiring Tools and FCRA Compliance for Staffing Firms

Frequently Asked Questions

Can a staffing firm be held liable for discrimination by an AI hiring tool it didn't build?

Yes. Two federal lawsuits advancing in 2026 establish that employers and staffing agencies using AI hiring tools can be held jointly liable alongside the AI vendor for discriminatory outcomes. In the Eightfold AI class action (filed January 2026), the employers and agencies using Eightfold's platform are named as co-defendants alongside Eightfold itself. In Mobley v. Workday, a federal judge allowed age discrimination claims to proceed and granted preliminary collective certification for a class of applicants screened by Workday's AI. The "algorithm decided, not us" defense has failed in both cases.

What did Eightfold AI do that triggered the class action lawsuit?

The Eightfold AI class action (filed January 2026, federal court) alleges that Eightfold scraped personal data on over one billion workers, scored candidates on a proprietary zero-to-five scale without their consent, and allowed employers to discard low-ranked candidates before any human reviewed the application. The suit is filed under the Fair Credit Reporting Act (FCRA) and California's Investigative Consumer Reporting Agencies Act (ICRAA). The employers and staffing agencies that used Eightfold's platform are named as co-defendants, not just Eightfold.

What should a staffing firm do immediately if it uses AI hiring or candidate screening tools?

Run a five-step audit: (1) List every AI-assisted tool used in your candidate pipeline — ATS scoring, AI sourcing, resume ranking, chatbot screening, automated scheduling. (2) Review each vendor's terms for indemnification language — does the vendor accept liability if their tool produces a discriminatory outcome? (3) Build candidate disclosure language explaining that AI tools may be used in your screening process. (4) Document human review touchpoints — can you show that a human evaluated the candidate before a discard decision was made? (5) Consult employment counsel on your specific tool stack and state exposure. This is a pre-lawsuit audit, not a post-complaint exercise.

What is the Workday Mobley case and what did the court decide?

Mobley v. Workday is a federal class action in which a plaintiff alleges Workday's AI screening tools systematically disadvantaged older workers and workers with disabilities by training on biased historical hiring data. In early 2026, a federal judge refused to dismiss the ADEA (Age Discrimination in Employment Act) claims and granted preliminary collective certification for a class of workers screened by Workday AI. The case is advancing toward a final ruling expected in 2026. The key legal development: the court is treating the AI's screening decisions as attributable to the employers who deployed Workday — not solely to Workday as vendor.

Does this apply to staffing firms, or just direct employers?

Both. The Eightfold AI lawsuit specifically names staffing agencies and employers using the platform as co-defendants. Staffing firms are in the employment-decision chain by definition — they screen, rank, and recommend candidates for placement. If your firm uses an AI tool at any point in that chain, you are a potential co-defendant in an FCRA or discrimination claim. The fact that you licensed the tool rather than built it is not a shield against liability.
