[Illustration: a figure sitting across from a glowing screen in a dim room, being evaluated by something unseen.]

The Invisible Interview

Over a billion job applicants received a score. None of them knew it existed. Two lawsuits are about to change what that means.


The Brief

Two lawsuits against AI hiring vendors Workday and Eightfold AI are forming a legal pincer that could reshape algorithmic screening. One targets discriminatory outcomes under the ADEA. The other targets secret profiling under the 1970 Fair Credit Reporting Act. Together they expose how AI vendors cap liability while employers bear legal responsibility for algorithmic decisions they cannot audit or understand.


What is the Workday AI bias lawsuit about?
In Mobley v. Workday, plaintiff Derek Mobley applied to over 100 jobs at companies using Workday's platform over seven years and was rejected within minutes each time. A federal judge ruled that Workday acted as an agent of those employers, allowing age discrimination claims under the ADEA to proceed against the vendor directly.
What did Eightfold AI allegedly do with job applicant data?
A lawsuit alleges Eightfold AI scraped social media profiles, location data, internet activity, and tracking data on over a billion workers, generating Match Scores from zero to five. Low-scoring candidates were filtered out before any human saw their applications, and none were notified.
How does the Fair Credit Reporting Act apply to AI hiring?
The 1970 FCRA requires consumer reporting agencies to disclose when they compile reports used for consequential decisions. Plaintiffs argue that AI Match Scores function exactly like credit scores, compiled from third-party data and shared without the subject's knowledge, triggering FCRA obligations.
Who is liable when AI hiring tools discriminate?
According to the National Law Review, 88 percent of AI vendors cap their liability at the monthly subscription fee, and only 17 percent warrant regulatory compliance. Employers are legally responsible for outcomes generated by data they cannot audit, processed through logic they cannot understand.

You've applied for a job. Maybe dozens. You polished the resume, tailored the cover letter, hit submit. Then... nothing. No interview. No rejection letter explaining why. Just silence, repeated across weeks and applications until you stop counting.

Derek Mobley stopped counting at a hundred. Over seven years, Mobley applied to more than a hundred positions at companies using Workday's hiring platform. Rejected within minutes each time.[1][2] Not by a person. By a score they never saw, generated by a system they didn't know existed.

Mobley's not alone. I've been reading about a separate case that stopped me. A lawsuit alleges that Eightfold AI scraped social media profiles, location data, internet activity, and tracking data on over a billion workers.[1] The system generated "Match Scores," ranking each person zero to five. Candidates who scored low were filtered out before any human reviewed their application. None of them were told.
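The mechanism alleged here is simple enough to sketch. A minimal illustration, assuming nothing about Eightfold's actual system: every name, score, and the cutoff below are hypothetical, and the lawsuit describes the 0-to-5 "Match Scores" but not any internal threshold.

```python
# Hypothetical sketch of silent score-based pre-filtering.
# The function, names, scores, and cutoff are all illustrative,
# not drawn from any real vendor's system.

def prefilter(candidates, cutoff=2.5):
    """Return only candidates whose match_score clears the cutoff.

    Everyone below the cutoff vanishes before human review,
    with no notification that a score ever existed.
    """
    return [c for c in candidates if c["match_score"] >= cutoff]

applicants = [
    {"name": "A", "match_score": 4.2},
    {"name": "B", "match_score": 1.8},  # silently dropped
    {"name": "C", "match_score": 3.1},
]

reviewed = prefilter(applicants)
```

The point the sketch makes concrete: candidate B is never reviewed, never rejected by a person, and never told a score was the reason.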

Over a billion job applicants received a score. None of them knew it existed.

[Illustration: a document with a large number on it, similar cards fading into the background.] A billion scores. Zero notifications.

The thing is, the lawsuit against Eightfold doesn't even argue the algorithm was biased. It argues the algorithm was secret.[1] Former EEOC chair Jenny Yang brought the case under the Fair Credit Reporting Act, a 1970 consumer protection law written for credit bureaus. Because Match Scores function exactly like credit scores. Compiled from third-party data. Used to make consequential decisions. Shared without the subject's knowledge. The law that may reshape algorithmic hiring was written when hiring meant shaking hands and looking someone in the eye. Fifty-six years later, the shoe fits.

And this is what keeps nagging at me. A recruiter posts a job looking for an "energetic team player." Innocent enough. But the algorithm needs a proxy for "energetic." It finds age. It finds social media activity. It finds things the recruiter never asked about and wouldn't have considered. Nobody instructed the AI to discriminate. Nobody had to. The system found the pattern on its own, buried it in a score, and moved on.
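How a system discriminates without being told to is easier to see with numbers. A minimal sketch of proxy discrimination, assuming a made-up scoring rule: the function never receives age, but "years since graduation" correlates almost perfectly with it, so ranking on that feature ranks by age anyway. Everything here (the feature, the weights, the profiles) is hypothetical.

```python
# Hypothetical proxy-discrimination illustration. "age" is never an
# input to the score -- the correlated feature does the work instead.

def energy_score(profile):
    # A made-up rule for "energetic": fewer years since graduation
    # scores higher. Nobody wrote "prefer younger candidates";
    # the proxy feature encodes it anyway.
    return max(0.0, 5.0 - 0.15 * profile["years_since_graduation"])

younger = {"age": 28, "years_since_graduation": 5}
older = {"age": 55, "years_since_graduation": 32}
```

Run the two profiles through and the younger candidate wins by four points, on a zero-to-five scale, without age ever appearing in the code.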

Human bias is retail. One interviewer has a bad day, one candidate gets overlooked. Algorithmic bias is wholesale.[1] When the filter is wrong, it's wrong for everyone who passes through it, and nobody on either side of the process can see it happening.

[Illustration: a person standing before a closed door, unable to see what's on the other side.] The rejection arrived in minutes. The reason never arrived at all.

The legal pincer is closing from both sides. In Mobley v. Workday, Judge Rita Lin ruled that Workday acted as an "agent" of the employers using its screening tools, allowing the discrimination claims under the Age Discrimination in Employment Act to proceed against the vendor directly.[3][1] In the Eightfold case, the theory is different: the vendor is a "consumer reporting agency" subject to transparency mandates.[1] One attacks outcomes. The other attacks process. Both point the same direction.

Nobody's talking about the vendor contracts. Eighty-eight percent of AI vendors cap what they'll pay if something goes wrong, often at one month's subscription fee.[1] The employer can't control the outcomes. Can't see the data. And nobody in the process understands the logic. The lawsuit names the employer.

In 1972, you could walk into a credit bureau and ask to see your file. In 2026, Erin Kistler, one of the Eightfold plaintiffs, just wants to know why a billion-dollar algorithm keeps saying no.[1] "I think I deserve to know what's being collected about me and shared with employers," Erin said, as quoted in the National Law Review. "And they're not giving me any feedback, so I can't address the issues."

I keep coming back to that. Erin's not asking for much. Just the thing your grandmother could do at the credit bureau in 1972. The technology leapt fifty-six years forward. The transparency went backward.


References


  1. National Law Review (2026). "AI Hiring Under Fire: What the Eightfold Lawsuit Means for Every Employer Using Algorithmic Screening." National Law Review

  2. Golden, R. (2026). "Workday takes partial loss as judge refuses to dismiss claims in AI bias lawsuit." HR Dive

  3. Zheliabovskii, R. (2026). "Judge Allows Key Claims to Proceed in Workday AI Bias Lawsuit." SHRM

