What's Up with AI and HR Tech Litigation?
by Heather Bussing
I’m giving a talk called “A frolic through the issues of 21st century employment law” at the RLX Summit for TA leaders. (Someone more clever than I came up with the title. Frolic is such a great word.)
Here’s what might have been on slides but isn’t because I hate them and nobody wants to look at a large screen first thing in the morning. PowerPoint is a terrible way to start the day and anti-frolic.
AI and Hiring Tech Litigation Cheat Sheet
Mobley v. Workday (Filed 2024 in federal court for the N. Dist. of California)
Issue: Whether Workday is liable for discrimination because its software rejected an applicant in the middle of the night, presumably without any human ever reviewing the application, and its sorting function is alleged to be biased based on age and race.
Current status: The Plaintiff is trying to convert this to a class action, and the parties are still arguing over what the viable claims in the Complaint should be. Workday’s last challenge to the Complaint resulted in the court giving the Plaintiff another shot at amending it.
The court has already held that Workday was properly alleged to be an “agent” of the employer under Title VII and the Plaintiff can allege liability on that basis. Title VII generally allows applicants to sue any party who makes hiring decisions, including staffing agencies, third party recruiters, and now potentially tech companies.
We don’t know enough facts to have any sense about whether it will work or not. For example, we don’t know what the settings were on the software, who set them, what testing was done by Workday, whether they had any idea that the outcomes of any auto-sort of resumes were biased, what testing and mitigation was done, or why the Plaintiff was rejected.
We also don’t know what the contract between Workday and the employer says about who is liable if one of them is sued.
In a recent piece I wasn’t sure why Workday was making some less than great arguments about disparate impact and age claims. Since then, I’ve decided it’s because the number and types of claims will inform who would be in the class for any class action and it’s in Workday’s interest to limit the class as much as possible.
This case is expected to drag on for a long time and I’ll have a better analysis of what will happen when I actually understand what the evidence is. Pesky facts.
Kistler v. Eightfold AI (Filed Jan 2026 in California state court for Contra Costa County)
This is an applicant lawsuit against Eightfold AI claiming that it violated the Fair Credit Reporting Act by gathering background and other data about candidates and providing it to employers for use in hiring.
Eightfold and a whole bunch of other HR Tech companies that gather information about people and then sell it to employers are probably (really likely) subject to the FCRA, which requires them to give notice to the people they are gathering information about, obtain the subjects’ consent to provide it to employers, and give the subjects an opportunity to correct incomplete or inaccurate information. That didn’t happen.
Of course it didn’t happen, the tech that grabs this information off the internet has no interest in who the people are and everyone assumed that because the information was publicly available it was no big deal. But the FCRA doesn’t really care where you got the information. If you collect it and sell it to employers for use in employment decisions, you are probably on the hook.
The trouble is that the hook is puny. The remedies under the FCRA are mostly statutory fines that go to the government, not the people claiming harm. Plaintiffs can recover $100 to $1,000 per violation, depending on the number of violations and whether they were intentional.
But for one violation, the damages are $100. So this all depends on a class action too, which should not be that hard.
The other problem is that the applicants still have to prove that they didn’t get the job because of the information or ranking Eightfold AI provided to the employer. There are approximately eleventy-seven gazillion reasons, mostly perfectly legal, why people don’t get jobs. But it will be a complete pain in the ass for the employers, who have all the data and information and will get dragged into discovery.
The other thing is that meeting the requirements of notice and consent under the FCRA is not that hard and can be automated.
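To make the point concrete, here is a minimal sketch of what an automated notice-and-consent gate could look like in a hiring data pipeline. Everything here is hypothetical, not any vendor’s actual system or a substitute for legal advice; it just shows that recording notice, consent, and open disputes before sharing a report is ordinary bookkeeping, not hard engineering.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class Candidate:
    # Hypothetical record of a candidate's FCRA-style paper trail.
    email: str
    notice_sent_at: Optional[datetime] = None
    consent_given_at: Optional[datetime] = None
    disputes: List[str] = field(default_factory=list)

def send_notice(c: Candidate) -> None:
    # Record when the notice of data collection was delivered.
    c.notice_sent_at = datetime.now(timezone.utc)

def record_consent(c: Candidate) -> None:
    # Consent only counts if the notice went out first.
    if c.notice_sent_at is None:
        raise ValueError("cannot record consent before notice")
    c.consent_given_at = datetime.now(timezone.utc)

def may_share_report(c: Candidate) -> bool:
    # Block the report to the employer unless notice and consent
    # are on file and no accuracy dispute is still open.
    return (
        c.notice_sent_at is not None
        and c.consent_given_at is not None
        and not c.disputes
    )

c = Candidate("applicant@example.com")
assert not may_share_report(c)  # nothing on file yet, so no sharing
send_notice(c)
record_consent(c)
assert may_share_report(c)      # gate passes once the trail exists
```

The design choice that matters is that the gate is checked at the moment of sharing, so a candidate who files a dispute after consenting gets the report held back until it’s resolved.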
So I don’t see this one as a big deal, except for employers who don’t want to have to deal with records subpoenas.
D.K. v. Intuit and HireVue (Filed 2024 with the Colorado Civil Rights Division)
This one is bad.
An internal candidate at Intuit was applying for a promotion, went through a HireVue automated interview process, and was rejected. There was no question the candidate was qualified and had a great performance record at Intuit.
But the HireVue feedback was that the candidate needed work on providing clear and direct answers, had issues with effective communication, and should focus on “active listening.”
The candidate was deaf.
She communicated with both sign language and speech, but the transcripts of the interview were a mess because the system was not designed to accommodate candidates with speech or hearing disabilities.
This one will not end well. But because it is being handled by an administrative agency, we don’t have a lot of visibility into the process or status. I’m sure it will settle.
The upshot of these cases is that some tech can cause discrimination and tech companies may be liable, but it’s hard to say at this point because proving causation is going to be hard no matter what.
If you are an employer:
Make sure the tech you use is designed to accommodate people with disabilities and tested for bias (all sorts) before you implement it.
Read your tech contracts and know what happens if either of you gets sued.
Check your insurance policies and make sure they cover discrimination claims.
Monitor your outcomes and if you see problems, investigate and fix if possible.
Keep records on why candidates were rejected; it can be a drop-down, check-the-box reason. But have a reason.
Everything is on the record now. So know what’s going on and be prepared to explain it.
Look around your work site and see whether there are people of various colors, genders, and abilities, and whether you have a variety pack of managers and leaders. If not, there’s a glitch in your hiring and promotion processes.
The bottom line is don’t discriminate. And don’t let your tech discriminate.