Why the Eightfold lawsuit matters and doesn't
I’ve been waiting for this. It was just a matter of time before some attorney wondered whether gathering lots of data on people and using it for important decisions like hiring violates the Fair Credit Reporting Act.
This week some job candidates sued Eightfold AI for collecting data about them without telling them, without allowing them to know what’s being collected, and without giving them an opportunity to correct any wrong information.
What does the FCRA cover?
The FCRA applies not just to traditional credit reports, but also to pretty much all kinds of data collection about a person for employment decisions because the data collection falls under the definition of “consumer report.”
A consumer report is:
“any written, oral, or other communication of any information by a consumer reporting agency bearing on a consumer’s credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living which is used or expected to be used or collected in whole or in part for the purpose of serving as a factor in establishing the consumer’s eligibility for
(A) credit or insurance to be used primarily for personal, family, or household purposes;
(B) employment purposes; or
(C) any other purpose authorized under section 604 [§ 1681b].”
A consumer reporting agency is:
“any person which, for monetary fees, dues, or on a cooperative nonprofit basis, regularly engages in whole or in part in the practice of assembling or evaluating consumer credit information or other information on consumers for the purpose of furnishing consumer reports to third parties, and which uses any means or facility of interstate commerce for the purpose of preparing or furnishing consumer reports.”
“Employment purposes” means:
“a report used for the purpose of evaluating a consumer for employment, promotion, reassignment or retention as an employee.”
Basically, the FCRA applies any time somebody gathers data to be used in evaluating a candidate or employee for any employment decision and provides that data to an employer who pays them for it.
It requires notice that the data is being collected and provided to the employer, and the person has to consent to the use of that information in the employment decision.
This seems big until you realize that there is absolutely nothing in these laws that says that employers can’t rely on whatever data they get to make decisions. Even if they decide based on incomplete or incorrect information, they don’t have to change the decision. They just have to give the person the opportunity to correct the information, which won’t matter at that point anyway because the person already didn’t get the job. And even if the person asks for correction, there’s no incentive for anyone to actually do anything because the decision is over and the employer, who has the obligation, doesn’t actually control the data.
How does the CCPA apply here?
Enter the CCPA, the California Consumer Privacy Act, which also gives people the right to know what data is being collected about them as well as how it is used and shared. The CCPA also gives people the right to correct information about them or have it deleted. But except in limited circumstances—primarily at the point the data is collected directly from someone—there is no obligation for the entities collecting the data to disclose that they have it. The obligations are primarily on the consumer to ask, even though they have no way to know if it’s worth asking. Then the burden is all on the consumer to request correction or deletion.
From a candidate or consumer standpoint, these laws are mostly data privacy theater because they don’t prevent the bad things from happening. They just offer “rights” without remedies that actually do anything that matters.
What are the Plaintiffs asking for in this lawsuit?
The FCRA allows plaintiffs to collect damages for violations of the act. Statutory damages range from $100 to $1,000 per violation. To the extent a plaintiff can prove actual damages resulting from the violation, they could also recover those, along with potential punitive damages.
Similarly, the CCPA provides for statutory fines of up to $2,500 per violation for unintentional violations and $7,500 per violation for intentional (knowing) violations. But consumers can only recover up to $750 per violation.
The problem is that proving causation, that you did not get a job as a result of an FCRA or a CCPA violation, is almost impossible. If only you had received notice and an opportunity to correct information after the decision was made, then surely you would have been hired in the first place? Yeah, no. Candidate lawsuits are difficult to prove because there are a million perfectly reasonable reasons a candidate may not be selected, including that there were a bunch of other candidates that were closer to what the employer was looking for.
So this is really about the class action: making the class as big as possible, then piling on the statutory damages and attorneys’ fees.
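To see why class size is where all the leverage is, here is a back-of-the-envelope calculation using the statutory ranges discussed above. The class size and per-person violation count are hypothetical illustrations, not figures from the complaint:

```python
# Rough class-action exposure math using FCRA statutory damages
# ($100-$1,000 per willful violation). Class sizes and violation
# counts below are hypothetical, not taken from the Eightfold case.

def statutory_exposure(class_size: int, violations_per_person: int,
                       per_violation: int) -> int:
    """Total statutory damages across a class."""
    return class_size * violations_per_person * per_violation

# One plaintiff, one violation, at the statutory maximum:
single = statutory_exposure(1, 1, 1000)

# A hypothetical certified class of 50,000 candidates, same violation:
class_wide = statutory_exposure(50_000, 1, 1000)

print(f"Individual claim: ${single:,}")    # $1,000 -- hardly worth suing over
print(f"Class-wide claim: ${class_wide:,}")  # $50,000,000 -- worth fighting about
```

The individual claim is pocket change; multiplied across a large class, the same violation becomes an eight-figure number, which is why the fight is over class certification rather than the merits.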
Like most class actions, it’s about making the problem known, trying to influence how tech companies and employers operate, and collecting attorneys’ fees for the trouble.
But that’s also what civil litigation does. It moves money around after the bad things happen. The laws exist to try to prevent the bad things from happening by providing a deterrent for violating the requirements. But until there’s an actual lawsuit where someone may have to write a really big check, it’s all a theoretical assessment of the risk of getting caught.
For many tech startups, that’s a risk you deal with later after the product is launched and people are already using it. What could possibly go wrong?
Risk, compliance and culture
Risk and compliance are also culture issues. In some places, compliance is a given because the social contract and practical priorities are to reduce friction, make it easy to understand what is expected, and have reality conform to people’s expectations as much as possible.
This is why in Japan, the trains always run on time and compliance is generally baked into everything.
In the US, risk of getting caught and whether violating the law is more profitable than not is a business decision. After all, it just comes down to money, hassle, and luck. And there are plenty of people willing to roll the dice.
What will happen?
Some tech companies will see this lawsuit and decide that maybe the risk is bigger than they thought and start figuring out how to put the right stand-alone notice and consent forms into every job application ever. It’s not that difficult; the forms and processes already exist.
In this case, Eightfold will fight the class certification because that’s where all the money is. If they lose, they will settle. If they win, they will settle for a whole lot less. But since the fix is easy going forward, the strategy is to avoid public trials, bad decisions, and, even worse, bad precedent on appeal. This is a one-off that will hang around for a while and then disappear. There will be some other similar lawsuits riding on its coattails in the short term. Still, as a practical matter, this issue is going away because providing the notice and consent forms is not that big a deal.
For HR, the risk is bigger because once discovery opens and they have to produce documents related to every time they used a tech program in employment decisions, the costs, resources, and hassles are significant. Even though the potential employers are not being sued in the Eightfold case, they’re the ones with all the evidence. So they will have to hire staff and attorneys to deal with the subpoenas and records.
This is enough to make buyers cautious. And they’re already concerned about AI.
This is also a boon to legacy HR tech companies who have far more at stake if their products cause stress and liability to their customers. They tend to play it safe, bake compliance into everything they do, and not take on legal risk like it’s the Wild West. They are the adults in the room, playing the long game according to the rules. They will also likely be around tomorrow and a few years from now if you plan on actually using the products for a while—like maybe the whole contract term. (I’m a little fed up with companies that don’t really care about doing business and being companies. See also private equity.)
As for lawmakers, notice and an opportunity to maybe fix stuff after it’s too late is not enough. At the same time, making rules that give employers enough flexibility to run their companies while giving applicants and employees meaningful protections that are fair and have real consequences is difficult. But necessary.
We’re seeing lots of experimenting with notice and consent. We’re also seeing requirements for data and analysis and reporting, which I’m generally a fan of because it forces employers and tech companies to see the problems and actually do something to fix them.
In the meantime, we should all be more focused on transparency, trust, and integrity. Especially integrity.