When AI Decides Who Gets Hired: What the FCRA Case Against Eightfold AI Means for Job Applicants

By: The Schlanger Law Group Legal Team 


Imagine applying for a job at a major company. You tailor your resume, write a cover letter, and submit your application through what looks like the company’s own hiring portal. You wait. You never hear back. You assume a recruiter reviewed your materials and moved on.

But that’s not necessarily what happened. Increasingly, before any human being looked at your application, an artificial intelligence tool collected data about you: not just your resume, but information scraped from the internet about your publications, your conference appearances, and your job history, and potentially your online browsing activity, your location data, and the cookies on your device. The AI fed that information into a proprietary algorithm, drew inferences about your personality, your aptitudes, and your predicted behavior, compared those results against the results for other applicants to this and other positions, and then assigned you a numerical score. That score determined whether a human ever looked at your application at all. You never knew any of this was happening. You never consented to it, never saw your score, and were never told it was used against you.

This isn’t hypothetical. A class action lawsuit filed in January 2026 alleges this is exactly how Eightfold AI, one of the largest AI hiring platforms in the country, operates, and that it violates a federal consumer protection law that has been on the books since 1970.

AI in Hiring: A Legitimate Tool with a Legal Problem

It’s worth starting with a fair point: the appeal of AI hiring tools to employers is understandable. Finding the best possible employees is a legitimate business goal, and the traditional alternatives are not great. Having a recruiter manually Google every applicant is inefficient, inconsistent, and doesn’t scale. (In addition, there is no reason to think less systematized candidate research is less prone to bias; indeed, it may be more prone.) A systematic, data-driven approach to matching candidates to roles is a reasonable aspiration, and it’s no surprise that employers have moved aggressively in this direction. According to a 2025 Resume.org report, 57% of companies already use AI somewhere in their hiring process. SHRM reported that AI use across HR tasks jumped from 26% of organizations in 2024 to 43% in 2025. Industry forecasts project that by 2026, roughly 80% of large enterprises will be using AI for significant parts of their hiring workflow.

But there’s a disconnect. A Pew Research Center survey found that 71% of Americans oppose AI being used to make final hiring decisions, and 66% said they would not want to apply for a job with an employer that uses AI in the hiring process at all. That skepticism likely reflects something deeper than discomfort with new technology. Most people expect to be evaluated as individuals when they apply for a job. The idea that every cookie you’ve accepted and every link you’ve clicked is being aggregated into a score that determines whether you even get an interview, and that you might never be told this information is being assessed by a potential employer, runs against long-standing expectations about how the hiring process should work.

The Fair Credit Reporting Act suggests that expectation has the force of law behind it.

What Is Eightfold AI?

Eightfold AI describes itself as a “talent intelligence platform.” According to its own website, it maintains more than 1.6 billion career profiles and maps over 1.6 million skills. More than 100 organizations in 155 countries use the platform, including major employers like Microsoft, PayPal, Starbucks, and Chevron. The company was founded in 2016, reached a $2.1 billion valuation in 2021, and has raised over $410 million in total funding.

In January 2026, two job applicants, Erin Kistler and Sruti Bhaumik, filed a class action complaint against Eightfold in the Northern District of California. The case is Kistler v. Eightfold AI, Inc., Case No. 3:26-cv-00559. The complaint, filed by the plaintiffs’ firm Outten & Golden LLP, alleges that Eightfold operates as an unregistered consumer reporting agency in violation of the FCRA and California’s Investigative Consumer Reporting Agencies Act (ICRAA).

According to the complaint, here is what Eightfold does: When a job applicant submits an application through an employer’s Eightfold-powered portal, the platform collects the applicant’s resume data and then supplements it with information gathered from other sources. The complaint alleges these sources include public data (blogs, publications, conference appearances, job application history) as well as non-public data (location information, internet and device activity, cookies, and tracking data). Eightfold’s own privacy policy states that it draws inferences about applicants’ “preferences, characteristics, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.” The platform then generates a Match Score, a rating on a 0-to-5 scale representing the applicant’s “likelihood of success,” and provides that score to the employer.

The plaintiffs allege they applied for jobs at Microsoft and PayPal through URLs containing “eightfold.ai,” received no disclosure that Eightfold existed or would be evaluating them, gave no written authorization, never saw their Match Scores, and received no adverse action notices when they were not selected. Kistler alleges that only 0.3% of thousands of applications she submitted through Eightfold-powered portals progressed to a follow-up or interview.

What Is the Fair Credit Reporting Act, and Why Does It Apply Here?

The name is misleading. Despite containing the word “credit,” the Fair Credit Reporting Act, enacted by Congress in 1970, does not just cover credit reports. It covers consumer reports, which the statute defines broadly as any communication by a consumer reporting agency bearing on a person’s “character, general reputation, personal characteristics, or mode of living” when used for employment, credit, insurance, or other qualifying purposes.

A consumer reporting agency under the FCRA is any entity that regularly assembles or evaluates consumer information for the purpose of furnishing consumer reports to third parties. This is a functional definition. It doesn’t matter what a company calls itself; what matters is what it does.

The Kistler complaint argues that Eightfold meets this definition: it assembles consumer data (resumes, publicly available information, and allegedly tracked online activity), evaluates that data through a proprietary AI model, generates reports (Match Scores with accompanying assessments), and furnishes those reports to employers for the purpose of making hiring decisions.

One point that deserves emphasis: the FCRA covers compilations of publicly available information. There is no “public records exception” that would exempt Eightfold simply because some of the data it collects is publicly accessible. That said, the complaint’s allegation that Eightfold also collects non-public data (cookies, device tracking, location data, internet activity) makes the consumer reporting agency designation even less of a stretch, and certainly makes the optics more plaintiff-friendly. There is a spectrum here: collecting someone’s published articles and conference appearances is one thing; tracking their browsing habits and physical location is another. The FCRA arguably covers both, but the non-public data underscores the scope of what applicants allegedly never knew was happening.

What the FCRA Requires for Employment Screening, and What Allegedly Didn’t Happen

The FCRA’s employment provisions are among the most protective in the statute. They impose specific obligations on both the consumer reporting agency and the employer at every stage of the process. Here is what the law requires, contrasted with what the Kistler complaint alleges occurred.

Disclosure and authorization. Before an employer obtains a consumer report for employment purposes, Section 1681b(b)(2) of the FCRA requires a clear, standalone written disclosure to the applicant and written authorization from the applicant. The disclosure must be in a document that consists solely of the disclosure; it cannot be buried in an employment application or bundled with other forms. This standalone disclosure requirement has been a major source of litigation, generating hundreds of millions of dollars in settlements over the past decade against employers who got the form wrong. What the complaint alleges happened here is more fundamental: the applicants received no disclosure at all. They did not know Eightfold existed.

CRA certification. Before furnishing a consumer report for employment purposes, Section 1681b(b)(1) requires the consumer reporting agency to obtain a certification from the employer that the employer will comply with FCRA requirements, specifically that it has provided the required disclosures, obtained authorization, and will follow the adverse action process. The complaint alleges no such certifications were obtained.

Accuracy. Section 1681e(b) requires consumer reporting agencies to follow reasonable procedures to assure “maximum possible accuracy” of the information in consumer reports. Traditional background check companies, even with human review, produce reports containing errors at significant rates. The Consumer Financial Protection Bureau has estimated that one in four background checks contains an error, and employment-related inaccuracies account for nearly 30% of all FCRA complaints. The Kistler complaint raises the question of what “maximum possible accuracy” means when an AI system is drawing inferences about a person’s personality, aptitudes, and predicted future behavior from billions of aggregated data points, and the applicant has no opportunity to review or correct any of that information before it is used.

The adverse action process. When an employer takes adverse action based on a consumer report (including deciding not to hire someone), the FCRA requires a two-step process. First, the employer must send a pre-adverse action notice that includes a copy of the consumer report and a summary of the applicant’s rights under the FCRA. The employer must then wait a reasonable period (industry practice is typically five business days) to allow the applicant to review the report and identify any errors. Only then may the employer send a final adverse action notice confirming the decision. Even if every piece of information in the report is accurate, failing to follow this process is a separate violation. The complaint alleges that none of this happened: no pre-adverse action notice, no copy of the Match Score, no summary of rights, no waiting period, no final adverse action notice. Applicants who were screened out simply never heard back.

Consumer access. Section 1681g gives consumers the right to access the information in their files at a consumer reporting agency. The complaint alleges applicants had no meaningful ability to access or review the reports Eightfold generated about them.

This Isn’t Entirely New Territory

The application of the FCRA’s consumer reporting framework to AI hiring tools is a new issue in the field of FCRA litigation, but there is relevant precedent.

In 2015, a class action was filed in the Northern District of California against a company called TalentBin (later acquired by Monster Worldwide). TalentBin scraped publicly available data from platforms like GitHub and Stack Overflow to create candidate profiles with skill rankings, and then sold those profiles to recruiters and employers. The case, Halvorson v. TalentBin, Inc., Case No. 3:15-cv-05166, alleged FCRA violations: no certifications from employers, no disclosures to candidates, and no consumer access to the assembled profiles. The case settled for $1.15 million, and the settlement required TalentBin to change its practices to comply with the FCRA.

The Kistler complaint presents a more developed version of this theory. Where TalentBin scraped public profiles and ranked observable skills, Eightfold allegedly uses a proprietary deep-learning AI model processing billions of data points, generates predictive scores about personality and future performance, and incorporates non-public tracking data, a significantly broader scope of data collection and inference.

The Consumer Financial Protection Bureau also weighed in on this question before the current administration took office. In October 2024, the CFPB issued Circular 2024-06, which explicitly stated that an entity collecting consumer data to train an algorithm that produces scores or assessments for employers could meet the statutory definition of a consumer reporting agency under the FCRA. That circular was withdrawn in May 2025 as part of a broader rollback of 67 CFPB guidance documents. But it is important to understand what the withdrawal does and does not mean: the CFPB withdrew its guidance; Congress did not amend the statute. The FCRA’s definitions remain exactly as Congress enacted them.

A Familiar Pattern: New Technology, Existing Consumer Protections

Practitioners in consumer protection law may recognize a pattern here. When peer-to-peer payment platforms like Zelle and CashApp emerged, they represented a genuine technological advance, applying new technology to a payment space that had not seen much innovation. The business opportunity was enormous, and many of these platforms initially took the position that the Electronic Fund Transfer Act (https://consumerprotection.net/electronic-fund-transfer-act-101/) did not apply to them at all. They characterized themselves as software companies, not financial institutions, and argued that existing consumer protection regulations were designed for a different era and a different set of actors.

That position has largely been abandoned. After years of regulatory guidance and litigation, the P2P payment industry has mostly stopped challenging whether the EFTA covers their platforms. (Some platforms, like Zelle, now concede that the transactions are covered but argue that liability runs to the banks offering the service rather than to the network itself, a different kind of argument, but no longer a blanket denial of coverage.) The EFTA did not prohibit peer-to-peer payments. It required error resolution procedures, authorization requirements, and consumer liability protections. The technology adapted to the legal framework that had always applied.

It is too early to know whether AI hiring tools will follow the same trajectory, but the parallels are hard to miss. There are new actors in the space. These companies are focused on the business opportunity presented by applying AI and big-data capabilities to applicant assessment. Like the early P2P platforms, some may take the position that consumer protection statutes written decades ago were never meant to reach their technology. Lawsuits like Kistler v. Eightfold AI will test that position. If the courts agree that the FCRA applies, these companies will need to comply with the same framework that has governed consumer reporting for more than fifty years: tell applicants what data you are collecting, let them see it, give them a chance to correct errors, and tell them when it is used against them.

The Regulatory Patchwork, and Its Gaps

A handful of state and local jurisdictions have begun addressing AI in hiring directly, but their focus is different from the FCRA’s. New York City’s Local Law 144 requires bias audits and candidate notification when automated decision tools are used (though a December 2025 audit by the New York State Comptroller found the city’s enforcement system “ineffective,” with at least 17 instances of potential noncompliance missed among 32 companies reviewed). Colorado’s AI Act, effective February 2026, requires risk management policies and annual impact assessments for high-risk AI systems. Illinois has amended its Human Rights Act to prohibit discriminatory AI use in hiring and requires employer notification. These laws are significant, but they share a common focus on bias and discrimination. None of them address the FCRA’s distinct consumer reporting framework: the disclosure, authorization, accuracy, and adverse action requirements that apply when a third party assembles information about consumers and furnishes it to employers for hiring decisions.

The enforcement landscape adds another layer of uncertainty. The CFPB, which was actively issuing guidance supporting consumer protection in adjacent areas like the P2P payment space, is operating under very different leadership today. The withdrawal of the October 2024 circular on AI hiring tools signals a different posture. Whether the CFPB will weigh in on this issue going forward, and if so, on which side, is unclear. That reality makes the courts, rather than the regulatory apparatus, the more likely arena where the application of the FCRA to AI hiring tools will be resolved.

What This Means for Job Applicants

If you have applied for jobs online, particularly at large companies, there is a meaningful chance that an AI hiring tool was involved in evaluating your application, whether you knew about it or not. Here is what you should understand.

The FCRA’s employment protections do not depend on the applicant knowing the system exists. The burden is on the employer and the consumer reporting agency to comply: to provide disclosures, obtain authorization, follow accuracy procedures, and issue adverse action notices. If those steps were not taken, the violation occurred regardless of whether the applicant was aware of it at the time.

If you applied for a job and noticed that the application URL contained a third-party platform’s name (such as “eightfold.ai”), that may indicate your application was processed through an external AI system. If you were not hired and never received a copy of any report used in the decision or a summary of your rights under the FCRA, the adverse action process may not have been followed.

The damages available under the FCRA for employment violations include lost wages from the job you did not get, emotional distress, statutory damages of $100 to $1,000 per willful violation, punitive damages, and attorney’s fees. Under California’s ICRAA (the state-law claim also raised in the Kistler complaint), the statutory floor is the greater of actual damages or $10,000, plus attorney’s fees and potential punitive damages. And because the FCRA provides for attorney’s fees to be paid by the defendant, these cases can be brought on a contingency basis without out-of-pocket cost to the applicant.

What to Watch

The Kistler v. Eightfold AI case is in its early stages. Eightfold has denied key allegations, including that it scrapes social media data, and has stated that candidates can view and correct their data. How the court addresses the threshold question (whether Eightfold functions as a consumer reporting agency under the FCRA) will be significant for the entire AI hiring industry. If the answer is yes, other AI hiring platforms engaged in similar practices face the same exposure. FCRA lawsuits have been increasing sharply: filings rose 125% between 2014 and 2024, and 1,681 FCRA cases were filed in the first quarter of 2024 alone.

The broader question is how courts will apply a statute enacted in 1970 to technologies that did not exist when Congress wrote it. The FCRA’s drafters could not have imagined AI tools processing billions of data points to generate predictive scores about job applicants’ personalities. But the statute’s core requirements (transparency, accuracy, consent, and notice) were written broadly enough that they may not need to be rewritten to reach this new context. That is the question the Kistler case and others that will follow it will answer.

Contact Schlanger Law Group

Schlanger Law Group has represented victims of credit reporting errors and consumer protection violations since 2007. FCRA claims are one of our core practice areas. We typically represent victims on a contingency fee basis and handle cases nationwide. If you believe an AI hiring tool was used to evaluate your job application without your knowledge or consent, contact us today to discuss your options.
