TL;DR: Job seekers are suing an AI hiring tool called Eightfold for allegedly compiling secretive reports that help employers screen candidates. Why is this illegal? The same reason credit rating agencies have to tell you why they dinged your score, the lawsuit claims. If the courts buy this logic, it could start to reshape the black-box world of AI hiring.

What happened: Like many people who’ve played the job search numbers game lately, the plaintiffs were sick of applications seemingly vanishing into a void. They filed a class-action suit against Eightfold, which is used by major companies like Microsoft and PayPal for vetting potential hires. The lawsuit claims that Eightfold violated the Fair Credit Reporting Act and a similar California consumer protection law by not letting applicants view information about them and correct the record if needed. “Eightfold’s technology lurks in the background of job applications,” the lawsuit alleges, “collecting personal data, such as social media profiles, location data, internet and device activity, cookies and other tracking.”

Eightfold disputes this: The tool “operates on data intentionally shared by candidates or provided by our customers. We do not scrape social media and the like,” spokesperson Kurt Foeller told us. “Eightfold believes the allegations are without merit.” What isn’t disputed is that Eightfold uses AI to produce a score between zero and five, ranking how well a candidate fits a given job.

Why it matters: Companies now use a whole slew of behind-the-scenes AI tools to find and evaluate candidates. Candidates are playing the game, too, using their own AI tools to find jobs and craft applications. It’s AI all the way down. “We are at a point where AI hiring tools are being adopted very quickly, often faster than companies are building the compliance, auditing, and governance structures needed to use them responsibly,” the attorneys on the case, Jenny R. Yang and Christopher M. McNerney, partners at Outten & Golden LLP, told us in an email. “That creates real risk—not only of inaccurate decisions, but also of hidden discrimination.” Some states—and New York City—have laws governing these tools, largely focused on their potential for bias and discrimination. But AI decision-making still happens mostly without job seekers’ knowledge. This isn’t the first time the Fair Credit Reporting Act has been used to challenge big data hiring systems, according to Pauline Kim, an employment law professor at the Washington University School of Law—but it is new for one of these cases to focus on AI.

What this means for you: If the lawsuit is successful—which could take years—AI hiring tools might be more upfront about what data they collect and work harder to ensure accuracy, Kim said. But the 55-year-old law the suit relies on might also not fully capture modern usage. The real significance, according to Kim, is that companies relying on these tools would have to be more transparent about their use. “Because the law was written in an earlier era, however, even if courts apply it, it will provide only limited transparency—likely not enough to ensure the fairness of these systems.”

—PK