The Wagner Law Group | Est. 1996


When Algorithms Discriminate: Two Landmark Cases That Redefine Employer Liability in AI Hiring

Jun 25, 2025

By Joshua Cook

As employers begin to embrace Artificial Intelligence (“AI”), offloading tasks such as resume screening and initial interviews, they open themselves up to new legal scrutiny. The utility of AI for the employer is indisputable: an employer can, for instance, sift through thousands of applicants by describing to an algorithm what the employer is looking for in a hire, and AI promises to surface the candidate whose background most closely aligns with the job description. It seems almost too good to be true, because it is. AI frequently injects at least the appearance of discrimination into the hiring process, which warrants a cautious approach by employers.

On May 16, 2025, Judge Rita Lin of the U.S. District Court for the Northern District of California gave the green light to a collective action in the groundbreaking case Mobley v. Workday, Inc. Several individuals over the age of 40, led by Derek Mobley, claim to have applied for hundreds of jobs with companies that use Workday’s AI-based hiring tools. These tools use algorithms to assess a candidate’s viability for the position in question, and Workday’s AI automatically rejects or advances candidates through the hiring process based on its evaluation. The plaintiffs say they were rejected every single time and allege age discrimination under the Age Discrimination in Employment Act (“ADEA”). Workday initially moved to dismiss the case, arguing that it was not the prospective employer and therefore could not be held responsible for employment decisions. The court disagreed and allowed the case to proceed in July 2024. The plaintiffs then sought to certify the case as a collective action, a mechanism similar to a class action, and that certification was granted on May 16, 2025.

What makes this case especially significant is that the plaintiffs are not suing the prospective employer, but rather the vendor – Workday – for creating and distributing allegedly discriminatory AI hiring tools. Historically, such claims would be brought against the prospective employer, often requiring plaintiffs to prove discrimination anew in each instance. Here, the plaintiffs argue that the vendor itself bears responsibility, even in the absence of any contractual relationship with the candidates.

Given this AI-driven shift in the employment landscape, employers must be diligent about AI compliance, especially because the intersection of AI and employment law is ripe for new compliance guidelines at both the federal and state levels. Mobley v. Workday, Inc. is significant because it demonstrates that a vendor in the hiring process may be directly liable to candidates for discriminatory outcomes produced by its AI-enabled applicant screening tools. The case underscores an important shift: courts are increasingly willing to treat algorithmic decision-making as subject to traditional employment discrimination laws, such as the ADEA. This signals that delegating hiring decisions to AI does not shield employers from liability; indeed, it may increase risk if proper audits, oversight, and due diligence are not conducted.

Further reinforcing this shift toward holding both employers and vendors liable for discriminatory outcomes produced by AI hiring tools, the American Civil Liberties Union (“ACLU”) has filed a complaint against Intuit and HireVue on behalf of a Deaf and Indigenous woman. The complaint alleges violations of the Americans with Disabilities Act (“ADA”), Title VII of the Civil Rights Act, and the Colorado Anti-Discrimination Act.

According to the complaint, D.K., a Deaf Indigenous woman, applied for a promotion to Seasonal Manager at Intuit after working for the company in various seasonal roles since 2019. As part of the 2024 application process, she was required to complete an asynchronous video interview through HireVue, a platform that relies on automated speech recognition and AI-driven scoring. D.K. requested a reasonable accommodation in the form of human-generated captioning, but Intuit denied her request, stating that HireVue’s built-in subtitles would be sufficient. In practice, however, portions of the interview lacked subtitles entirely, forcing her to rely on error-prone, browser-based auto-captioning, which hindered her ability to fully comprehend the questions. The ACLU’s complaint also cites research showing that automated speech recognition systems and AI scoring tools often perform worse for individuals who speak English with non-white accents, including Black, Hispanic, and Indigenous applicants, leading to disproportionately low scores and fewer opportunities for advancement.

As the landmark cases above demonstrate, the use of AI in hiring raises serious legal risks. AI is not immune to bias, and it does not shield the employer from claims of disparate impact. Employers must ensure that, when using AI tools in their hiring process, they do not expose themselves to potential discrimination claims. Employers should audit any AI systems they use for hiring, demand transparency from vendors, and carefully weigh the pros and cons of outsourcing decisions to AI algorithms in a world that has yet to lay down fundamental ground rules on the matter. AI is a powerful tool for employers, but it may also be a Sword of Damocles dangling overhead.

If your organization is considering or currently using AI tools in hiring, it is important to evaluate their compliance with anti-discrimination laws and cybersecurity and privacy best practices. At The Wagner Law Group, Joshua N. Cook advises clients on AI governance, cybersecurity risk, and regulatory compliance in employment contexts. Josh brings deep expertise in privacy law, algorithmic accountability, and risk mitigation strategies. For guidance on auditing your hiring technologies, vetting third-party vendors, or developing legally defensible AI practices, please contact Joshua N. Cook for tailored assistance.
