Introduction

In May 2025, the US District Court for the Northern District of California granted preliminary collective certification in an age-discrimination case involving Workday, Inc.'s AI-based hiring tool [5] [6] [8]. The case, initiated by Derek Mobley in February 2023 and joined by other plaintiffs, highlights the potential for systematic discrimination in AI-driven hiring processes, particularly against older, Black, and disabled job seekers [1] [2] [3] [7].

Description

In May 2025, the US District Court for the Northern District of California granted preliminary collective certification under the Age Discrimination in Employment Act (ADEA) to a group of job applicants in an age-discrimination case involving an AI-based hiring recommendation tool from Workday, Inc. [5]. The lawsuit, initiated by Derek Mobley in February 2023 and joined by four other plaintiffs, alleges systematic discrimination against job seekers aged 40 and over, as well as against Black and disabled individuals [1] [3] [4] [5] [7]. The plaintiffs claim they have faced over 100 job rejections since 2017 due to the biased outcomes of Workday's AI-driven applicant screening tools, often receiving automated rejections shortly after applying, which they argue indicates a lack of human review [3]. The certified collective includes individuals aged 40 and older who were denied job recommendations through Workday's platform from September 24, 2020, to the present [4]. The plaintiffs assert violations of the ADEA, Title VII of the Civil Rights Act, and the ADA Amendments Act (ADAAA) [1] [5]. They contend that claims of disparate impact from algorithmic hiring systems can be treated collectively, even when individual applicants sought different roles at various companies that may have used the AI features in different ways [2] [8]. To establish a claim of disparate impact under the ADEA, plaintiffs must demonstrate a significant adverse effect on the protected age group, identify the relevant employment practices, and show a causal link between those practices and the impact [4].
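The "significant adverse effect" element is often screened with the EEOC's four-fifths rule, under which a protected group's selection rate below 80% of the most-favored group's rate is treated as preliminary evidence of adverse impact. The sketch below illustrates that calculation; the applicant counts are hypothetical, not figures from the case.

```python
# Illustrative sketch of the EEOC "four-fifths rule," a common heuristic
# for the "significant adverse effect" element of a disparate-impact claim.
# All counts below are hypothetical, not data from Mobley v. Workday.

def selection_rate(hired: int, applied: int) -> float:
    """Fraction of applicants in a group who were selected."""
    return hired / applied

def adverse_impact_ratio(protected_rate: float, comparator_rate: float) -> float:
    """Ratio of the protected group's selection rate to the comparator's.
    Under the four-fifths rule, a ratio below 0.8 is treated as
    preliminary evidence of adverse impact."""
    return protected_rate / comparator_rate

# Hypothetical screening outcomes: applicants aged 40+ vs. under 40.
over_40 = selection_rate(hired=30, applied=500)    # 0.06
under_40 = selection_rate(hired=60, applied=400)   # 0.15

ratio = adverse_impact_ratio(over_40, under_40)    # 0.40 -> below 0.8
print(f"Impact ratio: {ratio:.2f} -> {'flagged' if ratio < 0.8 else 'not flagged'}")
```

The four-fifths rule is only a screening device; as the discussion below notes, litigants typically must go further and tie the disparity to a specific practice.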

The court’s ruling allows the case to proceed to class discovery, enabling plaintiffs to notify similarly situated individuals who can opt in and have their claims heard collectively [2]. Judge Rita Lin’s decision dismissed Mobley’s claim that Workday acted as an employment agency but allowed liability under an agency theory to proceed. The Equal Employment Opportunity Commission (EEOC) supported this position, asserting that Workday could be considered an agent of its client-employers and thus fall within the definition of an employer under discrimination laws [1] [5]. This decision underscores that delegating hiring functions to AI does not exempt employers from liability for discrimination [5].

Legal experts emphasize the implications of this case for labor law, as it establishes a precedent for holding firms accountable for discrimination when using AI in hiring [5]. The historical development of agency theory in employment discrimination law indicates that entities influencing employment decisions can be held liable, as seen in landmark cases like Association of Mexican-American Educators v. California and Williams v. City of Montgomery [5]. However, challenges remain in proving discrimination in AI-driven hiring, particularly under the disparate impact doctrine, which requires plaintiffs to demonstrate that an algorithm disproportionately affects protected groups [5]. While the Mobley ruling allows such claims to proceed, establishing disparate impact is complex and often requires extensive statistical evidence [5]. Employers may also invoke the business necessity defense, arguing that the challenged practice is job-related and consistent with business necessity [5].
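The statistical evidence in disparate-impact litigation commonly includes significance testing of the disparity in selection rates, since courts often look for differences on the order of two to three standard deviations. A minimal sketch of such a test, a two-proportion z-test with a pooled standard error, is below; the counts are hypothetical, not data from the case.

```python
# A minimal sketch of the kind of statistical evidence disparate-impact
# plaintiffs typically present: a two-proportion z-test comparing the
# protected group's selection rate with a comparator group's.
# All counts are hypothetical, not data from Mobley v. Workday.
import math

def two_proportion_z(hired_a: int, n_a: int, hired_b: int, n_b: int) -> float:
    """z-statistic for the difference between two selection rates,
    using the pooled-proportion standard error."""
    p_a, p_b = hired_a / n_a, hired_b / n_b
    pooled = (hired_a + hired_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: 30 of 500 applicants aged 40+ advanced,
# versus 60 of 400 applicants under 40.
z = two_proportion_z(30, 500, 60, 400)
# |z| > 1.96 corresponds to significance at the 5% level (two-sided).
print(f"z = {z:.2f}")  # ≈ -4.47, well beyond two standard deviations
```

A disparity this large would support the first element of a disparate-impact claim, but plaintiffs must still identify the specific practice responsible and establish causation.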

Some scholars propose interpreting Title VII discrimination through a negligence framework, which does not require intent to discriminate [5]. However, the opaque nature of AI complicates establishing negligence, as algorithmic harm is difficult to foresee [5]. Legislative efforts, such as New York City’s algorithmic bias audit law, aim to address these issues by mandating audits and transparency for automated employment decision tools, though enforcement remains problematic [5].

Conclusion

Ultimately, the Mobley case represents a significant but limited advance in addressing AI liability in employment discrimination [5]. While it reinforces that employers and AI vendors cannot evade liability by outsourcing hiring decisions to automated systems, it does not clarify the standards for proving discrimination [5]. The ruling underscores the need for accountability in the use of AI in hiring, in line with the equal-opportunity principles established by civil rights legislation [5]. Employers are advised to scrutinize the filters and scoring methods used in automated screening, particularly those that are less visible within applicant tracking systems, so that hiring decisions are fair and can be justified [7].

References

[1] https://news.bloomberglaw.com/litigation/workday-ai-bias-suit-to-go-forward-as-age-claim-class-action
[2] https://www.jdsupra.com/legalnews/ai-screening-tools-under-scrutiny-9480130/
[3] https://www.mercurynews.com/2025/05/22/workdays-hiring-tech-prevented-people-from-getting-hired/
[4] https://www.hrdive.com/news/workday-ai-bias-lawsuit-class-collective-action/748518/
[5] https://hulr.org/spring-2025/ai-as-an-employment-agent-what-mobley-v-workday-addresses-and-what-it-doesnt
[6] https://www.law360.com/articles/2344393/collective-cert-in-age-bias-suit-shows-ai-hiring-tool-scrutiny
[7] https://www.linkedin.com/pulse/when-ai-meets-employment-law-latest-mobley-v-workday-ben-hawkes-qrgte/
[8] https://privacy-daily.com/news/2025/05/22/Calif-Court-Ruling-on-Age-Discrimination-May-Impact-AI-Hiring-Technology-Lawyers-Say-2505220027