Workday, AI, and Employment Discrimination: What Employers and Candidates Need to Know

Career
August 16, 2025

Workday, AI, and Employment Discrimination: What’s Really at Stake

AI has crept into almost every corner of hiring—from resume screening to interview scheduling and candidate ranking. That convenience is now under a legal microscope. The highest-profile example: Mobley v. Workday, a putative class/collective action alleging that Workday’s AI-enabled screening tools led to discrimination based on race, age, and disability. In July 2024, a federal judge in California allowed key claims to move forward; in May 2025, the court conditionally certified an Age Discrimination in Employment Act (ADEA) collective, expanding the case’s potential scope. (Reuters; GovInfo; Proskauer)

Why this matters: The case tests when software vendors—not just employers—can face liability for discriminatory outcomes. The court indicated Workday could potentially be liable as an “agent” of employer-clients under federal anti-discrimination laws, a novel theory with big implications for HR tech. (Reuters; Workforce Bulletin)

Workday has denied the accusations, arguing that it is not the employer and that its tools are not designed to infer protected characteristics. Still, the litigation is proceeding, and discovery and certification battles are ongoing. (HR Dive; Fortune)

The Legal Landscape (Fast)

  • Title VII, the ADA, and the ADEA apply to AI, too. The EEOC has repeatedly warned that employers are responsible for outcomes when using algorithmic tools—even if a vendor built them. If a tool causes disparate impact, employers must show the practice is job-related and consistent with business necessity. (Law and the Workplace)
  • NYC Local Law 144 (AEDT). If you use automated employment decision tools for NYC-linked jobs, you must complete a bias audit, post a summary, and notify candidates. Noncompliance can trigger complaints and enforcement. (NYC.gov)
  • Vendor liability is evolving. In the Workday case, the court allowed claims to proceed on an agency theory even though Workday is not the employer—a signal that courts may scrutinize the design, training data, and deployment of hiring algorithms. (Reuters; Workforce Bulletin)

What “Bias” Looks Like in Practice

Algorithmic bias can creep in through:

  • Training data that reflect historical patterns (e.g., age-skewed workforces).
  • Feature selection that proxies for protected traits (e.g., graduation year as an age signal).
  • Optimization targets (e.g., “time-to-productivity”) that disfavor non-linear careers or disability accommodations.

Courts and regulators increasingly focus on outcomes (selection rates), not just intentions or code. (Law and the Workplace)

For Employers Using Workday (or Any HR Tech)

Use this practical, defensible-by-design checklist:

  1. Map decisions: Document where automated tools influence pass/fail gates, rankings, or interview invites. Keep human review in the loop for edge cases. (Law and the Workplace)
  2. Run impact tests: Measure selection rate ratios (e.g., the four-fifths rule) across protected classes before go-live and at least annually—or when job requirements change. Keep scripts, data, and results. (NYC.gov)
  3. Bias audits & transparency: If NYC-linked, complete an independent bias audit, post summaries, and notify candidates per Local Law 144. Even outside NYC, adopt this as a best practice. (NYC.gov)
  4. Tighten vendor contracts: Add language requiring pre-deployment testing, ongoing monitoring, explanations for adverse actions, cooperation with EEOC/state agencies, and timely bug/bias remediation. (The agency-liability posture makes this vital.) (Reuters; Workforce Bulletin)
  5. Accessibility & accommodations: Provide non-AI alternatives upon request; ensure tools are accessible (e.g., screen reader compatibility) and don’t penalize accommodations. (Law and the Workplace)
  6. Documentation discipline: Keep records that link each selection criterion to job necessity and show you considered less discriminatory alternatives if disparities appear. (Law and the Workplace)
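The impact test in step 2 can be sketched in a few lines of code. This is a minimal illustration, not audit-grade methodology: the group names and counts below are made-up, and a real bias audit (e.g., under Local Law 144) has more detailed requirements.

```python
def selection_rate(selected, applicants):
    """Fraction of applicants who passed the screening gate."""
    return selected / applicants

def four_fifths_check(rates):
    """Compare each group's selection rate to the highest-rate group.

    A ratio below 0.8 (the EEOC's 'four-fifths' rule of thumb) flags
    potential adverse impact that warrants further investigation.
    Returns {group: (ratio, passes_four_fifths)}.
    """
    benchmark = max(rates.values())
    return {group: (rate / benchmark, rate / benchmark >= 0.8)
            for group, rate in rates.items()}

# Hypothetical screening outcomes (illustrative numbers only).
outcomes = {
    "under_40":    selection_rate(120, 400),  # 0.30
    "40_and_over": selection_rate(45, 300),   # 0.15
}
for group, (ratio, ok) in four_fifths_check(outcomes).items():
    print(f"{group}: ratio={ratio:.2f} {'ok' if ok else 'FLAG'}")
```

In this made-up example, the 40-and-over group's rate is half the benchmark (0.50 < 0.80), so the tool would be flagged for review. Keeping this script, its inputs, and its outputs is exactly the kind of documentation step 6 calls for.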

For Candidates

  • Ask for accommodations if an online assessment disadvantages you; employers must consider alternatives.
  • Keep detailed logs of applications and rejections. If you suspect automated screening, note timestamps and any boilerplate language.
  • Know local rules (e.g., NYC) that may give you notice rights or public bias-audit summaries. (NYC.gov)

What to Watch Next

  • Class/collective certification fights and discovery in the Workday case (including how the tools were trained, tuned, and evaluated). (GovInfo)
  • EEOC enforcement trends around algorithmic disparate impact and employer responsibility for vendor tools. (Law and the Workplace)
  • State and city laws copying or expanding on NYC’s bias-audit framework. (NYC.gov)

Note: This article is informational and not legal advice. If you use automated hiring tools, consult counsel to tailor a compliance program to your jurisdiction and risk profile.

Sources & Further Reading

  • Reuters: Court lets bias lawsuit against Workday proceed (July 2024).
  • N.D. Cal. order granting preliminary collective certification (May 16, 2025), via GovInfo.
  • Proskauer and Fisher Phillips explainers on the 2025 certification rulings.
  • EEOC technical guidance on AI and Title VII, via Law and the Workplace.
  • NYC Department of Consumer and Worker Protection: Automated Employment Decision Tools (Local Law 144) and FAQ.