Workday's AI lawsuit defense puts responsibility on users

Workday faces an AI lawsuit that's raising questions about HR software responsibility and liability in AI-driven recruitment. A federal judge is considering a dismissal motion.

An ongoing AI lawsuit against Workday highlights the positions HR software vendors might adopt when critics accuse their products of facilitating hiring bias. Central to this legal battle is a contentious issue: Should the HR SaaS platform vendor or its users bear responsibility when a bias claim arises?

Workday is defending itself in a lawsuit filed by Derek Mobley in U.S. District Court in Oakland, Calif., last year. Mobley alleges that he applied for 80 to 100 jobs with various employers, all of which he believes use Workday's hiring software. Despite holding degrees in finance and network system administration, employers consistently rejected him, which he claims stemmed from biases in the software.

Workday's defense strategy draws a sharp line of responsibility, arguing that its users, not Workday, control the hiring process.

Mobley's AI lawsuit is now at a critical juncture as it awaits a judge's ruling on Workday's motion to dismiss.

Mobley, who identifies himself in the filing as African American and over age 40 with anxiety and depression, alleges that Workday's software is discriminatory. Numerous positions required him to take "a Workday branded assessment and/or personality test" when applying. The lawsuit argues that these tests "are unlawful disability related inquiries" designed to identify mental health disorders that have no bearing on whether he would succeed as an employee.

In January, U.S. District Judge Rita Lin dismissed Mobley's lawsuit against Workday but granted Mobley the opportunity to file an amended complaint to pursue additional legal theories raised at a dismissal hearing and in a brief. Seizing this chance, Mobley's legal team expanded its case: While the original complaint spanned 16 pages, the amended version presented a more detailed argument in 37 pages.

Among the legal theories advanced by Mobley is that Workday acts as an indirect employer. Its hiring tools "discriminatorily interfere" in hiring, an allegation that implies they act as gatekeepers. The lawsuit also cites TechTarget Editorial reporting on Workday's workforce composition to support its narrative that Workday's "algorithmic decision-making tools lack sufficient guardrails to prevent discrimination."

Workday denies claims

In its new dismissal motion this month, Workday vehemently denied those claims, stating that it's the customers who configure and use its software to screen candidates. Workday added that it has no control over a customer's day-to-day operations and "no ability to force a customer to make decisions in the hiring process or otherwise."

In a statement to TechTarget Editorial, Workday said: "We believe this lawsuit is without merit and deny the allegations and assertions made in the amended complaint. We remain committed to responsible AI." Workday defines responsible AI as having "guardrails to ensure fairness, transparency, explainability, reliability, and more."

Liability is ultimately going to be borne by the employer.
Paul Lopez, labor and employment attorney, Tripp Scott

Workday's defense underscores the situation an HR manager could face when using any HR software platform.

Under employment law, "liability is ultimately going to be borne by the employer," said Paul Lopez, a labor and employment attorney at Tripp Scott in Fort Lauderdale, Fla. But that's not to say the HR vendor is off the hook. He said the employer could argue that the software was defective, and the software vendor might have some responsibility to customers.

Lopez said employers will need indemnification clauses in software licensing agreements, under which the software provider takes on the cost of third-party claims against a customer if a problem arises.

Customers, especially smaller companies, might not have the leverage to get the software licensing agreements that they want. If that's the case, Lopez said, he might recommend staying away from a platform until "you have seen other companies use it and kick the tires over and over again without incident or problem."

Software license questions

Dean Rocco, co-chair of the employment and labor practice at law firm Wilson Elser in Los Angeles, pointed to liability issues around electronic payroll system disputes to guide what might happen with AI lawsuits.

Rocco said most providers of payroll tools include terms in their agreements that shift responsibility for legal compliance to the employer or user. Courts, he said, have also resisted efforts to hold the technology providers liable.

In contracts with the payroll providers, the customers are "acknowledging upfront that all we're doing is providing you a technology platform," Rocco said.

But the bar for providers and users of AI software will likely be higher. The U.S. Equal Employment Opportunity Commission has repeatedly warned employers that it's watching for AI bias in employment. Last year, it settled its first case over software that automatically rejected applicants due to age.

For employers, the risk of using AI in recruiting, hiring and other HR functions "is significant enough to warrant ongoing oversight, reviews and evaluation of AI solutions that are being put in place," Gartner analyst Helen Poitevin said.
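What that oversight might look like in practice is a periodic check of hiring outcomes. The sketch below applies the EEOC's four-fifths rule of thumb, which treats a group's selection rate below 80% of the highest group's rate as a signal worth investigating. The data, group labels and reporting here are hypothetical illustrations, not a substitute for a rigorous adverse-impact analysis done with counsel.

```python
# Minimal, hypothetical sketch of an adverse-impact check using the
# EEOC's "four-fifths rule" of thumb: a group's selection rate below
# 80% of the highest group's rate may warrant review. Illustrative
# data only; not legal advice.

from typing import Dict, Tuple


def selection_rates(outcomes: Dict[str, Tuple[int, int]]) -> Dict[str, float]:
    """Map each group to its selection rate: hires / applicants."""
    return {group: hired / applied for group, (hired, applied) in outcomes.items()}


def impact_ratios(outcomes: Dict[str, Tuple[int, int]]) -> Dict[str, float]:
    """Return each group's selection rate relative to the highest-rate group."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {group: rate / top for group, rate in rates.items()}


if __name__ == "__main__":
    # Hypothetical counts per group: (hired, applicants).
    data = {"group_a": (48, 200), "group_b": (30, 200)}
    for group, ratio in impact_ratios(data).items():
        status = "review" if ratio < 0.8 else "ok"
        print(f"{group}: impact ratio {ratio:.2f} -> {status}")
```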

Jamie Kohn, an analyst at Gartner, said recruiting managers "aren't always aware of how vendors use AI in their products," and growing legal requirements "mean they need to develop a deeper understanding of the technologies they use and know what questions to ask."

"If you implement it, you are responsible for how it impacts your hiring decisions," he said.

But another question to consider, according to Katy Tynan, an analyst at Forrester Research, is "What opportunities does AI present to mitigate bias and potentially reduce the risk of human bias?"

For instance, AI could be used to identify bias in language or behavior patterns that's difficult for humans to see, she said.
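One simple version of that idea is screening job-posting language for terms associated with skewed applicant pools. The sketch below is a hypothetical illustration: the flagged terms and suggested rewrites are placeholders, and commercial writing-audit tools rely on far richer models than keyword matching.

```python
# Hypothetical sketch of language screening for job postings: flag
# terms that writing-audit tools commonly associate with narrowed
# applicant pools. Word lists here are illustrative, not authoritative.

import re

FLAGGED_TERMS = {
    "ninja": "may discourage some applicants; prefer a plain job title",
    "rockstar": "may discourage some applicants; prefer a plain job title",
    "aggressive": "consider a neutral alternative such as 'proactive'",
    "dominant": "consider a neutral alternative such as 'leading'",
}


def audit_posting(text: str) -> list[tuple[str, str]]:
    """Return (term, suggestion) pairs for flagged words found in the text."""
    findings = []
    for term, suggestion in FLAGGED_TERMS.items():
        # Whole-word, case-insensitive match so "ninja" doesn't hit "Ninjago".
        if re.search(rf"\b{re.escape(term)}\b", text, flags=re.IGNORECASE):
            findings.append((term, suggestion))
    return findings


if __name__ == "__main__":
    posting = "We need an aggressive sales ninja who is dominant in the market."
    for term, suggestion in audit_posting(posting):
        print(f"flagged '{term}': {suggestion}")
```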

However, as AI is integrated into many aspects of the talent lifecycle, Tynan said, the scale "increases the probability that the technology will be used in some way that results in a lawsuit."

Patrick Thibodeau is an editor at large for TechTarget Editorial who covers HCM and ERP technologies. He's worked for more than two decades as an enterprise IT reporter.
