Federal regulators urge HR to perform AI bias audits
The EEOC's latest warning about AI bias in hiring may prompt employers to conduct AI audits, either internally or through independent third parties.
The U.S. Equal Employment Opportunity Commission on Thursday said employers relying on AI hiring tools would be liable for biased hiring. It's a warning that might prompt broader adoption of AI audit laws, similar to New York City's new automated employment decision law.
If discrimination in hiring occurs, employers could be held responsible for actions taken by their HR software vendors "if the employer has given them authority to act on the employer's behalf," the EEOC said in its advisory.
Federal regulators have not yet brought any lawsuits against HR vendors or employers for AI bias. The law the government is enforcing is Title VII of the Civil Rights Act of 1964, which prohibits discrimination based on race, color, national origin, religion or sex, including pregnancy and sexual orientation. The risks of bias in AI hiring tools have been cited in studies and hearings.
The EEOC has regularly warned employers about the risks of AI, even suggesting that some AI vendors were selling "snake oil." Its latest action outlined what steps employers should take to ensure they aren't discriminating in hiring. It recommends that employers ask the HR vendor about the discrimination risk.
Until lawsuits over AI bias are filed and the courts have ruled, the division of liability between employers and vendors will remain unclear. At this stage, the EEOC is focused on urging HR vendors and employers alike to ensure the software doesn't discriminate.
The EEOC did not mention AI audits specifically, but "it's implied," said Evelyn McMullen, an analyst at Nucleus Research.
McMullen believes the EEOC guidance could prompt states to follow New York City's example and adopt laws requiring employers to conduct independent audits to check for AI bias. The city's law also requires employers to notify job seekers when these tools are used.
NYC's 2021 law is the first of its kind in the nation. It was due to take effect in January but was delayed until July 5 while regulators and vendors sparred over the final regulatory language.
"Because the liability falls on the employer, it's important to bring in an independent auditor that specializes in AI bias," McMullen said. "It's likely that the first financial penalties will fall on employers that fail to show that they at least attempted to address bias in algorithms."
Self-audit or independent audit
In a statement, EEOC Chair Charlotte Burrows said, "I encourage employers to conduct an ongoing self-analysis to determine whether they are using technology in a way that could result in discrimination."
James Paretti, an employment attorney at Littler Mendelson in Washington, said employers could conduct a self-audit or an independent audit.
"If a state or locality or other law requires an independent, third-party analysis, that's what must be done," Paretti said in an email. But in the absence of such a requirement, an in-house assessment or audit, where the employer has the capacity to do one, "is a best practice."
In an analysis shortly following the EEOC's guidance statement, Paretti wrote that employers "are advised to keep a close eye on developments, as both the federal government and state and local governments have indicated an intent to regulate in this space."
McMullen believes California will follow NYC in adopting an automated employment decision law. "This should lead to a domino effect until most of the U.S. is covered and independent audits become common practice," she said.
Patrick Thibodeau covers HCM and ERP technologies for TechTarget Editorial. He's worked for more than two decades as an enterprise IT reporter.