White-sounding names have edge in hiring, bias study says

A recent study exposes ongoing racial bias in hiring practices, showing that white-sounding names are often preferred, with a notable overall discrimination rate of 10%.

Researchers sent fake job applications to about 100 Fortune 500 employers, with up to 1,000 applications per employer -- totaling about 84,000 fictitious resumes. They randomly assigned Black- or white-sounding names to signal the apparent race of each applicant and test for hiring bias. Many employers favored white candidates over Black candidates.

The overall rate of racial discrimination was about 10%, although there was wide variation among companies, with some favoring white applicants over Black applicants by more than 20%, the study found.

Hired CEO Josh Brenner said the findings on hiring bias in this working paper, updated this month, "highlight that bias is still an unfortunate reality in hiring."


Brenner said hiring bias could worsen because "rollbacks of affirmative action and growing anti-DEI sentiment threaten to exacerbate the problem and adversely impact an organization's ability to hire diverse talent." Hired, a recruiting marketplace for tech jobs, has been steadily documenting tech industry hiring trends in its annual "State of Wage Inequality in the Tech Industry" report.

But one of the questions facing HR software vendors and the managers who rely on their platforms isn't just about discrimination in hiring -- it's also about whether racial and ethnic bias has been baked into AI systems used to sort and rank job candidates. This study, which relies on data collected in 2020 and 2021 as DEI and automation efforts were gaining ground, doesn't offer an answer.

Patrick Kline, an economist at the University of California, Berkeley, and one of the study's authors, said he did not believe automation played a major role in the bias the researchers observed. But the study does cite the benefit of centralized hiring processes as "a possible means of dampening bias in large organizations" versus hiring systems that might be more susceptible to "snap judgments by individuals."

In response to questions from TechTarget Editorial, Kline said the researchers saw a high response rate of about 25% to the fictitious resumes they sent out, which suggested that a significant number of resumes were reviewed by people rather than automatically screened.

Kline also said it's "extremely unlikely that any automated screen is actually using the names" to reject applicants. The study randomly assigned names suggesting that the applicant was Black or white to resumes while keeping other characteristics, such as work history and education, consistent.

Nearly a dozen HR platforms

In a previous version of the paper from 2021, the researchers found that companies used 11 different third-party services to manage some aspects of their hiring process. Despite the variety of vendors involved, the study found little difference from one company to the next in racial contact gaps, which measured the difference in response rates between job applications that appeared to come from candidates of different racial backgrounds.
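The contact-gap measure described above can be illustrated with a short sketch. All counts below are hypothetical and for illustration only; they are not figures from the study:

```python
def contact_gap(callbacks_white, sent_white, callbacks_black, sent_black):
    """Racial contact gap: the difference in callback (contact) rates
    between applications bearing white- and Black-sounding names."""
    return callbacks_white / sent_white - callbacks_black / sent_black

# Hypothetical counts for a single employer: 500 resumes per name group,
# with callback rates in the neighborhood of the ~25% the study reported.
gap = contact_gap(callbacks_white=135, sent_white=500,
                  callbacks_black=115, sent_black=500)
print(f"contact gap: {gap:.1%}")  # 27.0% vs. 23.0% callbacks -> 4.0% gap
```

Aggregating such gaps across employers is what lets researchers compare companies and vendors despite differences in how each manages its hiring pipeline.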

This month, the researchers released the names and ratings of the companies that unwittingly participated in the research, including auto retailer AutoNation, categorized by researchers as more discriminatory, and general merchandiser Target, which was among the best-scoring organizations.

Donald Tomaskovic-Devey, a sociology professor who heads the Center for Employment Equity at the University of Massachusetts Amherst, welcomed the decision to release the employers' names in the study. He noted that the researchers have also developed a method for checking bias that could be valuable to regulators, such as the U.S. Equal Employment Opportunity Commission.

Centralized vs. decentralized HR

However, the question of AI and how it might have influenced the study's findings remains open.

Tomaskovic-Devey raised questions about automation's role, suggesting that it could have a more significant effect than the researchers realize. He said there is potential for names to trigger screening decisions since algorithms might screen for names along with other correlated variables.

Yet Tomaskovic-Devey also acknowledged that the research might correctly indicate that "the bigger problem is with people making unmonitored decisions in decentralized systems." He noted that the perception of lower rates of racial bias in more centralized HR systems -- or possibly in centralized vendor screenings -- suggests that these systems could reduce bias more effectively than individual hiring managers.

Kline said data for the most recent study was collected in 2020 and 2021, before the widespread deployment of generative AI chatbots in hiring portals. However, Tomaskovic-Devey noted that machine learning algorithms were already in place at that time, which could have influenced the hiring process.

Hired's Brenner said that to combat bias, organizations must prioritize action from the leadership level, which includes comprehensive training on unconscious bias awareness and mitigation techniques for hiring managers.

"Fostering a culture where all employees are educated on recognizing and reporting bias, even anonymously, is crucial," he said.

Patrick Thibodeau is an editor at large for TechTarget Editorial who covers HCM and ERP technologies. He's worked for more than two decades as an enterprise IT reporter.
